I beg to move, That the Bill be now read a Second time.
Data is already the fuel driving the digital age: it powers the everyday apps that we use, public services are being improved by its better use, and businesses rely on it to trade, produce goods and deliver services for their customers. But how we choose to use data going forward will become even more important: it will determine whether we can grow an innovative economy with well-paid, high-skill jobs; it will shape our ability to compete globally in developing the technologies of the future; and it will increasingly say something about the nature of our democratic society. The great challenge for democracies, as I see it, will be how to use data to empower rather than control citizens, enhancing their privacy and sense of agency, without letting authoritarian states—which, in contrast, use data as a tool to monitor and harvest information from citizens—dominate technological advancement and gain a competitive advantage over our companies.
The UK cannot step aside from the debate by simply rubber-stamping whatever iteration of the GDPR comes out of Brussels. We have in our hands a critical opportunity to take a new path and, in doing so, to lead the global conversation about how we can best use data as a force for good—a conversation in which using data more effectively and maintaining high data protection standards are seen not as contradictory but as mutually reinforcing objectives, because trust in this more effective system will build the confidence to share information. We start today not by kicking off a revolution, upsetting the apple cart and causing a compliance headache for UK firms, but by beginning an evolution away from an inflexible one-size-fits-all regime and towards one that is risk-based and focused on innovation, flexibility and the needs of our citizens, scientists, public services and companies.
Businesses need data to make better decisions and to reach the right consumers. Researchers need data to discover new treatments. Hospitals need it to deliver more personalised patient care. Our police and security services need data to keep our people safe. Right now, our rules are too vague, too complex and too confusing for people always to understand. The GDPR is a good standard, but it is not the gold standard. People are struggling to use data to innovate, because they are tied up in burdensome activities that do little to enhance privacy.
A recently published report on compliance found that 81% of European publishers were unknowingly in breach of the GDPR, despite doing what they thought the law required of them. A YouGov poll from this year found that one in five marketing professionals in the UK report knowing absolutely nothing about the GDPR, despite being bound by it. It is not just businesses: the people whose privacy our laws are supposed to protect do not understand it either. Instead, they click away the thicket of cookie pop-ups just so they can see their screen.
The Bill will maintain the high standards of data protection that British people rightly expect, but it will also help the people who are most affected by data regulation, because we have co-designed it with those people to ensure that our regulation reflects the way in which real people live their lives and run their businesses.
Does the Minister agree that the retention and enhancement of public trust in data is a major issue, that sharing data is a major issue for the public, and that the Government must do more—perhaps she can tell us whether they intend to do more—to educate the public about how and where our data is used, and what powers individuals have to find out this information?
I thank the hon. Lady for her helpful intervention. She is right: as I said earlier, trust in the system is fundamental to whether citizens have the confidence to share their data and whether we can therefore make use of that data. She made a good point about educating people, and I hope that this debate will mark the start of an important public conversation about how people use data. One of the challenges we face is a complex framework which means that people do not even know how to talk about data, and I think that some of the simplifications we wish to introduce will help people to understand the fundamental principles to which we want our new regime to adhere.
My hon. Friend gave a long list of people who found the rules we had inherited from outside the UK challenging. She might add to that list Members of Parliament themselves. I am sure I am not alone in having been exasperated by being the subject of a complaint to the Information Commissioner—in this case from a constituent who had written to me complaining about a local parish council. When I shared his letter with the parish council so that it could show how bogus his long-running complaint had been, he proceeded to file a complaint with the Information Commissioner’s Office because I had shared his phone number—which he had not marked as private—with the parish council, with which he had been in correspondence for several years. The Information Commissioner’s Office took that seriously. This sort of nonsense shows how over-restrictive regulations can be abused by people who are out to stir up trouble unjustifiably.
Let me gently say that if my right hon. Friend’s constituent was going to pick on one Member of Parliament with whom to raise this point, the Member of Parliament who does not, I understand, use emails would be one of the worst candidates. However, I entirely understand Members’ frustration about the current rules. We are looking into what we can do in relation to democratic engagement, because, as my right hon. Friend says, this is one of the areas in which there is not enough clarity about what can and cannot be done.
We want to reduce burdens on businesses, above all on the small businesses that account for more than 99% of UK firms. I am pleased that the Under-Secretary of State for Business and Trade, my hon. Friend the Member for Thirsk and Malton (Kevin Hollinrake), is present to back up those proposals. Businesses that do not have the time, the money or the staff to spend precious hours doing unnecessary form-filling are currently being forced to follow some of the same rules as a billion-dollar technology company. We are therefore cutting the amount of pointless paperwork, ensuring that organisations have to comply with rules on record-keeping and risk assessment only when their processing activities are high-risk. We are getting rid of excessively demanding requirements to appoint data protection officers, giving small businesses much more flexibility in how they manage data protection risks without having to procure external resources.
Those changes will not just make the process simpler, clearer and easier for businesses; they will make it cheaper too. We are expecting micro and small businesses to save nearly £90 million in compliance costs every year: that is £90 million more for higher investment, faster growth and better jobs. According to figures published in 2021, data-driven trade already generates 85% of our services exports. Our new international transfers regime clarifies how we can build data bridges to support the close, free and safe exchange of data with other trusted allies.
I am delighted to hear the Secretary of State talk about reducing regulatory burdens without compromising the standards that we are none the less delivering—that is the central distinction, and greatly to be welcomed for its benefits for the entrepreneurialism and fleetness of foot of British industry. Does she agree, however, that while the part of the Bill that deals with open data, or smart data, goes further than that and creates fresh opportunities for, in particular, the small challenger businesses of the kind she has described to take on the big incumbents that own the data lakes in many sectors, those possibilities will be greatly reduced if we take our time and move too slowly? Could it not potentially take 18 months to two years for us to start opening up those other sectors of our economy?
I am delighted, in turn, to hear my hon. Friend call me the Secretary of State—I am grateful for the promotion, even if it is not a reality. I know how passionate he feels about open data, which is a subject we have discussed before. As I said earlier, I am pleased that the Under-Secretary of State for Business and Trade is present, because this morning he announced that a new council will be driving forward this work. As my hon. Friend knows, this is not necessarily about legislation being in place—I think the Bill gives him what he wants—but about that sense of momentum, and about onboarding new sectors into this regime and not being slow in doing so. As he says, a great deal of economic benefit can be gained from this, and we do not want it to be delayed any further.
Let me first draw attention to my entry in the Register of Members’ Financial Interests. Let me also apologise for missing the Minister’s opening remarks—I was taken by surprise by the shortness of the preceding statement and had to rush to the Chamber.
May I take the Minister back to the subject of compliance costs? I understand that the projected simplification will result in a reduction in those costs, but does she acknowledge that a new regime, or changes to the current regime, will kick off an enormous retraining exercise for businesses, many of which have already been through that process recently and reached a settled state of understanding of how they should be managing data? Even a modest amount of tinkering instils a sense among British businesses, particularly small businesses, that they must put everyone back through the system, at enormous cost. Unless the Minister is very careful and very clear about the changes being made, she will create a whole new industry for the next two or three years, as every data controller in a small business—often doing this part time alongside their main job—has to be retrained.
We have been very cognisant of that risk in developing our proposals. As I said in my opening remarks, we do not wish to upset the apple cart and create a compliance headache for businesses, which would be entirely contrary to the aims of the Bill. A small business that is currently compliant with the GDPR will continue to be compliant under the new regime. However, we want to give businesses flexibility in regard to how they deliver that compliance, so that, for instance, they do not have to employ a data protection officer.
I am grateful to the Minister for being so generous with her time. May I ask whether the Government intend to maintain data adequacy with the EU? I only ask because I have been contacted by some business owners who are concerned about the possible loss of EU data adequacy and the cost that might be levied on them as a result.
I thank the hon. Gentleman for pressing me on that important point. I know that many businesses are seeking to maintain adequacy. If we want a business-friendly regime, we do not want to create regulatory disruption for businesses, particularly those that trade with Europe and want to ensure that there is a free flow of data. I can reassure him that we have been in constant contact with the European Commission about our proposals. We want to make sure that there are no surprises. We are currently adequate, and we believe that we will maintain adequacy following the enactment of the Bill.
I was concerned to hear from the British Medical Association that if the EU were to conclude that data protection legislation in the UK was inadequate, that would present a significant problem for organisations conducting medical research in the UK. Given that so many amazing medical researchers across the UK currently work in collaboration with EU counterparts, can the Minister assure the House that the Bill will not represent an inadequacy in comparison with EU legislation as it stands?
I hope that my previous reply reassured the hon. Lady that we intend to maintain adequacy, and we do not consider that the Bill will present a risk in that regard. What we are trying to do, particularly in respect of medical research, is make it easier for scientists to innovate and conduct that research without constantly having to return for consent when it is apparent that consent has already been granted for particular medical data processing activities. We think that will help us to maintain our world-leading position as a scientific research powerhouse.
Alongside new data bridges, the Secretary of State will be able to recognise new transfer mechanisms that businesses can use to protect international transfers. Businesses will still be able to transfer data across borders with the compliance mechanisms that they already use, avoiding needless checks and costs. We are also delighted to be co-hosting, in partnership with the United States, the next workshop of the global cross-border privacy rules forum in London this week. The CBPR system is one of the few existing operational mechanisms that, by design, aims to facilitate data flows on a global scale.
World-class research requires world-class data, but right now many scientists are reluctant to use the data they need to get on with their research, for the simple reason that they do not know how research is defined. They can also be stopped in their tracks if they try to broaden their research or follow a new and potentially interesting avenue. When that happens, they can be required to go back and seek permission all over again, even though they have already gained permission to use personal data. We do not think that makes sense. The pandemic showed that we cannot risk delaying discoveries that could save lives. Nothing should be holding us back from curing cancer, tackling disease or producing new drugs and treatments. This Bill will simplify the legal requirements around research so that scientists can work to their strengths, with legal clarity on what they can and cannot do.
The Bill will also ensure that people benefit from the results of research by unlocking the potential of transformative technologies. Taking artificial intelligence as an example, we have recently published our White Paper: “AI regulation: a pro-innovation approach”. In the meantime, the Bill will ensure that organisations know when they can use responsible automated decision making and that people know when they can request human intervention where those decisions impact their lives, whether that means getting a fair price for the insurance they receive after an accident or a fair chance of getting the job they have always wanted.
I spoke earlier about the currency of trust and how, by maintaining it through high data protection standards, we are likely to see more data sharing, not less. Fundamental to that trust will be confidence in the robustness of the regulator. We already have a world-leading independent regulator in the Information Commissioner’s Office, but the ICO needs to adapt to reflect the greater role that data now plays in our lives alongside its strategic importance to our economic competitiveness. The ICO was set up in the 1980s for a completely different world, and the pace, volume and power of the data we use today has changed dramatically since then.
It is only right that we give the regulator the tools it needs to keep pace and to keep our personal data safe while ensuring that, as an organisation, it remains accountable, flexible and fit for the modern world. The Bill will modernise the structure and objectives of the ICO. Under this legislation, protecting our personal data will remain the ICO’s primary focus, but it will also be asked to focus on how it can empower businesses and organisations to drive growth and innovation across the UK, and support public trust and confidence in the use of personal data.
The Bill is also important for consumers, helping them to share less data while getting more in return. It will support smart data schemes that empower consumers and small businesses to make better use of their own data, building on the extraordinary success of open banking tools offered by innovative businesses, which help consumers and businesses to manage their finances and spending, track their carbon footprint and access credit.
The Minister always delivers a very solid message and we all appreciate that. In relation to the high data protection standards that she is outlining, there is also a balance to be achieved when it comes to ensuring that there are no unnecessary barriers for individuals and businesses. Can she assure the House that that will be exactly what happens?
I am always happy to take an intervention from the hon. Member. I want to assure him that we are building high data protection standards on the fundamental principles of the GDPR, and we are trying to strike the right balance between high data protection standards that will protect the consumer and giving businesses the flexibility they need. I will continue this conversation with him as the Bill passes through the House.
I thank the Minister for being so generous with her time. With regard to the independent commissioner, the regulator, who will set the terms of reference? Will it be genuinely independent? It seems to me that a lot of power will fall on the shoulders of the Secretary of State, whoever that might be in the not-too-distant future.
The Secretary of State will have greater powers when it comes to some of the statutory codes that the ICO adheres to, but those powers will be brought to this House for its consent. The whole idea is to make the ICO much more democratically accountable. I know that concern about the independence of the regulator has been raised as we have been working up these proposals, but I wish to assure the House that we do not believe those concerns to be justified or legitimate. The Bill actually has the strong support of the current Information Commissioner, John Edwards.
The Bill will also put in place the foundations for data intermediaries, which are organisations that can help us to benefit from our data. In effect, we will be able to share less of our sensitive data with businesses while securing greater benefits. As I say, open banking is one example of this. Another way in which the Bill will help people to take back control of their data is by making it easier and more secure for people to prove things about themselves once, electronically, without having to dig out stacks of physical documents such as passports, bills, statements and birth certificates and then having to provide copies of those documents to different organisations. Digital verification services already exist, but we want to create a set of standards around them so that consumers can identify trustworthy providers.
The Bill is designed not just to boost businesses, support scientists and deliver consumer benefits; it also contains measures to keep people healthy and safe. It will improve the way in which the NHS and adult social care organise data to deliver crucial health services. It will let the police get on with their jobs by allowing them to spend more time on the beat rather than on pointless paperwork. We believe that this will save up to 1.5 million hours of police time each year—
I know that my hon. Friend has been passionate on this point, and we are looking actively into her proposals.
We are also updating the outdated system of registering births and deaths, which is still based on paper processes from the 19th century.
Data has become absolutely critical for keeping us healthy, for keeping us safe and for growing an economy with innovative businesses, providing jobs for generations to come. Britain is at its best when its businesses and scientists are at theirs. Right now, our rules risk holding them back, but this Bill will change that because it was co-designed with those businesses and scientists and with the help of consumer groups. Simpler, easier, clearer regulation gives the people using data to improve our lives the certainty they need to get on with their jobs. It maintains high standards for protecting people’s privacy while seeking to maintain our adequacy with the EU. Overall, this legislation will make data more useful for more people and more usable by businesses, and it will enable greater innovation by scientists. I commend the Bill to the House.
It is good finally to get the data Bill that was promised so long ago. We nearly got there in the halcyon days of September 2022, under the last Prime Minister, after it had been promised by the Prime Minister before. However, the Minister has a strong record of bringing forward and delivering things that the Government have long promised. I also know that she has another special delivery coming soon, which I very much welcome and wish her all the best with. She took a lot of interventions and I commend her for all that bobbing up and down while so heavily pregnant. I would also like to send my best wishes to the Secretary of State, who let me know that she could not be here today. I would also like to wish her well with her imminent arrival. There is lots of delivery going on today.
We are in the midst of a digital and data revolution, with data increasingly being the most prized asset and fundamental to the digital age, but this Bill, for all its hype, fails to meet that moment. Even since the Bill first appeared on the Order Paper last September, AI chatbots have become mainstream, TikTok has been fined for data breaches and banned from Government devices, and AI image generators have fooled the world into thinking that the Pope had a special papal puffer coat. The world, the economy, public services and the way we live and communicate are changing fast. Despite these revolutions, this data Bill does not rise to the challenges. Instead, it tweaks around the edges of GDPR, making an already dense set of privacy rules even more complex.
The UK can be a global leader in the technologies of the future. We are a scientific superpower, we have some of the world’s best creative industries and now, outside the two big trading blocs, we could have the opportunities of nimbleness and being in the vanguard of world-leading regulation. In order to harness that potential, however, we need a Government who are on the pitch, setting the rules of the game and ensuring that the benefits of new advances are felt by all of us and not just by a handful of companies. The Prime Minister can tell us again how much he loves maths, but without taking the necessary steps to support the data and digital economy, his sums just do not add up.
The contents of this Bill might seem technical—as drafted, they are incredibly technical—but they matter greatly to every business, consumer, citizen and organisation, because data is a significant source of power and value. It shapes the relationship between business and consumers, between the state and citizens, and much, much more. Data is critical to innovation and economic growth, to modern public services, to democratic accountability and to transforming societies, if harnessed and shaped in the interest of the many, not simply the few—pretty major, I would say.
Now we have left the EU, the UK has an opportunity to lead the world in this area. The next generation of world-leading regulation could allow small businesses and start-ups to compete with the monopolies in big tech, as we have already heard. It could foster a climate of open data, enable public services to use and share data for improved outcomes, and empower consumers and workers to have control over how their data is used. In the face of this huge challenge, the Bill is at best a missed opportunity, and at worst adds another complicated and uncertain layer of bureaucracy. Although we do not disagree with its aims, there are serious questions about whether the Bill will, in practice, achieve them.
Data reform and new regulation are welcome and long overdue. Now that we have left the EU, we need new legislation to ensure that we both keep pace with new developments and make the most of the opportunities. The Government listened to some of the concerns raised in response to the consultation and removed most of the controversial and damaging proposals. GDPR has been hard to follow for some businesses, especially small businesses and start-ups, so streamlining and simplifying data protection rules is a welcome aim. However, we will still need to retain some of those rules in order to meet EU data adequacy requirements.
The aim of shifting away from tick-box exercises towards a more proactive and systematic approach to regulation is also good. Better and easier data sharing between public services is essential, and some of the changes in that area are welcome, although we will need assurances that private companies will not benefit commercially from personal health data without people’s say-so. Finally, nobody likes nuisance calls or constant cookie banners, and the moves to reduce or remove them are welcome, although there are questions about whether the Bill lives up to the rhetoric.
In many areas, however, the Bill threatens to take us backwards. First, it may threaten our ability to share data with the EU, which would be seriously bad for business. Given the astronomical cost to British businesses should data adequacy with the EU be lost, businesses and others are rightly looking for more reassurances that the Bill will not threaten these arrangements. The EU has already said that the vast expansion of the Secretary of State’s powers, among other things, may put the agreement in doubt. If this were to come to pass, the additional burdens on any business operating even tangentially within the EU would be enormous.
British businesses, especially small businesses, have faced crisis after crisis. Many only just survived through covid and are now facing rising energy bills that threaten to push them over the edge. According to the Information Commissioner,
“most organisations we spoke to had a plea for continuity.”
The Government must go further on this.
Secondly, the complex new requirements in this 300-page Bill threaten to add more hurdles, rather than streamlining the process. Businesses have serious concerns that, having finally got their head around GDPR, they will now have to comply with both GDPR and all the new regulations in this Bill. That is not cutting red tape, in my view.
Thirdly, the Bill undermines individual rights. Many of the areas in which the Bill moves away from GDPR threaten to reduce protection for citizens, making it harder to hold to account the big companies that process and sell our data. Subject access requests are being diluted, as the Government are handing more power to companies to refuse such requests on the grounds of being excessive or vexatious. They are tilting the rules in favour of the companies that are processing our data. Data protection impact assessments will no longer be needed, and protections against automated decision making are being weakened.
AlgorithmWatch explains that automated decision making is “never neutral.” Outputs are determined by the quality of the data that is put into the system, whether that data is fair or biased. Machine learning will propagate and amplify those biases, and unfortunately it already has. Is my hon. Friend concerned that the Bill removes important GDPR safeguards that protect the public from algorithmic bias and discrimination and, worse, provides Henry VIII powers that will allow the Secretary of State to make sweeping regulations on whether meaningful human intervention is required at all in these systems?
My hon. Friend makes two very good points, and I agree with her on both. I will address both points in my speech.
Taken together, these changes, alongside the Secretary of State’s sweeping new powers, will tip the balance away from individuals and workers towards companies, which will be able to collect far more data for many more purposes. For example, the Bill could have a huge impact on workers’ rights. There are ever more ways of tracking workers, from algorithmic management to recruitment by AI. People are even being line managed by AI, with holiday allocation, the assignment of roles and the determination of performance being decided by algorithm. This is most serious when a low rating triggers discipline or dismissal. Transparency and accountability are particularly important given the power imbalance between some employers and workers, but the Bill threatens to undermine them.
If a person does not even know that surveillance or algorithms are being used to determine their performance, they cannot challenge it. If their privacy is being infringed to monitor their work, that is a harm in itself. If a worker’s data is being monetised by their company, they might not even know about it, let alone see a cut. The Bill, in its current form, undermines workers’ ability to find out what data is held about them and how it is being used. The Government should look at this again.
The main problem, however, is not what is in the Bill but, rather, what is not. Although privacy is, of course, a key issue in data regulation, it is not the only issue. Seeing regulation only through the lens of privacy can obscure all the ways that data can be used and can impact on communities. In modern data processing, our data is not only used to make decisions about us individually but pooled together to analyse trends and predict behaviours across a whole population. Using huge amounts of data, companies can predict and influence our behaviour. From Netflix recommendations to recent examples of surge pricing in music and sports ticketing, to the monitoring of covid outbreaks, the true power of data is in how it can be analysed and deployed. This means the impact as well as the potential harms of data are felt well beyond the individual level.
Moreover, as we heard from my hon. Friend the Member for Salford and Eccles (Rebecca Long Bailey), the algorithms that analyse data often replicate and further entrench society’s biases. Facial recognition that is trained on mostly white faces will more likely misidentify a black face—something that I know the parliamentary channel sometimes struggles with. AI language bots produce results that reflect the biases and limitations of their creators and the data on which they are trained. This Bill does not take on any of these community and societal harms. Who is responsible when the different ways of collecting and using data harm certain groups or society as a whole?
As well as the harms, data analytics offers huge opportunities for public good, as we have heard. Opening up data can ensure that scientists, public services, small businesses and citizens can use data to improve all our lives. For example, Greater Manchester has, over the years, linked data across a multitude of public services to hugely improve our early years services, but this was done entirely locally and in the face of huge barriers. Making systems and platforms interoperable could ensure that consumers can switch services to find the best deal, and it could support smaller businesses to compete with existing giants.
Establishing infrastructure such as a national research cloud and data trusts could help small businesses and not-for-profit organisations access data and compete with the giants. Citymapper is a great example, as it used Transport for London’s open data to build a competitor to Google Maps in London. Open approaches to data will also provide better oversight of how companies use algorithms, and of the impact on the rest of us.
Finally, where are the measures to boost public trust? After the debacle of the exam algorithms and the mishandling of GP data, which led millions of people to withdraw their consent, and with workers feeling the brunt but none of the benefits of surveillance and performance management, we are facing a crisis in public trust. Rather than increasing control over and participation in how our data is used, the Bill is removing even the narrow privacy-based protections we already have. In all those regards, it is a huge missed opportunity.
To conclude, with algorithms increasingly making important decisions about how we live and work, data protection has become ever more important to ensure that people have knowledge, control, confidence and trust in how and why data is being used. A data Bill is needed, but we need one that looks towards the future and harnesses the potential of data to grow our economy and improve our lives. Instead, this piecemeal Bill tinkers around the edges, weakens our existing data protection regime and could put our EU adequacy agreement at risk. We look forward to addressing some of those serious shortcomings in Committee.
I welcome the Bill. I am delighted that it finally takes advantage of one of the freedoms that has resulted from our leaving the European Union, which I supported at the time and continue to support. As has been indicated, the Bill has had a long gestation. I was the Minister when the consultation paper was issued in September 2021, and the Bill first appeared a year later. As the Opposition spokesman pointed out, a small hiccup delayed it a bit further.
Our current data protection laws originate almost entirely from the EU and are based on GDPR. Before the adoption of GDPR in 2016, the UK Government opposed parts of it. I recall that the assessment at the time was that, although there were benefits for larger companies, there would be substantial costs for smaller firms, and indeed that has been borne out. There was a debate in government about whether we should oppose the GDPR regulation while it was being formulated in Brussels. As so often was the case in the EU, we were advised that, if we opposed it, we would lose vital leverage and our ability to influence its development. Whether we were then able to influence its development is arguable, but it was decided that we should not oppose it outright. However, it has always been clear that the one-size-fits-all GDPR that is currently in place imposes significant costs on smaller firms. When we had the consultation in 2021, smaller firms in particular complained about the complexity of GDPR, and the uncertainty and cost that it imposed. Clearly, there was seen to be an opportunity to streamline it—not to remove it, but to make it simpler and more understandable, and to reduce some of the burdens it imposes. We now have that opportunity to diverge.
The other thing that came back from the consultation—I agree with the Opposition Members who have raised this point—was that there is an advantage in the UK’s retaining data adequacy with the EU. It was not taken for granted that we would get data adequacy. A lengthy negotiation with the EU took place before a data adequacy agreement was reached. As part of that process, officials rightly looked at what alternative there would be, should we not be granted data adequacy. It became clear that there are ways around it. Standard contractual clauses and alternative transfer mechanisms would allow companies to continue to exchange data. It would be a little more complicated. They would need to write the clauses into contracts. For that reason, there was clearly a value in having a general data adequacy agreement, but one should not think that the loss of data adequacy would be a complete disaster because, as I say, there are ways around it.
The Government are right to look at additional adequacy agreements with countries outside the EU, because therein lies a great opportunity. The EU has managed to conclude some, but not that many, and the Government have rightly identified a number of target countries where we see benefits from achieving data adequacy agreements. It is perfectly possible for us to diverge to a limited extent from GDPR and still retain adequacy. Notably, the EU recognises New Zealand’s regime as being adequate, even though New Zealand’s data protection laws are different from those of the EU. The fact that we decided to appoint the former New Zealand Information Commissioner as our own Information Commissioner means that he brings a particular degree of knowledge about that, which will be very useful.
In considering data protection law, it is sometimes said that there is a conflict between privacy—the right of consumers to have their data protected—and the innovation and growth opportunities of technology companies. I do not believe that is true; the two things have to be integral parts of our data protection laws. If people believe that their privacy is at risk, they will not trust the exchange of data. One problem is that, in general, people read only about the problems that arise, particularly from things such as identity theft, hacks and the loss of data as a result of people mislaying memory sticks or of cyber-criminals hacking into large databases and taking all their financial information. All those things are a genuine risk, but they present only one side of the picture and, in general, people reach their view about the importance of data protection according to the risks alone, without necessarily seeing the real benefits that come from the free exchange of data. That was perhaps the lesson that covid showed us more than any other: by allowing the exchange of data, it allowed us to develop and research vaccines. We were able to research what worked in terms of prevention and the various measures that could be taken to protect consumers from getting covid. Therefore, covid was the big demonstration of the fact that data exchange can bring real benefits to all consumers. We are just on the threshold—
Further to my right hon. Friend’s point about facilitating a trusted mechanism for sharing data, does he agree that the huge global success of open banking in this country has demonstrated that a trust framework not only makes people much more willing to exchange their data but frees up the economy and creates a world-leading sector at the same time?
I agree with my hon. Friend on that. The use of smart data in open banking demonstrates the benefits that can flow from its use, and that example could be replicated in a large number of other sectors to similar benefit. I hope that that will be one benefit that will eventually flow from the changes we are making.
As I say, we are on the threshold of an incredibly exciting time. The use of artificial intelligence and automated decision making will bring real consumer benefits, although, of course, safeguards must be built in. The question of algorithmic bias was looked at by the Centre for Data Ethics and Innovation, which found evidence of it. Obviously, we need to take account of that and build in protections against it, but, in general, the opportunities that can flow from making data more easily available are enormous.
I wish to flag up a couple of things. People have long found cookie pop-up banners deeply irritating. They have become self-defeating, because they are so ubiquitous that everybody just presses “yes”. The whole point of them was to acquire informed consent, but that is undermined if everybody is confronted by these things every time they log on to the internet and automatically presses “yes” without properly reading what they are consenting to. Restricting consent requests to cookies that involve the intrusive acquisition of data, and explaining that to people, is clearly an improvement. That will not only make data exchange easier but increase consumer protection, as people will know that they are being asked to give consent because they may choose not to allow their data to be used.
I understand the concerns that have been expressed about the Bill in some areas, particularly about the powers that will be given to the Secretary of State, but this is a complicated area. It is also one where technology is moving very fast. We need flexible legislation to keep up to date with the development of technology, so, to some extent, secondary legislation is probably the right way forward. We will debate these matters in Committee, but, generally, the Bill will help to deliver the Government’s declared intention, which is to make the UK the most successful data-driven technology economy in the world.
We can all agree that the free flow of personal data across borders is essential to the economy, not just within the UK but with other countries, including our biggest trading partner, the EU. Reforms to our data protection framework must have appropriate safeguards in place to ensure that we do not put EU-UK data flows at risk.
Despite the Government’s promises of reforms to empower people in the use of their data, the Bill instead threatens to undermine privacy and data protection. It potentially moves the UK away from the “adequacy” concept in the EU GDPR, and gives weight to the idea that different countries can maintain data protection standards in different but equally effective ways. The only way that we can properly maintain standards is by having a common standard across the different trading partners, but the Bill risks creating a scenario where the data of EU citizens could be passed through the UK to countries with which the EU does not have an agreement. The changes are raising red flags in Europe. Many businesses have spoken out about the negative impacts of the Bill’s proposals. Many of them will continue to set their controls to EU standards and operate on EU terms to ensure that they can continue to trade there.
According to conservative estimates, the loss of the adequacy agreement could cost £1.6 billion in legal fees alone. That figure does not include the cost resulting from disruption of digital trade and investments. The Open Rights Group says:
“Navigating multiple data protection regimes will significantly increase costs and create bureaucratic headaches for businesses.”
Although I understand that the Bill is an attempt to reduce the bureaucratic burden for businesses, we are now potentially asking those businesses to operate with two different standards, which will cause them a bigger headache. It would be useful if the Government confirmed that they have sought legal advice on the adequacy impact of the Bill, and that they have confirmed with EU partners that the EU is content that the Bill and its provisions will not harm EU citizens or undermine the trade and co-operation agreement with the EU.
Several clauses of the Bill cause concern. We need more clarity on those that expand the powers of the Home Secretary and the police, and we will require much further discussion on them in Committee. Given what has been revealed over the past few months about the behaviour of some members of the Metropolitan police, there are clauses in the Bill that should cause us concern. A national security certificate that would give the police immunity when they commit crimes by using personal data illegally would cause quite a headache for many of us. The Government have not tried to explain why they think that police should be allowed to operate in the darkness, which they must now rectify if they are to improve public trust.
The Bill will also expand what counts as an “intelligence service” for the purposes of data protection law, again at the Home Secretary’s discretion. The Government argue that this would create a “simplified” legal framework but, in reality, it will hand massive amounts of people’s personal information to the police. This could include private communications, as well as information about an individual’s health, political beliefs, religious beliefs or sex life.
The new “designation notice” regime would not be reviewable by the courts, so Parliament might never find out how and when the powers have been used, given that there is no duty to report to Parliament. The Home Secretary is responsible for both approving and reviewing designation notices, and only a person who is “directly affected” by such a notice will be able to challenge it, yet the Home Secretary would have the power to keep the notice secret, meaning that even those affected would not know of it and therefore could not possibly challenge it.
These are expansive broadenings of the powers not only of the Secretary of State, but of the police and security services. If the UK Government cannot adequately justify these powers, which they have not done to date, they must be withdrawn or, at the very least, subject to meaningful parliamentary oversight.
Far from giving people greater power over their data, the Bill will stop the courts, Parliament and individuals from challenging illegal uses of data. Under the Bill, organisations can deny or charge a fee to individuals for the right to access information. The right hon. Member for New Forest East (Sir Julian Lewis) mentioned the difficulty he had with a constituent. I think we can all have some sympathy with that, because many of us have probably experienced similar requests from members of the public. However, it is the public’s right to have access to the data that we hold. If an organisation decides that these requests are “vexatious or excessive”, they can refuse them, but what is “vexatious or excessive”? These words are vague and open to interpretation. Moreover, charging a fee will create a barrier for some people, particularly those on lower incomes, and effectively restricts control of data to more affluent citizens.
The Bill changes current rules that prevent companies and the Government from making solely automated decisions about individuals that could have legal or other significant effects on their lives. We have heard a lot about the potential benefits of AI and how it could be used to enhance our lives, but for public trust in and buy-in to AI, we need to know that there is some oversight. Without that, there will always be a question hanging over it. The SyRI case in the Netherlands involved innocuous datasets such as household water usage being used by an automated system to accuse individuals of benefit fraud.
The Government consultation response acknowledges that, for respondents,
“the right to human review of an automated decision was a key safeguard”.
But despite the Government acknowledging the importance of human review of automated decisions, clause 11, if implemented, would mean that solely automated decision making is permitted in a wider range of contexts. Many of us get excited about AI, but it is important to acknowledge that AI still makes mistakes.
The Bill will allow the Secretary of State to approve international transfers to countries with weak data protection, so even if the Bill does not make data security in the UK weaker, it will weaken the protection of UK citizens’ data by allowing it to be transferred abroad to regimes with lower safeguards.
It is useful to hear a couple of stakeholder responses. The Public Law Project has said:
“The Data Protection and Digital Information (No.2) Bill would weaken important data protection rights and safeguards, making it more difficult for people to know how their data is being used”.
The Open Rights Group has said:
“The government has an opportunity to strengthen the UK’s data protection regime post Brexit. However, it is instead setting the country on a dangerous path that undermines trust, furthers economic instability, and erodes fundamental rights.”
Since we are talking about a Bill under the Department for Science, Innovation and Technology, it is important to hear from the Royal Society, which says that losing adequacy with the EU would be damaging for scientific research in the UK, creating new costs and barriers for UK-EU research collaborations. While the right hon. Member for Maldon (Sir John Whittingdale) is right about the importance of being able to share data, particularly scientific data—and we understand the importance of that for things such as covid vaccines—we need to make sure this Bill does not set up further hurdles that could prevent that.
There is probably an awful lot for us to thrash out in Committee. The SNP will not vote against Second Reading tonight, but I appeal to those on the Government Front Bench to give an opportunity for hon. Members to amend and discuss this Bill properly in Committee.
I am delighted to speak in support of this long-awaited Bill. It is a necessary piece of legislation to learn the lessons from GDPR and look at how we can improve the system, both to make it easier for businesses to work with and to give users and citizens the certainty they need about how their data will be processed and used.
In bringing forward new measures, the Bill in no way suggests that we are looking to move away from our data adequacy agreements with the European Union. Around the world—in North America, Europe, Australia and the Far East—we see Governments looking at developing trusted systems for sharing and using data and for allowing businesses to process data across international borders, knowing that those systems may not be exactly the same but that they work to the same standards and with similar levels of integrity. That is clearly the direction in which the whole world wants to move, and we should play a leading role in that.
I want to talk briefly about an important area of the Bill: getting the balance right between data rights and data safety and what the Bill refers to as the “legitimate interest” of a particular business. I should also note that this Bill, while important in its own right, sits alongside other legislation—some of it to be introduced in this Session and some of it already well on its way through the parliamentary process—dealing with other aspects of the digital world. The regulation of data is an aspect of digital regulation; data is in some ways the fuel that powers the digital experience and is relevant to other areas of digital life as well.
To take one example, we have already established and implemented the age-appropriate design code for children, which principally addresses the way data is gathered from children online and used to design services and products that they use. As this Bill goes through its parliamentary stages, it is important that we understand how the age-appropriate design code is applied as part of the new data regime, and that the safeguards set out in that code are guaranteed through the Bill as well.
There has been a lot of debate, as has already been mentioned, about companies such as TikTok. There is a concern that engineers who work for TikTok in China, some of whom may be members of the Chinese Communist party, have access to UK user data that may not be stored in China, but is accessed from China, and are using that data to develop products. There is legitimate concern about oversight of that process and what that data might be used for, particularly in a country such as China.
However, there is also a question about data, because one reason the TikTok app is being withdrawn from Government devices around the world is that it is incredibly data-acquisitive. It does not just analyse how people use TikTok and from that create data profiles of users to determine what content to recommend to them, although that is a fundamental part of the experience of using it; it also gathers, as other big apps do, data from what people do on other apps on the same device. People may not realise that they have given consent, and it is certainly not informed consent, for companies such as TikTok to access data from what they do on other apps, not just when they are using TikTok.
It is a question of having trusted systems for how data can be gathered, and giving users the right to opt out of such data systems more easily. Some users might say, “I’m quite happy for TikTok or Meta to have that data gathered about what I do across a range of services.” Others may say, “No, I only want them to see data about what I do when I am using their particular service, not other people’s.”
The Online Safety Bill is one of the principal ways in which we are seeking to regulate AI now. There is debate among people in the tech sector; a letter was published recently, co-signed by a number of tech executives, including Elon Musk, calling for a six-month pause in the development of AI systems, particularly large language models. That suggests a problem in the near future of very sophisticated data systems that can make decisions faster than a human can analyse them.
People such as Eric Schmidt have raised concerns about AI in defence systems, where an aggressive system could make decisions faster than a human could respond to them, to which we would need an AI system to respond and where there is potentially no human oversight. That is a frightening scenario in which we might want to consider moratoriums and agreements, as we have in other areas of warfare such as the use of chemical weapons, that we will not allow such systems to be developed because they are so difficult to control.
If we look at the application of that sort of technology closer to home and some of the cases most referenced in the Online Safety Bill, for example the tragic death of the teenager Molly Russell, we see that what was driving the behaviour of concern was data gathered about a user to make recommendations to that person that were endangering their life. The Online Safety Bill seeks to regulate that practice by creating codes and responsibilities for businesses, but that behaviour is only possible because of the collection of data and decisions made by the company on how the data is processed.
This is where the Bill also links to the Government’s White Paper on AI, and this is particularly important: there must be an onus on companies to demonstrate that their systems are safe. The onus must not just be on the user to demonstrate that they have somehow suffered as a consequence of a system’s design. Companies should have to demonstrate that they are designing systems with people’s safety and their rights in mind—be that their rights as workers and citizens, or their rights to have certain safeguards and protections over how their data is used.
Companies creating datasets should be able to demonstrate to the regulator what data they have gathered, how their systems are trained on that data and what it is being used for. It should be easy for the regulator to see and, if the regulator has concerns up front, it should be able to raise them with the company. We must try to create that shift, particularly for AI systems, in how systems are tested before they are deployed, with both safety and the principles set out in the legislation in mind.
My hon. Friend makes a strong point about safety being designed, but a secondary area of concern for many people is discrimination—that is, the more data companies acquire, the greater their ability to discriminate. For example, in an insurance context, we allow companies to discriminate on the basis of experience or behaviour; if someone has had a lot of crashes or speeding fines, we allow discrimination. However, for companies that process large amounts of data and may be making automated decisions or otherwise, there is no openly advertised line of acceptability drawn. In the future it may be that datasets come together that allow extreme levels of discrimination. For example, if they linked data science, psychometrics and genetic data, there is the possibility for significant levels of discrimination in society. Does he think that, as well as safety, we should be emphasising that line in the sand?
My right hon. Friend makes an extremely important point. In some ways, we have already seen evidence of that at work: there was a much-talked-about case where Amazon was using an AI system to aid its recruitment for particular roles. The system noticed that men tended to be hired for that role and therefore largely discarded applications from women, because that was what the data had trained it to do. That was clear discrimination.
There are very big companies that have access to a very large amount of data across a series of different platforms. What sort of decisions or presumptions can they make about people based on that data? On insurance, for example, we would want safeguards in place, and I think that users would want to know that safeguards are in place. What does data analysis of the way in which someone plays a game such as Fortnite—where the company is taking data all the time to create new stimuli and prompts to encourage lengthy play and the spending of money on the game—tell us about someone’s attitude towards risk? Someone who is a risk taker might be a bad risk in the eyes of an insurance company. Someone who plays a video game such as Fortnite a lot and sees their insurance premiums affected as a consequence would think, I am sure, that that is a breach of their data rights and something to which they have not given any informed consent. But who has the right to check? It is very difficult for the user to see. That is why I think the system has to be based on the idea that the onus must rest on the companies to demonstrate that what they are doing is ethical and within the law and the established guidelines, and that it is not for individual users always to demonstrate that they have somehow suffered, go through the onerous process of proving how that has been done, and then seek redress at the end. There has to be more up-front responsibility as well.
Finally, competition is also relevant. We need to guard against the idea of a walled garden for data, whereby companies that already have massive amounts of data, such as Google, Amazon and Meta, can hang on to what they have, while other companies find it difficult to build up meaningful, working datasets. When I was Chairman of the then Digital, Culture, Media and Sport Committee, we considered the way in which Facebook, as it then was, kicked Vine—a short-form video sharing app—off its platform, principally because it thought that that app was collecting too much Facebook user data and was a threat to the company. Facebook decided to deny that particular business access to the Facebook platform. [Interruption.] I see that the Under-Secretary of State for Science, Innovation and Technology, my hon. Friend the Member for Sutton and Cheam (Paul Scully), is nodding in an approving way. I hope that he is saying silently that that is exactly what the Bill will address, to ensure that we do not allow companies with strategic market status to abuse their market power to the detriment of competitive businesses.
I refer the House to my entry in the Register of Members’ Financial Interests.
The Bill has had a curious journey. It started life as the Data Protection and Digital Information Bill, in search of the exciting Brexit opportunities that we were promised, only to die and then rise again as the Data Protection and Digital Information (No. 2) Bill. In the Bill’s rejuvenated—and, dare I say, less exciting—form, Ministers have rightly clawed back some of the highest-risk proposals of its previous incarnation, recognising, of course, that our freedom from the European Union, at least in respect of data protection, is anything but. We may have left the European Union, but data continues to flow between the EU and the United Kingdom, and that means of course that we must keep the European Commission happy to maintain our adequacy decision. For the most part, therefore, the Bill does not represent significant change from the existing GDPR framework. There are some changes to paperwork and the appointment of officers, but nothing radical.
With that settled—at least in my view—the question is this: what is the purpose of this Bill? The Government aim to reduce regulatory burdens on business. To give Ministers credit, according to the independent assessment of the Regulatory Policy Committee, they have adequately set out how that will happen—unlike for other Government Bills in recent weeks. I congratulate the Government on their so-called “co-design” with stakeholders, from which other Departments could learn in drafting legislation. But the challenge in reducing business regulation and co-designing legislation with stakeholders is knowing how much influence the largest, wealthiest voices have had compared with the smallest, least influential ones.
In this Bill—and, I suspect, in the competition Bill as it relates to the digital markets unit, and, if rumours are correct, the media Bill—that means the difference between the voice of big tech and the voice of the people. If reports are correct, I share concerns about the current influence of big tech specifically on Downing Street and about the amount of interference by No. 10 in the drafting of legislation in the Department. [Interruption.] Ministers are shaking their heads; I am grateful for the clarification. I am sure that the reporters at Politico are watching.
Research is a good example of a concern in the Bill relating to the balance between big tech and the people. When I was on the pre-legislative committee of the Online Safety Bill—on which I enjoyed working with the hon. Member for Folkestone and Hythe (Damian Collins), who spoke before me—everybody recognised the need for independent academics to have access to data from the social media companies, for example, to help us understand the harms that can come from using social media. The Europeans have progressed that in the EU’s Digital Services Act, and even the Americans are starting to look at legislation in that area. But in the Bill, Ministers have not only failed to provide that access, but have opted instead to give companies the right to use our data to develop their own products. In practice, that means companies can now use the data they hold on us to understand how to improve their products, primarily and presumably so that we use them more or—for companies that rely on advertising income—to increase our exposure to advertising, in order to create more profit for the company.
All that is, we are told, in the name of scientific research. That does not feel quite right to me. Why might Ministers have decided that that was necessary—a public policy priority—or that it is in any way in the interests of our constituents for companies to be able to do corporate research on product design without our explicit consent, instead of giving independent academics the right to do independent research about online harms, for example? The only conclusion I can come to is that it is because Ministers were, in the co-design process, asked by big tech to allow big tech to do that. I am not sure that consumers would have agreed, and that seems to be an example of big tech winning out in the Bill.
The second example relates to consumer rights and the ability of consumers to bring complaints and have them dealt with in a timely manner. Clause 7 allows for unreasonable delays by companies or data controllers, especially those that have the largest quantities of data on consumers. In practice, that once again benefits big tech, which holds the most data. The time that it can take to conclude a complaint under the Bill is remarkably long and will merely act as a disincentive to bringing a complaint in the first place.
It can take up to two months for a consumer or data subject’s request for access to the data that a company holds on them to be dealt with, then another two months for the company to confirm whether a complaint will be accepted. If a complaint is not accepted, the Information Commissioner then has up to another six months to decide whether the complaint should be accepted, and if the Information Commissioner decides that it should be, the company then has one more month to provide the data, which was originally asked for nine months earlier. The consumer can then look at the data and put in a complaint to the company. If the company does not deal with the complaint, the earliest that the consumer can complain to the Information Commissioner is month 14, and the Information Commissioner then has up to six months to resolve the complaint. All in all, that is up to 20 months of emails, forms, processes and decisions from multiple parties for an individual consumer to have a complaint considered and resolved.
That lengthy and complex complaints process also highlights the risks associated with the provisions in the Bill relating to automated decision-making. Under current law, solely automated decision-making is prohibited where it relates to a significant decision, but the Bill relaxes those requirements and ultimately puts the burden on a consumer to successfully bring a complaint against a company taking a decision about them in a wholly automated way. Will an individual consumer really do that when it could take up to 20 months? In the world we live in today, the likes of ChatGPT and other large language models will revolutionise customer service processes. The approach in the Bill seems to fail to regulate for the future and, unfortunately, deals with the past. I ask again: which stakeholder group asked the Government to draft the law in this complex and convoluted way? It certainly was not consumers.
In other regulated sectors and areas of law, such as consumer law, we allow representative bodies to bring what the Americans call “class actions” on behalf of groups of consumers whose rights have been infringed. That process is perfectly normal and exists in UK law today. Experience shows that representative bodies such as Citizens Advice and Which? do not bring class actions easily because it is too financially risky. They therefore bring an action only when there is a clear and significant breach. So why have Ministers not allowed for those powers to exist for breaches of data protection law in the same way that the European Union has, when we are very used to them existing in UK law? Again, that feels like another win for big tech and a loss for consumers. Reducing unnecessary compliance burdens on business is of course welcome, but the Government seem to have forgotten that data protection law is based on a foundation of protecting the consumer, not being helpful to business.
On a different subject, I highlight once again the ongoing creep of powers being taken from Parliament and given to the Executive. We have already heard about the powers for the Secretary of State to make amendments to the legislation without following a full parliamentary process. That keeps happening—not just in this Bill but in other Bills this Session, including the Online Safety Bill. My Committee, which has whole-of-Government scrutiny powers in relation to good regulation, has reprimanded the Department—albeit in its previous form—for the use of those Henry VIII powers. It is disappointing to see them in use again.
The Minister, in response to my hon. Friend the Member for Weaver Vale (Mike Amesbury), said that the Government had enhanced oversight of the Information Commissioner by giving themselves power to direct some of its legitimate interests or decisions, or the content of codes. I politely point out that the Information Commissioner regulates the Government’s use of our data. It seems odd to me that the Government alone are being given enhanced powers to scrutinise the Information Commissioner, and that Parliament has not been given additional oversight; that ought to be included.
The Government have yet to introduce any substantive legislation on biometrics. Biometric data is the most personal type of data, be it about our faces, our fingerprints, our voices or other characteristics that are personal to our bodies. The Bill does not even attempt to bring forward biometric-specific regulation. My private Member’s Bill in the 2019-21 Session—now the Forensic Science Regulator Act 2021—originally contained provisions for a biometrics strategy and associated regulations. At the then Minister’s insistence, I removed those provisions, having been told that the Government were drafting a more wide-ranging biometrics Bill, which we have not seen. That is especially important in the light of the Government’s artificial intelligence White Paper, as lots of AI is driven by biometric data. We have had some debate on the AI White Paper, but it warrants a whole debate, and I hope to secure a Westminster Hall debate on it soon. We need to fully understand the context of the AI White Paper as the Bill progresses through Committee and goes to the other place.
I am conscious that I have had an unusual amount of time, so I will finish by flagging two points, which I hope the Parliamentary Under-Secretary of State for Science, Innovation and Technology will respond to in his summing-up. The first is the age-appropriate design code. I think we all agree in this House that children should have more protection online than other users. The age-appropriate design code, which we all welcomed, is built on the foundation of the GDPR. There are concerns that the changes in the Bill, including the new powers of the Secretary of State, could undermine the age-appropriate design code. I invite the Minister to reassure us, when he gets to the Dispatch Box, that the Government are absolutely committed to the current form of the age-appropriate design code, despite the changes in the Bill.
The last thing I invite the Minister to comment on is data portability. It will drive competition if companies are forced to allow us to download our data in a way that allows us to upload it to another provider. Say I wanted to move from Twitter to Mastodon; what if I could download my data from Twitter, and upload it to Mastodon? At the moment, none of the companies really allow that, although that was supposed to happen under GDPR. The result is that monopolies maintain their status and competitors struggle to get new customers. Why did the Government not bring forward provision for improved data portability in the Bill? To draw on a thread of my speech, I fear that it may be because that is not in the interests of big tech, though it is in the interests of consumers.
I doubt that I will be on the Bill Committee. I am sorry that I will not be there with colleagues who seem to have already announced that they will be on it, but I am sure that they will all consider the issues that I have raised.
This Bill provides us with yet another opportunity to ensure that our legal and regulatory frameworks are tailored to our needs and specifications, now that we are free from the confines of EU law. It is crucial that we have a data rights regime that maintains the high data protection standards that the public expect, but it must do so in a way that is not overly burdensome to businesses and public services, and does not stifle innovation, growth and productivity. The Bill will go a long way to achieving that, but I would like to focus on one small aspect of it.
Announcing the First Reading of the Bill, the Secretary of State stated that it would improve
“the efficiency of data protection for law enforcement and national security partners, encouraging better use of personal data where appropriate to help protect the public. It provides agencies with clarity on their obligations, boosting the confidence of the public on how their data is being used.”—[Official Report, 8 March 2023; Vol. 729, c. 20WS.]
That is a positive step forward for national security, but we are missing a crucial opportunity to introduce further reforms that will reduce administrative burdens on police forces across the UK.
I recently met members of the Leicestershire Police Federation, who informed me of the association’s concerns regarding part 3 of the Data Protection Act 2018. Specifically, the Police Federation is concerned about how the requirements of part 3 interact with the Crown Prosecution Service’s “Director’s Guidance on Charging”, which obliges the police to provide more information to the CPS pre-charge. That information includes unused material, digitally recovered material and third-party material, all of which must be redacted in accordance with the Data Protection Act.
Combined, the guidance’s requirements and the provisions of the Act represent a huge amount of administrative work for police officers, who would have to spend hours making the necessary redactions. Furthermore, much of that work may never be used by the CPS if no charge is brought, or the defendant pleads guilty before trial. Nationally, around 25% of cases submitted to the CPS result in no charge. This desk-based work would remove police officers from the frontline.
Picture the scene of an incident. Say that 10 police officers attend, all turning on their body cameras as they arrive. They deal with different aspects of the incident; they talk to a variety of people and take statements, standing in different positions that result in different backgrounds to the video footage and different side-conversations being captured. The lead officer then spends hours, if not days, redacting all the written data and video footage generated by all the officers, only for the redacted data to be sent to a perfectly trusted source, the CPS, which will not necessarily take the case forward.
The data protection Bill is meant to update and simplify the data protection framework used by bodies in the UK. The Bill refers to the work of the police in national security situations, but it should also cover their day-to-day work as a professional body. They should be able to share their data with the CPS, another professional body. Both have a legitimate interest in accessing and sharing the data collected. My hon. Friend the Minister for Data and Digital Infrastructure will know that this is an issue, as I have already raised it with her. I am very grateful for her considered response, and for the Government’s commitment to looking into this matter further, including in the context of this Bill, and at whether the Police Federation’s idea of a data bubble between the police service and the CPS is a workable solution.
I look forward to working with the Government on the issue. It is vital that we do what we can to ease the administrative burden on police officers, so that we can free up thousands of policing hours every year and get police back to the frontline, where they can support communities and tackle crime. Speaking of easing burdens, may I also take this opportunity to wish my hon. Friend the Minister the very best with the arrival that is expected in, I suspect, the none-too-distant future?
My interest in this debate comes from my representing a science and research city, where data, and transferring it, is key, and from my long-term background in information technology. Perhaps as a consequence of both, back in 2018 I was on the Bill Committee that had the interesting task of implementing GDPR, even though, as my hon. Friend the Member for Bristol North West (Darren Jones)—my good friend—pointed out at the time, none of us had the text in front of us. I think he perhaps had special access to it. In those long and complicated discussions, there were times when I was not entirely sure that anyone in the room fully gripped the complexity of the issues.
I recall that my right hon. Friend the Member for Birmingham, Hodge Hill (Liam Byrne) persistently called for a longer-term vision that would meet the fast-changing challenges of the digital world, and Labour Members constantly noted the paucity of resources available to the Information Commissioner’s Office to deal with those challenges, notwithstanding yellow-vested people entering offices. Five years on, I am not sure that much has changed, because the Bill before us is still highly technical and detailed, and once again the key issues of the moment are being dodged.
I was struck by the interesting conversations on the Conservative Benches, which were as much about what is not tackled by the Bill as what is—about the really hot issues that my hon. Friend the Member for Manchester Central (Lucy Powell) mentioned in her Front-Bench speech, such as ChatGPT and artificial intelligence. Those are the issues of the moment, and I am afraid that they are not addressed in the Bill. I make the exact point I made five years ago: there is a risk of hard-coding previous prejudice into future decision making. Those are the issues that we should be tackling.
I chair the all-party parliamentary group on data analytics, which is carrying out a timely review of AI governance. I draw Members’ attention to a report made by that group, with the help of my hon. Friend the Member for Bristol North West, called “Trust, Transparency and Technology”. It called for, among other things, a public services licence to operate, and transparent, standardised ethics and rules for public service providers such as universities, police, and health and care services, so that we can try to build the public confidence that we so need. We also called for a tough parliamentary scrutiny Committee, set up like the Public Accounts Committee or the Environmental Audit Committee, to make sure the public are properly protected. That idea still has strong resonance today.
I absolutely admit that none of this is easy, but there are two particular areas that I would like to touch on briefly. One, which has already been raised, is the obvious one of data adequacy. Again, I do not feel that the argument has really moved on that much over the years. Many of the organisations producing briefings for this debate highlight the risks, and back in 2018—as I think the right hon. Member for Maldon (Sir John Whittingdale) pointed out—there were genuine concerns that we would not necessarily achieve an adequacy agreement with the European Union. Frankly, it was always obvious that this was going to be a key point in future trade negotiations with the EU and others, and I am afraid that that is the way it has played out.
It is no surprise that adequacy is often a top issue, because it is so fundamentally important, but that of course means that we are weakened when negotiations turn to other areas. Put crudely, to get the data adequacy agreements we need, we are always going to be trading away something else, and while in my opinion the EU is ultimately unlikely to withhold adequacy, the truth is that it can, and it could. That is a pretty powerful weapon. On the research issues, I would just like to ask the Minister whether, in summing up, he could comment on the concerns that were raised back in 2018 about the uncertainty for the research sector, and whether he is confident that what is proposed now—in my view, it should have been done then—can provide the clarity that is needed.
On a more general note, one of the key Cambridge organisations has pointed out to me that, in its view, it is quite hard to see the point of this Bill for organisations that are operating globally because, as the EU GDPR has extraterritorial effect, they are still going to need to meet those standards for much of what they do. It would simply be too complicated to try to apply different legal regimes to different situations and people. That is the basic problem with divergence: when organisations span multiple jurisdictions, taking back control is frankly meaningless. Effectively, it cedes control to others without having any influence—the worst of all worlds. That organisation also tells me that it has been led to believe by the Government, as I think was echoed in some of the introductory points, that any organisation wishing to carry on applying current legal standards will, by default, meet those in the new Bill. It is sceptical about that claim, and it would like some confirmation, because it rightly wonders how that can be the case when new concepts and requirements are introduced and existing ones amended.
There is much, much more that could be said, has been said and will be said by others, including genuine concerns about the weakening of rights around subject access requests and some of the protections against algorithmic unfairness. Those need to be tested and scrutinised in Committee; frankly, this much cannot simply be left to ministerial judgment. Huge amounts of data are now held about all of us, and the suspicion is rightly held that decisions are sometimes made without our knowledge, decisions that can have a direct impact on our lives. I think we can all agree that data used well can be transformative and a power for good, but that absolutely relies on confidence and trust, which in turn require a strong regulatory framework that engenders that trust. It feels to me as though this Bill fails to meet some of those challenges. It needs to be strengthened and improved.
It is a pleasure to follow the speech of the hon. Member for Cambridge (Daniel Zeichner), and in fact, I have enjoyed listening to the various contributions about the many aspects of the many-headed hydra that the data Bill represents. In particular, the point made by the hon. Member for Manchester Central (Lucy Powell) about interoperability and the one made by the hon. Member for Glasgow North West (Carol Monaghan) about hurdles are points I will be returning to briefly.
I welcome the fact that we have a Bill that focuses on data. Data is the new oil, as they say, and it is essential that we grapple with the implications of that. If an example is needed, data was critical in our fight against covid-19. Data enabled the rapid processing of new universal credit applications. Data meant that we could target funds into business accounts quickly to make sure that furlough payments were made. Data gave us regular updates on infection rates, and data underpinned the research into vaccines, their rapid roll-out, and their reporting to the right people, at the right time and in the right place. We have also seen that data on all those matters was questioned at every step of the way, both at the time and continuously since.
Data matters. This Bill matters: it gives us an opportunity to redefine our regulatory approach, as the hon. Member for Cambridge alluded to. It also provides a clearer and more stable framework for appropriate international transfers of personal data—I stress the word “appropriate”. In addition, it is welcome that the Bill extends data-sharing powers, enabling the targeting of Government services to support business growth more effectively and deliver joined-up public services, which will be the thrust of my contribution. I also welcome the Bill’s delivery of important changes to our everyday lives. Whether it is an increase in financial penalties for those behind nuisance calls, addressing the number of cookie pop-ups on web browsers that we use every day, or providing a trusted framework for digital verification services, these are important updates in protecting everyday lives that are, in part, lived online now. That is to be welcomed—provided, again, that the necessary safeguards are in place.
I will give the bulk of my time to focusing on another area in which I think the Bill could go much further. The Bill recognises that, for public services to operate efficiently, safely and with effective scrutiny, data should be collected, presented, processed and shared in a consistent way, yet it is frustrating that the current scope of the Bill is for such information standards to apply in England only.
I am going to use health as an example to illustrate my point. In Aberconwy, we are experiencing severe, systematic failings in the delivery of health services across north Wales. The health board has been under special measures for six of the past eight years, and in their latest intervention, the Welsh Government have just sacked the non-executive members of the board. It therefore comes as little surprise that health is the No. 1 domestic concern for constituents across north Wales, or that my constituents put it into our plan for Aberconwy. This is not an exercise in point scoring, but in this Bill, I see an opportunity to help to tackle that problem. Wales is linked to the rest of the UK, historically and today, on an east-west axis for family, business, leisure and public services. Our health and social care services in north Wales rely on working and sharing information with colleagues in England—with hospitals in Chester, Stoke and Liverpool. However, sharing that data, which relies on the interoperability that the hon. Member for Manchester Central referred to, often presents an obstacle to care.
Of course, I recognise and respect that health is a devolved matter under the remit of the Welsh Government in Cardiff Bay, but one of the arguments made in favour of Welsh devolution 25 years ago was that it would enable learning from comparisons between different policy approaches across the UK, exposing underperformance as well as celebrating successes. To do so, though, we must have comparable and reliable data. If this sounds familiar, it is because I made exactly that point in the debate on the Health and Care Bill back in November 2021. At that time, working with hon. Friends from across north Wales, we showed that we had overwhelming support from patients—they agreed that data must be shared. The healthcare professionals we spoke to also agreed that data needed to be shared. The IT experts we consulted agreed that data must and could be shared, and the local administrators, community groups and civil servants we spoke to also told us that data needed to be shared. However, the reality is that data in different parts of the UK is currently often not comparable, nor is the timing of its publication aligned.
Again, I have focused today on health as a pressing and urgent example of the need for sharing data, but these points apply across our public services. Indeed, my hon. Friend the Member for Loughborough (Jane Hunt) gave an excellent and powerful practical example of how the rules on data sharing in policing inadvertently introduce all sorts of unnecessary barriers. As much as I have spoken about health, these points apply equally to the education of our children, the wellbeing of our grandparents, the skilling of our workforce, levelling up our communities, ensuring fair and competitive environments for business across the UK, and more—not least the future of our environment.
I repeat: good data is essential for good services. I recognise the good work that is going on in the Office for National Statistics, with the helpful co-operation of devolved Administrations, but now is the time, and this Bill the opportunity, for the Government to consider amending the Bill in Committee to mandate agreement on, and the collection and publication of, key UK-wide data for public services. That data should be timely, accessible and interoperable.
All Administrations will already hold data for the operation of public services, but comparability and interoperability will allow professionals and planners to assign resources and guide interventions where they are needed most. It will allow patients and users of public services to make informed decisions about where to be treated, where to live and where to seek those services. It will also allow politicians like me to be held to account when services fail. I do not believe that such an amendment would divide the House in compassion or in common sense.
In conclusion, I know our Prime Minister understands the importance of data. He seeks to put it at the heart of a modern, innovative, dynamic and thriving UK, but it must be good data that flows through our veins and to all parts of our nation if it is to animate us and make the UK a success. For that reason, we need to go further. We need to ensure data comparability and interoperability across all parts of the UK. I look forward to hearing the Minister’s closing remarks.
I start by echoing the well wishes to the Secretary of State on her imminent arrival. I am delighted to be here in my first outing as the Lib Dem spokesperson for science, innovation and technology, although in my mind I consider it the role of spokesperson for proud geeks. I appreciate that that is not a term everyone likes, but as a physics graduate and an MP for Oxford, where we have many like-minded geeks, I am proud to call myself one.
Much as this important Bill is geeky and technical—it sounds like it will be an interesting Bill Committee—it integrates into our whole lives. People have spoken about the potential and the progress, and I agree to an extent with the comment from the hon. Member for Aberconwy (Robin Millar) about data being the new oil. However, in the context of climate change, there is a lesson for us there: imagine if we had known then what we know now. We can already see the parallel here. As new as some of these technologies are, and as new as some of these challenges may be, it does feel as though, as legislators, we are constantly playing catch-up with this stuff.
We consult and we look, and we know what the problems are and what the issue fundamentally is, but I agree with the hon. Member for Cambridge (Daniel Zeichner) that we need a bit of vision here. I would argue that what we need is what my former colleague, the former Member for East Dunbartonshire, called for: a code of ethics for data and artificial intelligence. I sincerely hope that the Government, with the extra power to the elbow of the new Department, can put some real resource behind that—not in White Papers and thought, but in a proper piece of legislation that answers some of the questions raised earlier about, for example, the moral use of artificial intelligence in war.
Those are important questions. The problem and worry I have is that this Government and others will find themselves constantly on the back foot, unless we talk not just about the geekery and the technical bits—by the sounds of it, there are enough of us in the House who would enjoy doing that—but about the slightly loftier and more important ways that this Bill will connect with society.
In the digital-first age, the Government themselves are encouraging those who want to access benefits and every other part of the state to do so digitally. To be a full citizen of the state, someone is often required to hand over their data. If someone does not want to engage with the digital realm, it is difficult for them to access the services to which they are entitled. Those are some of the big issues that encircle this Bill. It is fair to make that point on Second Reading, and I urge the Government, and especially the new Department, to give serious thought to how they will knit all this together, because it is incredibly important.
The Liberal Democrats have a few issues with the Bill. I associate myself with the remarks of the hon. Member for Bristol North West (Darren Jones), and in particular what he said in asking who is at the centre of the Bill, which is incredibly important. As liberals, we believe it should always be the citizen. Where there is a conflict of interest between the citizen, business and the state, in our view and in our political ideology, the citizen always comes top. I am not convinced that has been at the heart of the Bill at points. Citizens have been thought about, but were they at the centre of it at every stage? I am afraid that our ability as individuals to access, manipulate and decide who has our data has at various stages got lost.
The concerns we share with others are in four main areas: the Bill will undermine data rights; it will concentrate power with the Secretary of State—notwithstanding potential change in government, that is the sort of thing that Parliament needs to think about in the round, regardless of who is in power; the Bill will further complicate our relationship with Europe, as some have mentioned; and it sets a worrying precedent.
We need to understand where we start from. Only 30% of people in the UK trust that the Government use their data ethically. That means that 70% of people in the UK do not. Polls across the world have shown roughly the same thing. That is a huge level of mistrust, and we need to take it seriously. The Open Rights Group has described the Bill as part of a deregulatory race to the bottom, as the rights and safeguards of data subjects could be downgraded because of the changes proposed.
Clause 5 and schedule 1 to the Bill introduce a whole set of legitimate interests for processing data without consent, with few controls around their application. The Bill changes the definition of personal data, which would reduce the circumstances in which that information is protected. It reforms subject access requests, as others have said. We all run our own small businesses in our offices as MPs, so we understand the burden placed on small businesses in particular, but it is absolutely the right of the individual to find out what is held on them in the way that subject access requests allow. If there is a conflict, it is the right of the individual that needs to be protected. The Government assess that the proposal would save about £82 a year—a cost worth bearing, I would argue, given the number of consumers whom those businesses, on average, are looking after. There is an important hierarchy of interests here that is not entirely captured by what the Government have said so far.
Big Brother Watch has said:
“The revised Data Protection and Digital Information Bill poses serious threats to Brits’ privacy. The Government are determined to tear up crucial privacy and data protection rights that protect the public from intrusive online surveillance and automated decision-making in high-risk areas. This bonfire of safeguards will allow all sorts of actors to harvest and exploit our data more than ever before. It is completely unacceptable to sacrifice the British public’s privacy and data protection rights on the false promise of convenience.”
I am deeply concerned that far from restoring confidence in data protection, the Bill sets a dangerous precedent for a future in which rights and safeguards are undermined. I have listened to what the Secretary of State has said at the Dispatch Box. I sincerely hope that those safeguards that the Government want to keep in place will remain in place, but we should be listening to those third-party groups that have scrutinised this Bill in some detail. There are legitimate concerns that need to be addressed.
My other concern is the concentration of power with the Secretary of State. As I have said before, while it would be lovely to think that all Secretaries of State and all Governments will think the same on this and that we all have the same principles, my deep concern is that one day that will not be the case. There is an important part for Parliament to play, especially when legislation is running behind what is happening in society, in raising the issues in real time. My worry is that by acting through secondary legislation, which we end up scrutinising less and less often, the Government leave no mechanism for Parliament to feed in as society changes, which can happen year on year. We need some way, whether through a Select Committee or otherwise, to keep pace with changes in society.
Finally, I want to talk about adequacy, and in particular the real concern that it could be lost. I am pleased to hear that raised on all sides of the House, which is a good sign, but I hope that this is not a case where little then gets changed in the Bill, as we have seen many times over. We can have it both ways: we can diverge from EU standards if we make the protection of citizens’ rights stronger. Some who have mentioned divergence, however, have spoken about a weakening, which I worry will lead to a loss of adequacy.
In closing, will the Minister give a cast-iron guarantee to businesses that rely on it—and to our researchers who equally rely on it—that adequacy will not be watered down but will be one of the key tenets of how we move forward? Certainty for businesses and our researchers is incredibly important, and if there is any suggestion that changes in the Bill will affect that, they must be pulled immediately.
It is a pleasure to add some comments and make a contribution, and to have heard all the right hon. and hon. Members’ speeches as I have sat here tonight. There will not be any votes on the Bill, I understand, but if there had been, my party would have supported the Government, because I think the intention of the Minister and the Government is to try to find a correct way forward. I hope that some of the tweaking that is perhaps needed can happen in a positive way that addresses those issues. It is always good to speak in any debate in this House, and this being the first after the recess, I am very pleased to take part. I have spoken on data protection and its importance in the House before, and I again wish to make a contribution, specifically on medical records and the protection of health data with regard to GP surgeries. I hope to address that with some questions for the Minister at the end.
Realistically, data protection is all around us. I know all too well from my constituency office that there are guidelines. There are procedures that my staff and I must follow, and we follow them very stringently. It is important that businesses, offices, healthcare facilities and so on are aware of the guidelines they must follow, hence the necessity of this Bill. As I have said, if there had been a vote, we would have supported the Government, but it seems that there will not be one tonight. Exposed data has the potential to fall into the wrong hands, posing dangers to people and organisations, so it is good to be here to discuss how we can prevent that, with the Government presenting the legislation tonight and taking it through Committee when the time comes.
I have recently had some issues with data protection myself—a prime example of how mistakes can happen and how important data can end up in the wrong place—when on two occasions the Independent Parliamentary Standards Authority accidentally published personal information about me and my staff online. It did not do so on purpose—it was an accident, and it retrieved the data very quickly—but it has happened twice at a time of severe threat in Northern Ireland and a level of threat on the mainland as well. Although the matter was quickly resolved, it is a classic example of the dangers posed to individuals.
I am sure Members are aware that the threat level in Northern Ireland has been increased. Despite there being external security for Members outside the office, I have recently installed CCTV cameras in my office for the security of my staff, which, though a lesser matter in comparison, is my responsibility. I have younger staff members in their 20s who live on their own, and staff who are parents of young children, and they deserve to know that they are safe. Anxieties have been raised because of the data disclosure, and I imagine that many others have experienced something similar.
I want to focus on issues about health. Ahead of this debate, I have been in touch with the British Medical Association, which raised completely valid concerns with me about the protection of health data. I have a number of questions to ask the Minister, if I may. The BMA’s understanding of the Bill is that the Secretary of State or the Minister will have significant discretionary powers to transfer large quantities of health information to third countries with minimal consultation or transparent assessment about how the information will benefit the UK. That is particularly worrying for me, and it should be worrying for everyone in this House. I am sure the Minister will give us some clarification and some reassurance, if that is possible, or tell us that this will not happen.
There is also concern about the Secretary of State having the power to transfer the same UK patients’ health data to a third country if it is thought that that would benefit the UK’s economic interests. I would be very disturbed, and quite annoyed and angry, if such a direction were allowed. Again, the Minister may wish to comment on that at the end of the debate. I would be grateful if the Minister and his Department provided some clarity for the BMA about what the consultation process will be if information is to be shared with third countries or third-party organisations.
There have also been concerns about whether large tech and social media companies are storing data correctly and upholding individuals’ rights and privacy properly. We must always represent our constituents, and the Bill must ensure that the onus of care is placed on tech companies and organisations to store data legally, safely and correctly. The safety and protection of data is paramount. We could not possibly vote for a Bill that undermined trust, furthered economic instability and eroded fundamental rights. Safeguards must be in place to protect people’s privacy, and that starts in the House today with this Bill. Can the Minister assure me and the BMA that our data will be protected and not shared willy-nilly with Tom, Dick and Harry? As I have said, protection is paramount, and we need to have it in place.
To conclude, we have heard numerous stories both from our constituents and in this place about the risks of ill-stored and unprotected data. The Bill must aim to retain high data protection standards without creating unnecessary barriers for individuals and businesses. I hope that the Minister and his Department can answer the questions we may have to ensure that the UK can be a frontrunner in safe and efficient data protection. We all want that goal. Let us make sure we go in the right direction to achieve it.
I would like to add my best wishes to the Minister and the Secretary of State on their imminent arrivals.
We are in the midst of a tech revolution, and right at the centre of this is data. From social media and online shopping to the digitisation of public services, the rate at which data is being collected, processed and shared is multiplying by the minute. This new wealth of data holds great potential for innovation, boosting economic growth and improving the delivery of public services. The aims of the Bill to unlock the economic and societal benefits of data while ensuring strong, future-proofed privacy rights are therefore ones that we support. We welcome, for example, provisions to modernise the ICO structure, and we support provisions for the new smart data regimes, so long as there are clear requirements for impact assessments.
However, the Bill in its current form does not go far enough in actually achieving those aims. Its narrow approach and lack of clarity render it a missed opportunity to implement a truly innovative and progressive data regime. Indeed, in its current form, many clarifications will be needed to reassure the public that their rights will not be weakened by the Bill while sweeping powers are awarded to the Secretary of State. Currently, the Bill defines solely automated processing as processing with “no meaningful human involvement” that results in a “significant decision”, with the Secretary of State entrusted with powers to amend what counts within that definition. The lack of detail on the boundaries of such definitions, as well as their ability to change over time, has concerned the likes of the Ada Lovelace Institute and the TUC.
The Chair of the Business, Energy and Industrial Strategy Committee, my hon. Friend the Member for Bristol North West (Darren Jones), outlined in his powerful speech the power imbalance between big tech and the people, which is an important insight and a challenge for us in this House. Indeed, just this month Uber was found to have violated the rights of three UK-based drivers by firing them without appeal on the basis of fraudulent activity picked up by its automated decision-making system. In its judgment, the court found that the limited human intervention in Uber’s automated decision process was not
“much more than a purely symbolic act”.
This case, and the justice the drivers received, therefore relied explicitly on current legislation in the form of article 22 of the UK GDPR and on a clear understanding of what constitutes meaningful human involvement. Without clear boundaries for defining significant decisions and meaningful human involvement, the Bill therefore risks removing the exact rights that won this case and creating an environment in which vital safeguards, such as the right to contest automated decisions and request human intervention, could easily be disapplied at the whim of the Secretary of State. This must be resolved, and the public must be reassured that they will not be denied a job, mortgage or visa by an algorithm without a method of redress.
There is also a lack of clarity around how the rules allowing organisations to charge a fee for, or refuse, subject access requests deemed “vexatious” or “excessive” will work, as the likes of Which? and the Public Law Project have argued and as my hon. Friend the Member for Cambridge (Daniel Zeichner) highlighted. Indeed, if the list of circumstances in which those terms might be met is non-exhaustive, what safeguards will be in place to stop controllers abusing this and deciding that any request they dislike is vexatious? Organisations should absolutely be supported in directing resources to good-faith requests, but we must be careful to ensure that any new limits are protected against abuse.
Reform of the responsibilities of the Information Commissioner’s Office is another area in need of analysis. Indeed, more than evolving its structure, the Bill gives the Secretary of State power to set the strategic priorities of the regulator and approve codes of practice. This has sparked concern across the spectrum of stakeholders, from the Open Rights Group to techUK, over what it means for the regulator’s independence. Given these new powers, particularly in cases where guidance addresses the activity of the Government, how can Ministers assure us that a Secretary of State will not be marking their own homework?
Whether it is the Secretary of State being able to amend the “recognised legitimate interests” list or the removal of the requirement for consultation on impact assessments, the same theme echoes throughout the Bill, as the hon. Member for Oxford West and Abingdon (Layla Moran) noted. Without additional guidance and clear examples of how definitions apply, it is hard to grasp the full extent of the consequences of these new measures, especially given the sweeping powers of the Secretary of State to make further changes. We will look to ensure that this clarity is included in the Bill, so that everyone can be assured of their rights and of a truly independent regulator. We must also ensure that children are protected by the Bill and that the age-appropriate design code is not compromised, as raised by the hon. Member for Folkestone and Hythe (Damian Collins) and others across the House.
Clarity on the new regime is also vital for reassuring businesses that still have fears about losing EU adequacy, something raised throughout this debate and outlined by the former Secretary of State, the right hon. Member for Maldon (Sir John Whittingdale), in his contribution. The Government have said that they recognise that losing adequacy would be disastrous, costing up to £460 million as a one-off and £410 million every year afterwards. Ministers have rightly rowed back on many of the more concerning suggestions from their consultation, but they must be absolutely clear on how they are sure that the measures in the Bill, particularly those that toy with the regulator’s independence and give Ministers power to create further change, will not threaten adequacy.
Businesses have already made significant adjustments to comply with the UK GDPR, so the changes in the Bill must also be careful not to create further uncertainty for them. Indeed, although Ministers say that anyone who abides by the current rules will still be compliant after the passing of the Bill, organisations will still have to do their own legal due diligence to understand how, if at all, this set of amendments affects them. It would therefore be good to hear from Ministers how they plan to ensure that businesses, particularly small and medium-sized enterprises, are supported in understanding the requirements on them.
We understand the Government’s attempts to future-proof this legislation, and it would be great to see an end to constant cookie banners and nuisance calls, which the hon. Member for Aberconwy (Robin Millar) referenced, but the measures in the Bill rely on technology that does not yet exist in operational form. In the case of browser-enabled cookie models, there is also the concern that this may entrench power in the hands of existing tech giants and muddy the waters on liability. We must be careful, therefore, to ensure that businesses can actually implement what the Bill requires.
Ultimately, with the exception of the section on smart data, this Bill takes a very narrow view of what an innovative data regime could look like. In the context of a rapidly changing world, this Bill was a great opportunity to consider properly how we can get data working in the interests of the general public and small businesses. Labour would have used a Bill like this to examine, for example, how data can empower communities and collective groups, such as workers in industries who have long felt that they are on the wrong end of automated decision-making, as well as the automation of jobs.
We would also have sought to improve public trust in, and understanding of, how our data is used, particularly since the willingness to share data has been eroded by the likes of the Cambridge Analytica scandal, the NHS data opt-out and the exam algorithm scandal, which disproportionately affected my constituents in Barnsley. As it stands, however, the Bill seems to consider data rights only when they emerge as a side product of changing the rules for processors. Data rights and data protection have wide-ranging consequences across society, as the hon. Member for Strangford (Jim Shannon) discussed. Labour would have used this as an opportunity to look at the larger picture of data ownership. Deregulation measures such as those in the Bill might mean less work for some small businesses, but as long as a disproportionate amount of data is held by a limited number of firms, small businesses will still be at a large competitive disadvantage. From introducing methods of collective redress to nurturing privacy-enhancing technologies, there are many positive opportunities that a progressive data Bill could have explored to put our country at the forefront of innovation while genuinely strengthening rights and trust for the modern era, but the Government have missed this opportunity.
Overall, we can all agree on unlocking innovation through data while ensuring data subjects have the rights and trust they fundamentally deserve. However, there are many areas for clarity and improvement if this Bill is to match the bold vision required to truly be at the forefront of data use and data protection. I look forward to working closely with Ministers in the coming months towards legislation that better fulfils these aims.
I thank all Members for their contributions, including the hon. Members for Manchester Central (Lucy Powell), for Glasgow North West (Carol Monaghan), for Bristol North West (Darren Jones), for Cambridge (Daniel Zeichner), for Oxford West and Abingdon (Layla Moran), for Strangford (Jim Shannon) and for Barnsley East (Stephanie Peacock) and my right hon. Friend the Member for Maldon (Sir John Whittingdale) and my hon. Friends the Members for Folkestone and Hythe (Damian Collins), for Loughborough (Jane Hunt) and for Aberconwy (Robin Millar). The debate has been held in the right spirit, understanding the importance of data, and I will try to go through a number of the issues raised.
Adequacy has come up on a number of occasions. We have been straight from the beginning that adequacy is very important, and we work with the European Commission on this; we speak to it on a regular basis. It is important to note, though, that the EU does not require exactly the same rules to be in place for a country to be adequate—we can see that from Japan and New Zealand—so we are trying to get the balance right, making sure that we remain adequate not just with the EU but with other countries with which we want to have data bridges and collaboration. We are also making sure that we can strip back some of the bureaucracy, not just for small businesses but for public services including GPs, schools and similar institutions, as well as protecting the consumer, which must always be central.
Automated decision-making was also raised by a number of Members. The absence of meaningful human intervention in solely automated decisions, along with opacity in how those decisions can be reached, will be mitigated by providing data subjects with the opportunity to make representations about, and ultimately challenge, decisions of this nature that are unexpected or seem unwarranted. For example, if a person is denied a loan or access to a product or services because a solely automated decision-making process has identified a high risk of fraud or irregularities in their finances, that individual should be able to contest that decision and seek human review. If that decision is found to be unwarranted on review, the controller must re-evaluate the case and issue an appropriate decision.
Our reforms address the uncertainty over the application of safeguards. They will clarify when safeguards apply, to ensure that they are available in appropriate circumstances. We will develop that with businesses and other organisations in guidance.
The hon. Member for Glasgow North West talked about joint-working designation notices. It is important to note that the police and the intelligence services work under different data regimes, which can make joint working more difficult. Many of the changes made in this Bill come from the lessons learned from the Fishmongers’ Hall terrorist attack and the Manchester Arena bombing.
Members raised the question of algorithmic bias. We agree that it is important that organisations are aware of potential biases in data sets and algorithms, and bias monitoring and correction can involve the use of personal data. As we set out in our response to the consultation on the Bill, we plan to introduce a statutory instrument that will provide for the monitoring and correction of bias in AI systems by allowing the processing of sensitive personal data for this purpose, with appropriate safeguards. However, as we know from the AI White Paper we published recently, this is a changing area, so it is important that we in Government remain able to flex in the context of AI and that type of decision-making.
The hon. Member for Bristol North West talked about biometrics. Biometric data is classed as sensitive data under the UK GDPR, so it is already provided with additional protection; it can be processed only if a relevant condition is met under article 9 of the UK GDPR or schedule 1 to the Data Protection Act 2018. That requirement provides sufficient safeguards for biometric data. There are significant overlaps in the current oversight framework, which is confusing for the police and the public and inhibits innovation. That is why the Bill simplifies the oversight of biometrics and overt surveillance technologies.
The hon. Gentleman talked about age-appropriate guidance. We are committed to protecting children and young people online. The Bill maintains the high standards of data protection that our citizens expect and organisations will still have to abide by our age-appropriate design code. Any breach of our data protection laws will result in enforcement action by the Information Commissioner’s Office.
The hon. Gentleman also talked about data portability. The Bill increases data portability by setting up smart data regulations. He talked about social media, but it is far wider than that. Smart data is the secure sharing of customer data with authorised third parties at the customer’s request. Those third parties can then use that data to provide innovative services for the consumer or business user, utilising AI and data-driven insights to empower customer choice. Services may include clearer account management across providers, easier switching between offers or providers, and advice on how to save money. Open banking is an obvious live example, but the smart data changes in the Bill will turbocharge the use of such services.
My hon. Friend the Member for Loughborough talked about policing. The Bill will save 1.5 million police hours, but it is really important that we do more. We are looking at ways of easing redaction burdens for the police while ensuring that we maintain victim and witness confidence. It is really important to victims and witnesses, and in the interests of public trust, that the police do not share information that is not relevant to a case with other organisations, including the Crown Prosecution Service and the defence. Removing that information, as my hon. Friend says, places a resource burden on officers. We will continue to work with the police and the Home Office on that basis.
On UK-wide data standards, raised by my hon. Friend the Member for Aberconwy, improving access to comparable data and evidence from across the UK is a crucial part of the Government’s work to strengthen the Union. The UK Government and the Office for National Statistics have an ongoing and wide-ranging work programme to increase the coherence of data across the nations, as my hon. Friend is aware. We remain engaged in discussions and will continue to work with him, the Wales Office and the ONS to ensure that that work continues.
On international data transfers, it is important that we tackle the uncertainties and instabilities in the current regime, but the hon. Member for Strangford is absolutely right that, in doing so, we must maintain public trust in the transfer system.
Finally, on the ICO, we believe that the Bill does not undercut its independence. For the trust reasons I have talked about, it is really important that we retain that independence. This is not about Government control over an independent regulator, and it is not about a Government trying to exert influence or pressure to secure what are deemed to be more favourable outcomes. We are committed to the ICO’s ongoing independence, which is why we have worked closely with the ICO. The Information Commissioner himself is in favour of the changes we are making and has spoken approvingly about them.
This is a really important Bill, because it will enable greater innovation while maintaining the protections that keep people’s data safe.
Question put and agreed to.
Bill accordingly read a Second time.
Data Protection and Digital Information (No. 2) Bill (Programme)
Motion made, and Question put forthwith (Standing Order No. 83A(7)),
That the following provisions shall apply to the Data Protection and Digital Information (No. 2) Bill:
Committal
(1) The Bill shall be committed to a Public Bill Committee.
Proceedings in Public Bill Committee
(2) Proceedings in the Public Bill Committee shall (so far as not previously concluded) be brought to a conclusion on Tuesday 13 June 2023.
(3) The Public Bill Committee shall have leave to sit twice on the first day on which it meets.
Consideration and Third Reading
(4) Proceedings on Consideration shall (so far as not previously concluded) be brought to a conclusion one hour before the moment of interruption on the day on which those proceedings are commenced.
(5) Proceedings on Third Reading shall (so far as not previously concluded) be brought to a conclusion at the moment of interruption on that day.
(6) Standing Order No. 83B (Programming committees) shall not apply to proceedings on Consideration and Third Reading.—(Joy Morrissey.)
Question agreed to.
Data Protection and Digital Information (No. 2) Bill (Money)
King’s recommendation signified.
Motion made, and Question put forthwith (Standing Order No. 52(1)(a)),
That, for the purposes of any Act resulting from the Data Protection and Digital Information (No. 2) Bill, it is expedient to authorise the payment out of money provided by Parliament of—
(a) any expenditure incurred under or by virtue of the Act by the Secretary of State, the Treasury or a government department, and
(b) any increase attributable to the Act in the sums payable under any other Act out of money so provided.—(Joy Morrissey.)
Question agreed to.
Data Protection and Digital Information (No. 2) Bill (Ways and Means)
Motion made, and Question put forthwith (Standing Order No. 52(1)(a)),
That, for the purposes of any Act resulting from the Data Protection and Digital Information (No. 2) Bill, it is expedient to authorise:
(1) the charging of fees or levies under or by virtue of the Act; and
(2) the payment of sums into the Consolidated Fund.—(Joy Morrissey.)
Question agreed to.
Data Protection and Digital Information (No. 2) Bill (Carry-over)
Motion made, and Question put forthwith (Standing Order No. 80A(1)(a)),
That if, at the conclusion of this Session of Parliament, proceedings on the Data Protection and Digital Information (No. 2) Bill have not been completed, they shall be resumed in the next Session.—(Joy Morrissey.)
Question agreed to.
(1 year, 6 months ago)
Public Bill Committees
Before we begin, I have a couple of preliminary announcements that Mr Speaker has asked me to draw to your attention. Hansard colleagues would be grateful if Members emailed their speaking notes to hansardnotes@parliament.uk. Please switch electronic devices to silent. Tea and coffee are not allowed during sittings.
Today we will first consider the programme motion on the amendment paper. We will then consider a motion to enable the reporting of written evidence for publication and a motion to allow us to deliberate in private about our questions before the oral evidence session. In view of the time available, I hope we can take these matters formally—without debate. The programme motion was discussed yesterday by the Programming Sub-Committee for this Bill.
Ordered,
That—
1. the Committee shall (in addition to its first meeting at 9.25 am on Wednesday 10 May) meet—
(a) at 2.00 pm on Wednesday 10 May;
(b) at 9.25 am and 2.00 pm on Tuesday 16 May;
(c) at 11.30 am and 2.00 pm on Thursday 18 May;
(d) at 9.25 am and 2.00 pm on Tuesday 23 May;
(e) at 9.25 am and 2.00 pm on Tuesday 6 June;
(f) at 11.30 am and 2.00 pm on Thursday 8 June;
(g) at 9.25 am and 2.00 pm on Tuesday 13 June;
2. the Committee shall hear oral evidence in accordance with the following Table:
Date | Time | Witness
Wednesday 10 May | Until no later than 9.55 am | Information Commissioner’s Office
Wednesday 10 May | Until no later than 10.25 am | Hogan Lovells; London Stock Exchange Group; Centre for Information Policy Leadership
Wednesday 10 May | Until no later than 10.50 am | techUK; Data & Marketing Association
Wednesday 10 May | Until no later than 11.25 am | Connected by Data; Institute for the Future of Work; Ada Lovelace Institute
Wednesday 10 May | Until no later than 2.25 pm | Medtronic; UK Biobank
Wednesday 10 May | Until no later than 2.50 pm | ZILO; UK Finance
Wednesday 10 May | Until no later than 3.05 pm | Better Hiring Institute
Wednesday 10 May | Until no later than 3.30 pm | National Crime Agency; Metropolitan Police
Wednesday 10 May | Until no later than 3.55 pm | Prospect; Trades Union Congress
Wednesday 10 May | Until no later than 4.25 pm | Public Law Project; Law Society of Scotland; Rights and Security International
Wednesday 10 May | Until no later than 4.40 pm | AWO
3. proceedings on consideration of the Bill in Committee shall be taken in the following order: Clauses 1 to 5; Schedule 1; Clause 6; Schedule 2; Clauses 7 to 11; Schedule 3; Clauses 12 to 20; Schedule 4; Clause 21; Schedules 5 to 7; Clauses 22 to 41; Schedule 8; Clauses 42 to 45; Schedule 9; Clauses 46 to 86; Schedule 10; Clauses 87 to 98; Schedule 11; Clause 99; Schedule 12; Clause 100; Schedule 13; Clauses 101 to 114; new Clauses; new Schedules; remaining proceedings on the Bill;
4. the proceedings shall (so far as not previously concluded) be brought to a conclusion at 5.00 pm on Tuesday 13 June.— (Sir John Whittingdale.)
Resolved,
That, subject to the discretion of the Chair, any written evidence received by the Committee shall be reported to the House for publication.—(Sir John Whittingdale.)
Resolved,
That, at this and any subsequent meeting at which oral evidence is to be heard, the Committee shall sit in private until the witnesses are admitted.—(Sir John Whittingdale.)
Copies of written evidence that the Committee receives will be made available in the Committee Room and circulated to Committee members by email. We will now go into private session to discuss lines of questioning.
We are now sitting in public again and the proceedings are being broadcast. Before we hear from the witnesses, do any Members wish to make a declaration of interest in connection with the Bill?
I am not sure whether this is a declaration of interest, so I will mention it just in case. I have had a meeting with Leicestershire Police Federation and I am interested in an amendment that it would like tabled.
I am not sure whether this is directly relevant to the Bill or adjacent to it, but I am an unpaid member of the board of the Centre for Countering Digital Hate, which does a lot of work looking at hate speech in the online world.
Given that one of today’s witnesses is from Prospect, I wish to declare that I am a member of that union.
I am a proud member of a trade union. I refer the Committee to my entry in the Register of Members’ Financial Interests.
I am a proud member of two trade unions.
Should we declare our membership of any union?
We will now hear oral evidence from John Edwards, the Information Commissioner, and Paul Arnold, the deputy chief executive and chief operating officer of the Information Commissioner’s Office. I remind all Members that questions should be limited to matters within the scope of the Bill, and that we must stick to the timings in the programme order, which the Committee has agreed. For this panel, we have until 9.55 am. Will the witnesses please introduce themselves for the record?
John Edwards: Kia ora! My name is John Edwards. I am the Information Commissioner. I took up the job at the beginning of January last year. I was previously the Privacy Commissioner of New Zealand for eight years.
Paul Arnold: I am Paul Arnold, the deputy chief executive and chief operating officer of the ICO. I took up that position in 2016.
May I gently say to the witnesses that this is a big room, so you will need to project your voices so that we can hear your evidence?
Q
John Edwards: The corporation sole model is fit for a number of purposes. That was the structure that I had back home in New Zealand. For an organisation such as the Information Commissioner’s Office, however, it is starting to buckle under the weight. It will benefit, I think, from the support of a formal board structure, with colleagues with different areas of expertise appointed to ensure that we bring an economy-wide perspective to our role, which, as we have heard from the declarations of interest, spans almost every aspect of human activity.
There will be some short-term challenges as we make the transition from a corporation sole to a board structure. We will need to employ a chief executive, for example, as well as getting used to the new structures and setting up our new accountability frameworks. But in the longer term, I think the model proposed in the legislation is well proven across other regulators, both domestically and internationally.
Q
John Edwards: No, I do not.
Q
John Edwards: No, I do not believe it will undermine our independence at all. What I think it will do is further enhance and promote our accountability, which is very important.
To take your first challenge, about codes of conduct, we worked closely with the Department for Digital, Culture, Media and Sport and subsequently the Department for Science, Innovation and Technology to ensure that we got the appropriate balance between the independence of the commission and the right of the Executive and Parliament to oversee what is essentially delegated lawmaking. I think we have got there. It is not a right to veto out of hand; there is a clear process of transparency, which would require the Secretary of State, in the event that he or she decided not to publish a statutory code that we had recommended, to publish their reasons, and those would be available to the House. I do think there is an appropriate level of parliamentary and Executive oversight of what is, as I say, essentially a lawmaking function on the part of the commission.
Q
John Edwards: I do not believe so. The code of practice would be statutory—it is only the most serious statutory guidance that we would issue, not the day-to-day opinions that we have about the way in which the law operates. But it is also a reflection of the commissioner’s view of the law, and a statement of how he or she will interpret and apply its very general principles. A failure by the Secretary of State to table and issue a proposed code would not affect the way in which the commissioner discharges his or her enforcement functions. We would still be able to investigate matters and find them in breach, regardless of whether that finding was consistent with the Secretary of State’s view of the law.
Q
John Edwards: Yes. We are in the business of statutory interpretation. We are given a law by Parliament. A term like “vexatious” has a considerable provenance and jurisprudence; it is one that I worked with back home in New Zealand. So, yes, I am quite confident that we will be able to apply those.
Q
John Edwards: Sorry, what is your question?
Parts of the Bill refer to there being “meaningful human involvement” and “significant decisions” within automated decision making. That might be in an application for a mortgage or in certain parts of employment. Do you feel that you can interpret those words effectively?
John Edwards: Yes, of course. You are quite right to point out that those phrases are capable of numerous different interpretations. It will be incumbent on my office to issue guidance to provide clarity. There are phrases in the legislation on which Parliament could perhaps provide clearer criteria to assist us in that process of issuing guidance—here I am thinking particularly of the phrase “high risk” activities. That is a new standard, which will dictate whether some of the measures apply.
Q
John Edwards: There is an argument that there is nothing under the Bill that they cannot do now, but it does respond to a perception that there is a lack of clarity and certainty about the scope of legitimate interests, and it is a legitimate activity of lawmakers to respond to such perceptions. The provision will take doubt out of the economy in respect of questions such as, “Is maintaining the security of my system a legitimate interest in using this data?” Uncertainty in law is very inefficient—it causes people to seek legal opinions and divert resources away from their primary activity—so the more uncertainty we can take out of the legislation, the greater the efficiency of the regulation. We have a role in that at the Information Commissioner’s Office, and you as lawmakers have just as important a role.
Q
John Edwards: You are right that it is the controller’s assessment and that they are entitled to make that assessment, but they need to be able to justify and be accountable for it. If we investigate a matter where a legitimate interest is asserted, we would be able to test that.
Q
John Edwards: Well, through the normal process of investigation, in the same way as we do now. We would ask whether this was in the reasonable contemplation of the individual who has contributed their data as a necessary adjunct to the primary business activity that is being undertaken.
Q
John Edwards: Yes, that is right. But the clarity will come where specific categories of legitimate interest are specified in the legislation. Again, that will just remove the doubt, where there is doubt as to whether a particular activity falls within scope.
Q
John Edwards: I am afraid that I have to revert to the standard answer, which is, “It depends.” These are questions that need to be determined on a case-by-case basis, after examination ex post. It is a very general question that you ask; it depends on what the inferred data is and what it is being used for. For example, my office has taken regulatory action against a company that inferred health status based on purchasing practices. We found that that was unlawful and a breach of the General Data Protection Regulation, and we issued a fine for the practice. The law is capable of regulating inferred data, and there is no carte blanche for controllers to make assumptions about people based on data points, whether collected from or supplied by the individual or not.
Q
John Edwards: I am not aware of the statement she made or the context in which she made it, so it is difficult for me to say whether I agree with it. Certainly, informed consent is not the only lawful basis for a data-processing activity, and it may be that data about protected activities can be inferred and used in some circumstances. I would be happy to check that quote and come back to you with my views on whether I agree with it in the context in which it was made.
Q
John Edwards: I think there is sufficient clarity. I am not sure whether the Bill speaks to the point you have just made, but for me the overarching obligation to use data fairly enables us to make assessments about the legitimacy of the kinds of practices you are describing.
It is a really tight timetable this morning and we have nine minutes left. The Minister wants to ask some questions and there are three Members from the Opposition. I will call the Minister now. Perhaps you would be kind enough, Minister, to leave time for one question each from our three Members of the Opposition.
Q
John Edwards: The obligation to investigate every complaint does consume quite a lot of our resources. Can I ask my colleague to make a contribution on this point?
Paul Arnold: As the commissioner says, that duty to investigate all complaints can challenge us in terms of where we need to dedicate the majority of our resources.
To pick up the previous question and answer, our role in trying to provide or maximise regulatory certainty means being able to invest as much resource as we can in that upstream advice, particularly in those novel, complex, finely balanced, context-specific areas. We add far more value if we can give that support upstream.
The additional statutory objectives added through the Bill will, overall, be a real asset to our accountability. Any regulator that welcomes independence also needs to welcome accountability: it is the means through which we describe how we think, how we act and the outcomes that we achieve. Those extra statutory objectives will be a real aid to us, and also to Parliament and our stakeholders. They really do crystallise and clarify why we are here and how we will prioritise our efforts and resources.
Q
John Edwards: I do not believe there is anything in the Bill that would put at risk the adequacy determination with the European Union. The test the Commission applies is whether the law is essentially equivalent. New Zealand lacks many of the features of the GDPR, as do Israel and Canada, each of which has maintained adequacy status. The importance of an independent regulator is preserved in this legislation. All the essential features of the UK GDPR, and the rights that citizens of the European Union enjoy, are present in the Bill, so I do not believe that there is a realistic prospect of the Commission reviewing the adequacy determination negatively.
It is a brutal cut-off, I am afraid, at 9.55 am. I have no discretion in this matter. It is a quick-fire round now, gentlemen. We need quick questions and quick answers, with one each from Carol Monaghan, Chi Onwurah and Mike Amesbury.
Q
John Edwards: Yes and no. Yes, I do believe it is an adequate provision, and no, I do not believe there will be an economic barrier to people accessing their information rights.
Q
John Edwards: Yes, I do believe that an empowered citizenry is best placed to enjoy these rights. However, I also believe that the complexity of the modern digital environment creates such an information asymmetry that it is important for strong advocates such as the Information Commissioner’s Office to act as a proxy on behalf of the citizenry. I do not believe that we should devolve responsibility purely to citizens for ensuring that high standards are set and adhered to in digital industries.
Q
John Edwards: I do not believe so. We have been involved right from the outset. We made a submission on the initial White Paper. We have worked closely with officials. We have said that we want to see the Bill get to a position where I, as Information Commissioner, am able to stand up and say, “I support this legislation.” We have done that, which has meant we have achieved quite significant changes for the benefit of the people of the United Kingdom. It does not mean that we have just accepted what the Government have handed out. We have worked closely together. We have acted as advocates, and I believe that the product before you shows the benefits of that.
Q
John Edwards: In short, yes. We are having discussions about the funding model with DSIT. We are funded by levies. There are two questions: one is about how those levies are set and where the burden of funding our office lies in the economy, and the second is about the overall quantum. We can always do more with more. If you look at the White Paper on artificial intelligence and the Vallance report, you will see that there is a role for our office to patrol the new boundaries of AI. In order to do that, we will have to be funded appropriately, but I have a good relationship with our sponsor Department and am confident that we will be able to discharge all the responsibilities in the Bill.
Gentlemen, thank you very much indeed for your evidence. You can now breathe, relax and enjoy the rest of your day.
Examination of Witnesses
Eduardo Ustaran, Vivienne Artz and Bojana Bellamy gave evidence.
Q
Vivienne Artz: Good morning. My name is Vivienne Artz. I am the chair of the International Regulatory Strategy Group data committee, I have more than 25 years’ experience in financial services, including acting as a chief privacy officer, and I now do advisory work across a range of sectors, including in the context of financial crime.
Will Eduardo Ustaran please introduce himself? Can you hear us, Mr Ustaran? No. Can you hear us, Bojana Bellamy? No. Okay, we will start with our witness who has been kind enough to join us in the room.
Q
Vivienne Artz: Yes, we are interested in implementing a smart data regime because it will allow broader access to data for innovation, particularly in the context of open banking and open finance. It would require access to information, which can often be limited at the moment; there is a lot of concern from businesses about whether they can actually access data. Some clarification of what that means, in respect of information that is not necessarily sensitive and can be used for the public good, would be most welcome. Currently, the provisions in the legislation are pretty broad, so it is difficult to see what the regime will look like, but in theory we are absolutely in favour.
Q
Vivienne Artz: Consumers would absolutely benefit, and that is where our priority needs to be—with individuals. It is a chance for them to leverage the opportunities that their data can provide. It will enable innovators to produce more products and services that help individuals to better understand their financial and personal circumstances, particularly in the context of utility bills and so on. There are a number of positive use cases. There is obviously always the possibility that data can be misused, but I am a great advocate of finding the positive use cases and allowing business to support society and consumers to the fullest extent. That is what we need to support.
Q
Vivienne Artz: It is necessary to future-proof the Bill. We are seeing such an incredible speed of innovation and change, particularly with regard to generative artificial intelligence. We need to make sure that the legislation remains technology-neutral and can keep up to date with the changes that are currently taking place.
I have more questions if our other witnesses are with us.
We still have not heard definitively whether our other guests can hear us or speak to us, so we are waiting for confirmation from the tech people. In the meantime, I invite the Minister to question Vivienne Artz.
Q
Vivienne Artz: The Bill provides the opportunity for the Government to look at a range of issues and to move away from an equivalence approach to one in which we can consider more factors and features. The reality is that if you compare two pieces of legislation, you will always find differences, because they come from different cultural backgrounds and different legal regimes. The approach the UK is taking in the Bill is helpful because it looks at outcomes and at broader issues such as the rule of law in different jurisdictions.
What is said on paper is not necessarily what always happens in practice; we need to look at it far more holistically. The legislation gives the Government the opportunity to take that broader and more common-sense view with regard to adequacy and not just do a word-by-word comparison of legislative provisions without actually looking at how the legislation is implemented in that jurisdiction and what other rights can support the outcomes. We can recognise that there is a different legal process and application but ask whether it still achieves the same end. That is what is really important. There is an opportunity not only to move more quickly in this space but to consider jurisdictions that might not be immediately obvious but none the less still offer appropriate safeguards for data.
Q
Vivienne Artz: The current process is incredibly cumbersome for businesses and, if I am honest, it provides zero transparency for individuals as well. It tends to be mostly a paperwork exercise—forgive me if that sounds provocative, but putting in place the model clauses is very often an expensive paperwork exercise. At the moment, it is difficult, time-consuming and costly.
The thing with adequacy is that it is achieved at a Government-to-Government level. It is across all sectors and provides certainty for organisations to move forward to share information, sell their goods and services elsewhere and receive those goods and services, and for consumers to access those opportunities as well. Adequacy is certainly the ideal. Whether it is achievable in all jurisdictions I do not know, but I think it is achievable for many jurisdictions to provide confidence for both consumers and businesses on how they can operate.
We can see Mr Ustaran and Ms Bellamy and they can hear us, but we cannot hear them, so we will carry on with questioning Vivienne Artz.
Q
Vivienne Artz: I do think the thresholds are appropriate and proportionate. In practice, most organisations choose not to charge, because it costs more to process the cheque than the revenue is worth. Certainly, some sectors have been subject to very vexatious approaches from claims-management companies and others, where it is a bombarding exercise and it is unclear whether making a genuine subject access request is in the best interests of the consumers, or done with their understanding and at their behest.
I am a great supporter of subject access requests—they are a way for individuals to exercise their rights to understand what data is being processed—but as a result of quirks in how we often operate in the UK, they are being used as a cheap pre-litigation investigative tool. That is unfortunate and has meant that we have had to put in place additional safeguards to ensure that they are used for the purpose for which they were provided: so that individuals can have transparency and clarity about what data is being processed and by whom.
Q
Vivienne Artz: We have heard from the Information Commissioner that his office is fairly clear about what that terminology means, and that it will reflect the existing body of law in practice. I will be perfectly honest: it is not immediately clear to me, but there is certainly a boundary within which it could be determined, and that is something we would rely on the Information Commissioner to provide further guidance on. It is probably also likely to be contextual.
Q
Vivienne Artz: I think it depends on the sector. I come from the financial services sector, so the types of subject access requests we get tend to be specific to us. I think organisations are going to be reluctant to refuse a subject access request because, at the end of the day, an individual can always escalate to the Information Commissioner if they feel they have been unfairly treated. I think organisations understand their responsibility to act in the best interests of the individual at all times.
Q
Bojana Bellamy: Thank you for inviting me to this hearing. My name is Bojana Bellamy. I lead the Centre for Information Policy Leadership. We are a global data privacy and data policy think-and-do-tank operating out of London, Brussels and Washington, and I have been in the world of data privacy for almost 30 years.
Eduardo Ustaran: Good morning. My name is Eduardo Ustaran. I am a partner at Hogan Lovells, based in London, and I co-lead our global privacy and cyber-security practice, a team of over 100 lawyers who specialise in data protection law all over the world.
Thank you. Chi Onwurah and Damian Collins are lined up to ask questions, but I want first to ask the shadow Minister whether she has any further questions, followed by the Minister. Because we have one witness in the room and two online, please will whoever is asking the question indicate whom you are asking it of?
Q
Bojana Bellamy: Yes, certainly it has been hard to get businesses to comply with GDPR, in particular small and medium-sized businesses. I think the changes proposed in the Bill will make it easier, because it is more about outcomes-based regulation. It is more about being effective on the ground, as opposed to being prescriptive. GDPR is quite prescriptive and detailed; it tells you how to do things. In this new digital world, that is not very helpful, because technology always moves ahead of, and faster than, the rules.
In effect, what we see proposed in the Bill is more flexibility and more onus on organisations in both the public and private sectors to deliver accountability and effective protection for people. It does not prescribe exactly how to do that, yet they are still accountable for the outcomes. From that perspective, it is a step forward. It is a better regime, in my opinion.
Q
Eduardo Ustaran: From the point of view of adequacy, it is fundamental to acknowledge that data flows between the UK and the EU and the EU and the UK are essential for global commerce and for our digital existence. Adequacy is an extremely valuable element of the way in which the current data protection regime works across both the EU and the UK.
It is really important to note at the outset that the changes being proposed to the UK framework are extremely unlikely to affect that adequacy determination by the EU, in the same way that if the EU were to make the same changes to the EU GDPR, the UK would be very unlikely to change the adequacy determination of the EU. It is important to appreciate that these changes do not affect the essence of UK data protection law, and therefore the adequacy that is based on that essence would not be affected.
Q
Bojana Bellamy: I certainly agree that adequacy is a political decision. In many ways—you have seen this with the Northern Ireland protocol—some of these decisions are made for different purposes. I do not believe there are elements of the Bill that would reduce adequacy; if anything, the Bill is very well balanced. Let me give you some examples of where I think the Bill goes beyond GDPR: certainly, the expectations of accountability on the senior responsible individual, which delivers better oversight of and leadership on privacy; the right to complain to an organisation, and the obligation on organisations to respond to those complaints; and a strong and effective Information Commissioner, who actually has more powers. The regulator is smarter; that, again, is better than GDPR. There are also the safeguards that exist for scientific research and similar purposes, as well as some other detailed ones.
Yes, you will see, and you have seen in public responses as well, that there are people who are worried about the erosion of rights, but I do not believe that the exceptions to subject access requests and to the other rights we talked about are a real erosion. I think the Bill just clarifies what has been the law. Some of the requirements simplifying privacy impact assessments and records of processing will, in fact, deliver better accountability in practice; they are still there, just not as prescriptive. The Information Commissioner has strong powers; it is a robust regulator, and I do not believe its independence will be dented by this Bill. To those who think that we are reducing the level of protection, I say that the balance of all the rules is going to remain essentially equivalent to the EU’s. That is really what is important.
May I quickly say one more thing? We have seen the EU make adequacy decisions for countries such as Japan and Korea, and even for the Privacy Shield. Even in those cases, the requirements were not identical: those laws are still different from GDPR—they do not have the right to portability or the concept of automated decision making—but they were still found to be adequate. That is why I really do not believe that this is a threat. The one thing we have to keep absolutely clear and on par with the EU is Government access to data for national security and intelligence purposes. That is something the EU will be very interested in, to ensure that that is not where the bar drops, but there is no reason to believe it will and there is nothing in the Bill to suggest so.
Vivienne Artz: I concur; I do not think the Bill poses any threat to adequacy with the EU. On the national security issue that Bojana raises, I would also point out that the UN rapporteur noted that the UK has better protections for Government access to data than many EU member states, where the approach is often very political rather than practical and focused on outcomes. There is nothing in this Bill that would jeopardise adequacy with the EU.
We have 12 minutes left and two Members are indicating that they wish to ask questions after you, Minister.
Q
Eduardo Ustaran: That is a very important question to address, because perhaps one way in which we should look at this legislative reform is as a means of seeing how the existing GDPR framework in both the EU and the UK could, in fact, be made more effective, relevant and modern to deal with the issues we are facing right now. You refer to artificial intelligence as one of those issues.
GDPR, in the EU and the UK, is about five years old. It is not a very old piece of legislation, but a number of technological developments have happened in the past five years. More importantly, we have learned how GDPR operates in practice. This exercise in the UK is in fact very useful, not just for the UK but for the EU and the world at large, because it looks at how to reform elements of existing law that is already in operation in order to make it more effective. That does not mean that the law needs to be more onerous or more strict; it can be more effective at the same time as being more pragmatic. This is an important lens through which to view legislative reform, and not only from the UK’s point of view. The UK can make an effort to make the changes more visible outside the United Kingdom, and possibly influence the way in which EU GDPR evolves in the years to come.
Bojana Bellamy: I agree that we need a more flexible legal regime to enable the responsible use of AI and machine learning technologies. To be very frank with you, I was hoping the Bill would go a little further. I was hoping that there would be, for example, a recognition of the use of data to train algorithms to ensure that they are not discriminatory or biased and that they function properly; I would have hoped that would be considered an example of legitimate interest. That is certainly a way in which the Government can go further, because there are possibilities for the Secretary of State to augment those provisions.
We have seen that in the European AI Act, where they are now allowing greater use of data for AI training, precisely in order to ensure that algorithms work properly. Dubai’s data protection law does this, and some others are starting to as well. I hope that we have good foundations to ensure further progression of the rules on AI. The rules on automated decision making are certainly better in this Bill than they are in GDPR. They are more realistic; they recognise the fact that we are going to be faced with AI and machine learning taking more and more decisions, of course with the possibility of human intervention.
Again, to those who criticise the rules, I would say it is more important to have these express rights for individuals. We should emphasise, in the way we have done in the Bill, the right to be informed that AI is involved, the right to make a representation, the right to contest a decision, and the right to demand human review or human intervention. To me, that is what really empowers individuals and gives them trust that decisions will be made in a better way. There is no point in prohibiting AI in the way GDPR sort of does. Under GDPR, we are going to see something of a clash between the fact that the world is moving toward greater use of AI and article 22 on automated decision making, which contains a prohibition that makes it subject to consent or contract. That is really unrealistic. Again, we have chosen a better way.
As a third, smaller point, I find the rules on research purposes smarter. They are rather complicated to read, to be frank, and I look forward to the consolidated, clean version. The fact that technological development research is included alongside commercial research will enable the organisations that are developing AI to do so in a responsible way that creates the right outcomes for people and does not create harms or risks. To me, that is what matters; that is what is going to be delivered here. We have the exemptions from notices for research and so on, so I feel we will have better conditions for the development of AI in a responsible and trusted way. However, we must not take our eyes off it. We really need to link GDPR with our AI strategy, and ensure that we incentivise organisations to be accountable and responsible when they are developing and deploying AI. That will be part of the ICO’s role as well.
Five minutes left. This will be the quick-fire round. I have two Members indicating that they wish to ask questions—Chi Onwurah.
Q
Mr Ustaran, please.
Eduardo Ustaran: This is a question that many organisations that operate globally face right now. You must understand that data protection law operates all over the world and data flows all over the world, so consistency is really important in order to achieve compliance in an effective way. Therefore, a very valid question is, “Do I comply with the EU GDPR across the board, including in the UK, or should I make a distinction?”
The reality is that when you look at the way in which the UK data protection framework is being amended, it provides a baseline for compliance with both the UK and EU regimes, in the sense that much of what is being introduced could potentially be interpreted as already being the case in the EU, if you apply a more progressive interpretation of EU law. Therefore, I think we should look a little further than simply asking, “Well, if I comply with EU law, will I be all right in the UK?”
Maybe the way to look at it—something I see some organisations exploring—is, “If I were to take the UK interpretation of the GDPR on a wholesale basis, would that allow me to operate across the world, and certainly in the EU, in a more effective and efficient but still compliant way?” This is something that companies will be exploring, and it is not as easy as simply saying, “Well, I will just do EU law across the board.”
Sorry. It must be one quick question and one quick answer. We must finish at 10.25 am. Damian Collins.
Q
Vivienne Artz: I think it will help a little in terms of the threshold of “vexatious”. The other piece that will help is the broadening of the provisions around legitimate interests, because there is now an explicit legitimate interest for fraud detection and prevention. At the moment, it is articulated mostly as preventing crime. I would suggest that it could be broadened in the context of financial crime, which covers anti-money laundering, sanctions screening and related activities, so that firms can process data in that way.
Those are two different things: one is processing data around sanctioned individuals and the like in the context of suspicious activities, and the other is a data subject’s request to access or remove their data. Even if they make that subject access request, the ability now to balance it against broader obligations where there is a legitimate interest is incredibly helpful.
I thank all three witnesses for their time this morning and their extremely informative answers to the questions. Our apologies from Parliament for the tech issues that our two Zoom contestants had to endure. Thank you very much indeed. We will now move on to our third panel.
Examination of Witnesses
Neil Ross and Chris Combemale gave evidence.
Q
Neil Ross: Thank you for having us before the Committee. My name is Neil Ross. I am the Associate Director for Policy at techUK, the trade association that represents the technology sector in the UK. We have 950 companies in our membership.
Chris Combemale: I am Chris Combemale, the CEO of the Data and Marketing Association. I have 40 years’ experience as a practitioner in marketing and advertising. I started on the agency side, working with well-known brands, and went on to lead marketing technology businesses, including a first-generation cloud marketing technology business.
I apologise for getting your surname pronunciation wrong, Mr Combemale.
Chris Combemale: That’s okay, it happens all the time. It is actually of French heritage, rather than Italian.
Q
“could go further in seeking the full benefits of data driven innovation”.
Does this amended Bill go further?
Neil Ross: Yes, it does. To go back to the Information Commissioner’s statement earlier, the most important part of the legislation is that it provides increased clarity on how we can use data. I think there were about 3,000 responses to the consultation, and the vast majority—particularly around the scientific research and legitimate interest provisions—focused on providing that extra level of clarity. What the Government have done is quite clever, in that they have lifted examples from the recitals—recital 157, as well as those related to legitimate interests—to give additional clarity on the face of the Bill, so that we can take a much more innovative approach to data management and use in the UK, while still staying within the broad umbrella of what qualifies us for EU adequacy.
Q
Neil Ross: Most tech companies have adapted to GDPR; it is now a common global standard. The Bill makes compliance a little easier, allows us to be a little more flexible in interpreting it, and will give companies much more certainty when taking decisions about data use.
One really good example is fraud. Online fraud is a massive problem in the UK, and the Government have a strategy to deal with it, so having that legitimate interest focused on crime prevention—as well as the further processing rights around compliance with the law—means that we can be much more innovative and adaptive in how we share and process data to protect against and prevent fraud. That will be absolutely vital in addressing the shared objective that we all have to reduce online fraud.
Q
Neil Ross: No. That is one area where we think the Bill needs further work. I think you are referring to clause 85. When we responded to the consultation, we said that the Government should try to create equivalence between the private communications requirements and the GDPR, to give that extra level of flex. By not doing that, and by not setting out specific cases in which telecoms companies have to identify unsolicited calls, the Government are being really unfair in what they are asking those companies to do. We have had concerns raised by a range of companies, both large and small, that they might not have the technical capability and will have to set up new systems to do it. Overall, we think the Bill makes a bit of a misstep here, and we need clarity on exactly how it will work. techUK and some of my colleagues will be suggesting some amendments to the Committee on how to do that.
Q
Neil Ross: No, not on that clause, but yes in relation to the rest of the legislation.
Q
Chris Combemale: Yes. First, on the consumer experience, I think that we all recognise that the pop-up consent banners for cookies are generally ticked as a matter of course by consumers who really want to go about their business and get to the website that they want to do business on. In a way, it is not genuine consent, because people are not really thinking deeply about it.
In terms of business, a number of cookies—which are really identifiers that help you understand what people are doing on your website—are used on a purely first-party basis by websites, such as e-commerce and business-to-business websites, to understand basic operational aspects and to measure statistics such as how many people are going to which pages. Those are websites that do not take any advertising and do not share any data with third parties, so the exemptions in the Bill would generally mean that those types of companies no longer need cookie banners, while posing no risk to customers, because the company uses the cookies purely to understand the behaviour of its own website traffic and its own customers. In that sense, we strongly support the provisions and the exemptions in the Bill.
Q
Chris Combemale: I think it can be eventually, but we oppose those provisions in the Bill, because they create a market imbalance and hand gateway control to the large companies that manage browser technology, at the expense of the media owners and publishers that are paying journalists and investing in content. It is important above all else that media owners are able to develop first-party relationships with their audiences and customers to better understand what they need. If anything, we need more control in the hands of the people who invest in creating the content and in paying the journalists who perform those important democratic functions.
Q
Chris Combemale: It certainly would give even greater market control to those companies.
Q
Chris Combemale: I think it could be. For us, the essential principle is that a business, whether a media owner, an e-commerce business or a publishing business, should have control of the relationships between its products and services and its customers and prospective customers. By nature, when you give control to a third party, whether a large tech company or another company, you are getting in between people and the organisations that they want to do business with, and giving control to an intermediary that may not understand that relationship. At the very least, if you register with a website after, for instance, changing your browser setting, that registration should take precedence over the browser setting: your choice to engage with a particular company should always take precedence over a centralised cookie management system.
Neil Ross: I think that what the Government have done in relation to this is quite clever: they have said that their objective is to have a centralised system in the future, but they have recognised that a number of ongoing legislative and regulatory activities have a significant bearing on that. It was only last week that the Government introduced the Digital Markets, Competition and Consumers Bill, clause 20 of which—on conduct requirements—would play a large role in whether you could set up a centralised system, so there is an element of co-ordinating two different, ongoing regulatory regimes. We agree with Chris that the steps on analytical cookies now are good, but we need to give much deeper thought to what a centralised system may or may not look like and whether we want to go ahead with it.
Chris Combemale: May I come in on that final point? What makes sense to us is a centralised system for managing opt-outs as opposed to managing consent. As the Data and Marketing Association, we operate the telephone preference service and the mailing preference service, which give consumers the opportunity to opt out from receiving unwanted cold calls or unwanted direct mail. There is already a system in place with digital advertising—an icon that people can use to opt out from the use of personal data for personalising digital ads. I think it makes sense that, if people do not want to receive certain things, they can opt out centrally, but a centralised consent opt-in gives too much control to the intermediaries.
Q
Neil Ross: Smart data is potentially a very powerful tool for increasing consumer choice, lowering prices and giving people access to a much broader range of services. The smart data provisions that the Government have introduced, as well as the Smart Data Council that they are leading, are really welcome. However, we need to go one step further and start to give people and industries clarity about where the Government will look first: what kind of smart data provisions they might consider and what kind of sectors they might go into. Ultimately, we need to make sure that businesses are well consulted and that there is a strong cost-benefit analysis. We then need to move ahead with the key sectors that we want to push forward on. As with nuisance calls, we will send some suggested text to the Committee to add those bits in, but it is a really welcome step forward.
Q
Neil Ross: I do not want to name specific sectors at this point. We are having a lot of engagement with our members about where we would like to see it first. The transport sector is one area where it has been used in the past and could have a large use in the future, but it is something that we are exploring. We are working directly with the Government through the Smart Data Council to try to identify the initial sectors that we could look at.
Q
Chris Combemale: I think the single biggest one that has troubled our members since the implementation of GDPR is the issue around legitimate interest, which was raised by the hon. Member for Folkestone and Hythe. The main issue is that GDPR contains six lawful bases for data processing, which are equal in law. For the data and marketing industry, the primary bases are legitimate interest and consent. For some reason, it has become widely accepted through the implementation of GDPR that it requires consent for marketing and for community activities. I am sure that you hear in your constituencies of many community groups that feel they cannot organise local events because they must have consent to communicate. That has never been the intention behind the legislation; in fact, the European Court of Justice has always ruled that any legal interest could be a legitimate interest, including advertising and marketing.
If you look at what we do, which is effectively finding and retaining customers, the GDPR says in recital 4 that privacy is a fundamental right, not an absolute right, and must be balanced against other rights, such as the right to conduct a business. You cannot conduct a business without the right to find and retain customers, just as you cannot run a charity without the right to find the donors and volunteers who provide the money and the labour for your good cause. The clarification is really important across a wide range of use cases in the economy, but particularly ours. It was recognised in GDPR in recital 47. What the legislation does is give illustrative examples drawn from recitals 47, 48 and 49. They are not new examples; they are simply given the weight of the main text. It is an illustrative list. Really, any legal interest could be a legitimate interest for the purpose of data processing, subject to necessity and proportionality, which we discussed earlier with the Information Commissioner.
Q
Chris Combemale: In the sector that I represent, we have a fairly clear understanding of the gradients of risk. As I was saying earlier, many companies do not share data with other companies; they are interested solely in the relationships that they have with their existing customers or prospects. In that sense, all the research that we do into customer attitudes to privacy indicates that people are generally comfortable sharing data with companies they trust and do business with regularly.
Q
Chris Combemale: I would not want to suggest what the legal definition is. To us in direct marketing and in the Data and Marketing Association, existing customer relationships—loyal customers who trust and are sometimes passionate about the brands they interact with—are low risk. Higher risk is when you come to share data with other companies, but again much of that activity and data sharing is essential to creating relevance. With the right protections, it is not a hugely high-risk activity. Then you can move on up, so the higher the degree of automation and the higher the degree of third-party data, the greater the risk, and you have to put in place mitigations accordingly. I am not a lawyer—I am just a poor practitioner—so I cannot define it from a legal point of view, but it is clear in the context of our industry how risk elevates depending on what you are doing.
Q
Neil Ross: I was going to say that you can see how Chris has interpreted it through the lens of his industry, but the feedback we have had from our members, who operate across a range of industries, suggests that there is quite a lot of confusion about what that terminology might mean. The rest of the Bill aims to clarify elements of the GDPR and put them on the face of the Bill, but this provision seems to be going in the other direction. It raises concern and confusion.
That is why our approach has always been that you are going to get more clarity by aligning the Privacy and Electronic Communications Regulations 2003 more closely with the GDPR, which has clear legal bases, processes and an understanding of what is high and low risk—a balancing test, and so on—than through this fairly broad and poorly understood term “low risk”. We have concerns about how it will operate across a range of sectors.
Q
Chris Combemale: Coming back to our discussion about legitimate interest and the proportionality balancing test, or legitimate interest impact assessments, when you are thinking about what you are planning to do with your customers, it is a requirement of good marketing—quite apart from the legislation, as well as within it—to think about how your plans will impact your customers’ privacy, and then to mitigate. The important thing is not to say, “There’s no risk,” “It is low risk,” or “It is high risk”; it is to understand that the higher the risk, the greater the mitigations that you have to put in place. You may conclude that you should not do something because the risk level is too high. That is what balancing tests do, and decisions and outcomes result from them.
Q
Chris Combemale: We do a lot of work combating rogue traders, and we provide evidence to cases from our work with the telephone preference service and other activities. Rogue traders—especially those with criminal intent—will generally ignore the legislation regardless of how clear it is, but I think you are right. An important part of GDPR is that it puts a lot of responsibility on companies to consider their particular activity, their particular customer base and the nature of their audience. Age UK, a charity that has a lot of vulnerable elderly customers, has to have greater protections and put more thought into how it does things than a nightclub marketing to under-30s, who are very technologically literate and digitally conversant.
When we do customer attitudes to privacy studies, we see three broad segments—the data unconcerned, data pragmatists and data fundamentalists—and they require different treatment. It is incumbent on any company, in a marketing context, to understand who their audience and customer base are, and to design programmes appropriately to build trust and long-term relationships over time. That is an important element of GDPR, from a marketer’s perspective. I should add that it should not take legislation to force marketers to do that.
There are five minutes left and there are two Members seeking to ask questions.
Q
Neil Ross: No, I do not expect so. Given some of the exemptions for further processing, it might help improve compliance with the law, because compliance with the law in the public interest is then a basis on which you could process data further. It might make it easier for companies to implement the age-appropriate design code.
Q
Neil Ross: It just gives additional clarity on when and where you can use data on various grounds. There are a wide range of circumstances that you can run into in implementing the age-appropriate design code, so having more flexibility in the law to know that you can process data to meet a legal objective, or for a public interest, would be helpful. The best example I can give is from the pandemic: the Government were requesting data from telecoms companies and others, and those companies were unsure of the legal basis for sharing that data and processing it further in compliance with a Government or regulator request. The Bill takes significant steps to try and improve that process.
Q
Neil Ross: I do not have one to hand, but we could certainly follow up.
Q
Neil Ross: That is in relation to clause 85?
Q
Neil Ross: We do not think it is particularly appropriate for this scenario, given that the telecoms operators are just informing the ICO about activity that is happening on their service. It is not that they are the bad actors in the first instance; they are having to manage it. Ultimately, the first step is to clarify the aims of clause 85, and then whether the fine is appropriate is a subsequent question.
Q
Neil Ross: It will vary from company to company. Most companies will always seek to comply with the law. If you feel you need some kind of deterrent, that is something for Parliament to consider. The first step is to make sure that the law is really clear about what companies are being asked to do. At the moment, that is not the situation we are in.
Q
Chris Combemale: I think a lot of what our sector does voluntarily—setting aside the legislation—is the creation of what are called permission centres. You will be familiar with them from when you go to a website and it asks about categories of information or products that you are interested in. That allows consumers to express their interest. Within the legislation there are very clear notification requirements at the point that data is collected, which oblige companies to ask you what you want done with your data. Whether the basis is consent or legitimate interest, consumers always have the right to opt out.
With marketing, there is an absolute right to ask not to receive marketing of any kind, whether that is email, direct mail or telephone, at any time. Companies have an obligation to follow that. When it comes to marketing, which is my subject matter expertise, consumers are very well protected and do exercise their rights to opt out. They are further protected by central services, for example the telephone preference service. That is a list that companies can look up; 70% or so of households have registered their telephone number there. I think there are a large number of protections in place, both through the legislation and voluntarily.
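To make the mechanics concrete: screening against a preference service amounts to checking every number on a call list against the register before dialling. The sketch below is purely illustrative and assumes a hypothetical local export file; the real telephone preference service is accessed through licensed channels, and the numbers shown are invented.

```python
# Illustrative only: screening a marketing call list against a
# TPS-style suppression register. File name and numbers are invented.

def normalise(number: str) -> str:
    """Keep digits only, so '020 7946 0000' and '02079460000' match."""
    return "".join(ch for ch in number if ch.isdigit())

def load_register(path: str) -> set[str]:
    """One registered number per line in a hypothetical export file."""
    with open(path) as f:
        return {normalise(line) for line in f if line.strip()}

def screen(call_list: list[str], register: set[str]) -> list[str]:
    """Return only the numbers not on the suppression register."""
    return [n for n in call_list if normalise(n) not in register]

if __name__ == "__main__":
    register = load_register("tps_register.txt")  # hypothetical export
    print(screen(["020 7946 0000", "0161 496 0000"], register))
```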
Q
Neil Ross: There has been a big drive among many tech companies to explain their data-handling practices better. There is a drive within the sector to do that anyway. Some of that has come from legislative and regulatory activity—for example, the Online Safety Bill.
One thing I would say about this legislation is that it does give people more control over data through the privacy management frameworks. By taking a less strict tick-box approach to data-handling practices, there is the opportunity for core sectors or interest groups such as trade unions to put forward what their ideal data-handling practice should be for a company. As long as that complies with what the ICO sets out or the broad guardrails, then you can see a range of different handling practices adopted, depending on which sector you are in. That flexibility gives some power back to consumers and other interest groups.
Gentlemen, you have been brilliant. Thank you very much indeed for your time this morning. We will now move on to the fourth panel.
Examination of Witnesses
Dr Jeni Tennison, Anna Thomas and Michael Birtwistle gave evidence.
Q
Dr Tennison: Thank you very much for inviting me here today. My name is Dr Jeni Tennison. I am the executive director of Connected by Data, which is a campaign to give communities a powerful say in decisions about data. Prior to that I was the CEO of the Open Data Institute. I am also the co-chair of the data governance working group in the Global Partnership on Artificial Intelligence.
Anna Thomas: Good morning and thank you for having me. I am Anna Thomas, a founding director of the Institute for the Future of Work, a research and development institute exploring the impact of new technologies on work and working lives. I was formerly an employment barrister at Devereux Chambers. The institute is also the strategic research partner for the all-party parliamentary group on the future of work.
Michael Birtwistle: Good morning. I am Michael Birtwistle, an associate director at the Ada Lovelace Institute, responsible for law and policy. The Ada Lovelace Institute is an independent research institute with a mission to make sure that data and AI work for people and society. I was previously a policy adviser at the Centre for Data Ethics and Innovation.
Q
Dr Tennison: Surveys and public attitudes polling show that when you ask people about their opinions on the use of data, they have a good understanding of the ways in which it is going wrong and of the kinds of protections that they would like to see. The levels of trust are not really there.
A poll from the Open Data Institute, for example, shows that only 30% trust the Government to use data ethically. CDEI has described this as “tenuous trust” and highlighted that about 70% of the public think that the tech sector is insufficiently regulated. I do not think that the Bill addresses those issues of trust very well; in fact, it reduces the power individuals have and also the level of collective representation people can have, particularly in the work context. I think this will diminish trust in the way in which data is used.
Q
Dr Tennison: Obviously, there was a strong consultation exercise around the data reform Bill, as it was then characterised. However, there are elements of this Bill, in particular the recognised legitimate interests that are listed, that have not had detailed public consultation or scrutiny. There are also not the kinds of provisions that we would like to see on ongoing consultation with the public on specific questions around data processing in the future.
Q
Dr Tennison: Subject access requests are an important way in which citizens can work out what is happening within organisations with the data that is being held about them. There are already protections under UK GDPR against vexatious or excessive requests, and strengthening those, as the Bill does, is, I think, going to deter more citizens from making these kinds of requests.
It is worth noting that this is a specific design of the Bill. If you look at the impact assessment, this is where most of the cost to business is being saved; that is being done by refusing subject access requests. So I think we should be suspicious about what that looks like. Where we have been looking at the role of subject access requests in people exercising their rights, it is clear that that is a necessary step, and delays to or refusals of subject access requests would prevent people from exercising their rights.
We think that a better way of reducing subject access requests would be to have publication of things like the risk assessments that organisations have to do when there is high-risk processing—so that there is less suspicion on the part of data subjects and they do not make those requests in the first place.
Q
Anna Thomas: Referring partly to our work in “Mind the gap” and “The Amazonian Era”, as well as the report by the all-party parliamentary group on the future of work about use of AI in the workplace, we would say no. The aim of the Bill—to simplify—is very good, but particular areas of the Bill as it stands are eroded somewhat, which is particularly problematic in the workplace. The automated decision-making provisions that you ask about are really important with regard to the reduction of human involvement. But in addition to that, there are the need to assess in advance what the risks and impacts are, the requirement for consultation, and access to relevant information. Those are all relevant, and they overlap with the automated decision-making requirements.
Q
Anna Thomas: Not in themselves. There is potential, in those areas, to correct or improve the Bill in the course of its proceedings, so that the opportunities, as well as the risks, of putting this new Bill through Parliament are seized. But no: because of the transformation of work and the extent of the impact and the risks that new and automated technologies are having across work—not just on access to work, but on terms, conditions, nature, quality and models of work—the safeguards in those areas should be moving in the other direction. There is, I think, increasing cross-party consensus about this.
Q
Michael Birtwistle: No, we would say that it does not. The Ada Lovelace Institute published a couple of reports last year on the use of biometric data, arguing for a much stronger and more coherent regulatory governance framework for biometric technologies. These are incredibly personal technologies. We are used to their being talked about in terms of our faces or fingerprints, but actually they cover a much wider range, involving any measurement to do with the human body, which can be used in emotional analysis—walking style or gait, tone of voice or even typing style. There is also a set of incoming, next-generation AI technologies that rely quite heavily on biometrics, so there is a question about future-proofing the Bill.
We have made two broad proposals. One is to increase the capability of the Information Commissioner’s Office to look specifically at biometrics—for example, to create and maintain a public register of private entities engaging in processing of biometric data, to have a proper complaints procedure, to publish annual reports and so on. There is a set of issues around increasing the capability of our institutions to deal with that.
Then there is a second question, about scope. The current definition of biometric data focuses on the identifiability of personal data, but there are many potentially problematic use cases of biometric data that do not need to know who you are in order to make a decision about you. We think it would be wise, and would future-proof the regulation of this powerful technology, to also include classification or categorisation among the covered purposes of biometric technologies.
Q
I am interested in the views of the other members of the panel as well. Do you think there needs to be a greater onus on data controllers to make clear to regulators what data they are gathering, how they are processing it and what decisions are being made based on that data, so that, particularly in an automated environment, while there may not be a human looking at every step in the chain, ultimately a human has designed the system and is responsible for how that system is working?
Michael Birtwistle: I think that is a really important point that is going to be very relevant as we read this Bill alongside the AI White Paper provisions that have been provided. Yes, there is definitely a need for transparency towards regulators, but if we are thinking about automated decision making, you also want a lot of the safeguards and the thinking to be happening within the firms on a proactive basis. That is why the provisions for automated decision making within the Bill are so important. We have concerns about whether the more permissive automated decision making approach in the Bill will actually lead to greater harms occurring as, effectively, it turns the making of those automated decisions from a prohibition with exceptions into something that, for anything other than special category data, is permitted with some safeguards—around which, again, there are questions.
Q
Michael Birtwistle: Legitimate interest still has a balancing test within it, so it is not that you could simply show that you had passed that test and then do whatever you want; but, certainly, the provisions in the Bill around automated decisions bring legitimate interest into scope as a basis on which it is okay to do automated processing.
Dr Tennison?
Dr Tennison: On your first point, around the targets of decisions, one of the things that we would really argue for is changing the set of people who have rights around automated decision making to those who are the subject of the decisions, not necessarily those whom the data is about. In data governance practice, we talk about these people as decision subjects, and we think it is they who should have the rights to be informed when automated decision making is happening, and the other kinds of rights, such as objection. That is because, in some circumstances, as you said, there might be issues where you do not have information about someone and nevertheless you are making decisions about them, or you have information about a subset of people, which you are then using to make a decision that affects a wider group. In those circumstances, which we can detail more in written evidence, we really need the decision subjects’ rights to be exercised, rather than the data subjects’ rights—the rights of those whom the data is about.
On the legitimate interest point you raised, there is this balancing test that Michael talked about, that balances the interests of data subjects as well. We think that there should also be some tests in there that balance public interests, which may be a positive thing for using data, but also may be a negative thing. We know that there are collective harms that arise from the processing of data as well.
Q
Dr Tennison: Yes, it could be, or because they are using a specific browser, or they are in a particular area according to their IP address, or something like that. There are various ways in which people can be targeted and affected by those decisions. But we are not just talking about targeted advertising; we are talking about automated decisions in the workplace or automated decisions about energy bills and energy tariffs. There are lots of these decisions being made all the time.
Q
Dr Tennison: Yes. Or they may be subject to things like robo-dismissal, where their performance is assessed and they get dismissed from the job, or they are no longer given jobs in a gig economy situation.
Q
Dr Tennison: Yes.
I can see Anna Thomas chomping at the bit.
Anna Thomas: I would back up what Jeni is saying about group impacts in the workplace context. It is very important that individuals know how systems are used, why and where they have significant effects, and that risks and impacts are ascertained in advance. If it is just individuals and not groups or representatives, it may well not be possible to know, ascertain or respond to impacts in a way that will improve and maximise good outcomes for everybody—at an individual level and a firm level, as well as at a societal level.
I can give a few examples from work. Our research covers people being told about the rates that they should hit in order to keep their job, but not about the factors that are being taken into account; they are simply told that if they do not hit the rate, they will lose their job. Another example is that customer interaction is often not taken into account, because it is not something that can be captured, broken down and assessed in an automated way by an algorithmic system. Similarly, older workers—who are very important at the moment, given that we need to fill vacancies and so on—feel that they are being “designed out”.
Our research suggests that if we think about the risks and impacts in advance and we take proportionate and reasonable steps to address them, we will get better outcomes and we will get innovation, because innovation should be more than simply value extraction in the scenarios that I have set out. We will improve productivity as well. There is increasing evidence from machine learning experts, economists and organisational management that higher levels of involvement will result in better outcomes.
Mr Birtwistle?
Michael Birtwistle: I very much agree with my fellow panellists on those points. If you are thinking about concrete ways to improve what is in the Bill, the high level of protection around automated decision making currently sits in article 22B, which looks at decisions using special category data as an input. Looking at the output, you could also add in decisions that involve high-risk processing, which is terminology already used throughout the Bill. That would mean that, where automated decision making is used for decisions that involve high-risk processing, you would need meaningful human involvement, explicit consent or substantial public interest.
Q
Dr Tennison: The main thing that we have been arguing for is that it should be the wider set of decision subjects, rather than data subjects, who get rights relating to notification, or who can have a review. It is really important that there be notification of automated decision making, and as much transparency as possible about the details of it, and the process that an organisation has gone through in making an impact assessment of what that might mean for all individuals, groups and collective interests that might be affected by that automated decision making.
Q
Dr Tennison: I do not think it is a matter of notifying people about all automated decision making. The Bill suggests limiting that to legally or otherwise significant decisions, so that we have those additional rights only as regards things that will really have an impact on people’s lives.
Q
Dr Tennison: I am not comfortable that they are directed to the right people.
Q
Dr Tennison: Yes.
Anna, did you want to come in on that?
Anna Thomas: The last question, about the threshold, is really important, and it tends to suggest that work should have separate consideration, which is happening all over the world. Last week, Canada introduced its automated decision-making directive and extended it to work; we have been working with Canada on that. Japan has a strategy that deals expressly with work. In the United States there are various examples, including the California Privacy Rights Act, of rules that give work special attention in this context. Our proposal for addressing the issue of threshold is that you should always notify, assess, and do your best to promote positive impacts and reduce negative ones where the decision making affects access to work, termination, pay, or contractual status or terms and, for the rest, where there is significant impact.
Q
Anna Thomas: Yes, absolutely. In our model, we suggest that the impact assessment should incorporate not just the data protection elements, which we say remain essential, but equality of opportunity and disparity of outcome—for example, equal opportunity to promotion, or access to benefits. That should be incorporated in a model that forefronts and considers impacts on work.
Q
Anna Thomas: I would advise very clear additional rights, and a duty to notify in advance what, how and why AI is being used where it has these impacts, and where it meets the threshold that I was just asked about. I would also advise having more consultation throughout design, development and deployment, and ongoing monitoring, because AI changes, and there are impacts that we have not thought about or cannot ascertain in advance.
There should also be a separate obligation to conduct an algorithmic impact assessment. The Bill does nudge in that direction, but it says that there should be an assessment, rather than a data protection impact assessment. We suggest that the opportunity be grasped of clarifying that—at least in the workplace context, but arguably there are lessons more widely—the assessment ought to cover these fundamental aspects, and impacts at work.
Q
Michael Birtwistle: My colleagues could not be here, unfortunately, but they would have been better representatives in that sense.
I want to touch on the equality issue again. A 2019 UN report on the digital welfare state made the point that algorithms repeat existing biases and entrench inequalities. How do we get around that? There are a lot of issues around trust and people’s rights and protections when it comes to this data. On top of those, there is this issue. Does the legislation address that? How can we overcome it?
Dr Tennison: As I have mentioned, there need to be more points in the Bill where explicit consideration of the public interest, including equality, is written into the sets of considerations that organisations, the ICO and the Secretary of State need to take into account when they are exercising their functions. That includes ensuring that public interest and equality are an explicit part of assessments of high-risk processing. That will help to make sure that, in the assessment process, organisations are made to look beyond the impacts on individuals and data subjects, and to look at the whole societal and economic impacts—even the environmental impacts—that there might be from the processing that they are looking to carry out.
Anna Thomas: I agree. To add to what I said before, it would help to require a technical bias audit as well as a wider equality impact assessment. One idea that you may wish to consider is this: in the same way that the public sector sometimes has an obligation to consider the reduction of wider inequalities, you could—well, not have a full private sector model requiring that; that may need to be built up over time. But we could, at the very least, require consideration of the desirability of reducing inequalities of opportunity and outcome as part of determining reasonable and proportionate mitigations in the circumstances; that would be easy to do.
Michael Birtwistle: I agree. There is also a question about institutional capability—ensuring that the institutions involved have the capability to react to the use of these technologies as they evolve. Specifically, it would be great to see the ICO asked in the Bill to produce guidance on how the safeguards in article 22C are to be implemented, as that will have a large effect on how automated decision making works in practice and is built into firms. The powers reserved for Ministers around interpreting meaningful human involvement, and legal and similarly significant effect, will also have a big impact. It would make more sense for those powers to sit with the ICO.
Q
Michael Birtwistle: Yes, if regulators are not properly empowered.
Anna Thomas: I strongly agree, but they could be properly empowered and resourced, and in some instances given extra powers to interrogate or to redress what they have found. In 2020, we advised that there should be a forum, and we are delighted to see the Digital Regulation Cooperation Forum. It could be given additional resources and additional bite, and we would certainly like to see work given prominence in its activities. The forum would be well placed, for example, to provide dedicated cross-cutting guidance on impacts at work.
Dr Tennison: I agree with the other panellists. The only thing I would add is that I think that the involvement of the public will be absolutely essential for moving trust forward in those circumstances.
Q
Great. Ms Thomas, presumably all the automated decisions will be subject to employment law. Would employees have the information they need to appeal decisions and take them to an industrial tribunal?
Dr Tennison: You asked what kind of abuse I am particularly concerned about. I echo some of Anna’s concerns around the work context and what that looks like. We have recently been doing some case studies, which again I can share, and they really bring home the kinds of issues that workers are subject to as automated decision making is rolled out in organisations.
More broadly, though, I am concerned about the gradual drift of reducing trust in the public sphere when it comes to the use of data by Governments and organisations. In some ways, I am more concerned about this leading to people not adopting technology and opting out of data collection because they are worried about what might happen. That would hold us back from the progress and the good uses of data that I would really like to see.
Michael Birtwistle: I agree with that very much. We need to think about past public concern around GP data sharing, contact tracing and the Ofqual exams algorithm. When people see their data being used in unexpected ways, or in ways that make them feel uncomfortable, they withdraw their consent and support for that use, and we as a society lose the benefits that data-driven technology can bring.
Anna Thomas: Employment law and the other laws in that context certainly help in some areas; for example, there is unfair dismissal protection, and redundancy protection under the information and consultation regulations. However, it is a patchwork, and it is not clear. Clarity is needed for businesses, to reassure people at work that the principles in the AI White Paper ultimately apply to their data, and to promote prosperity and wellbeing as widely as possible.
I thank our three witnesses very much indeed; you have all been fantastic. We are very grateful to you for being here. That brings us to the end of our morning session. The Committee will meet again at 2 o’clock, here in the Boothroyd Room, to continue taking oral evidence. We heard from 10 witnesses this morning and will hear from 13 this afternoon.
Ordered, That further consideration be now adjourned. —(Steve Double.)
(1 year, 6 months ago)
Public Bill Committees
Welcome back. We are now on to our fifth witness panel and we will hear from Tom Schumacher, chief privacy officer at Medtronic, who has kindly joined via Zoom, and Jonathan Sellors, legal counsel and company secretary at UK Biobank, who is in the room. We have until 2.25 pm for this panel. Could the witnesses please introduce themselves for the record?
Jonathan Sellors: Good afternoon. I am Jonathan Sellors, general counsel of UK Biobank. For those who may not know, we are the largest globally accessible clinical research resource in the world. We comprise 500,000 UK-based participants, and we make de-identified data available to researchers to conduct clinical research in the public interest.
Tom Schumacher: Thank you so much for inviting me. I am Tom Schumacher, and I work for Medtronic as the chief data and privacy counsel. Medtronic is the world’s largest medical device maker, with 90,000 employees around the world and three manufacturing sites in the UK. We are headquartered in Ireland.
Q
Jonathan Sellors: I am not sure I am the expert on this particular topic, because my experience is more research-based than in IT systems embedded in clinical care.
Tom Schumacher: I am also not as intimately familiar with that issue, but I would say that interoperability is absolutely critical. One of the challenges we experience with our technologies—I assume this is also the case for your health providers—is the ability to have high-quality data that means the same thing in different systems. That challenge can be improved, but it is really a data challenge more than a privacy challenge. That is how I see it.
Q
Jonathan Sellors: I think it is a thoroughly useful clarification of what constitutes research. It is essentially welcome, because it was not entirely clear under the provisions of the General Data Protection Regulation what the parameters of research were, so this is a helpful clarification.
Tom Schumacher: I completely concur: it is very useful. I would say that a couple of things really stand out. One is that it makes it clear that private industry and other companies can participate in research. That is really important, particularly for a company like Medtronic because, in order to bring our products through to help patients, we need to conduct research, have real-world data and be able to present that to regulators for approval. It will be extremely helpful to have that broader definition.
The other component of the definition that is quite helpful is that it makes it explicit that technology development and other applied research constitutes research. I know there is a lot of administrative churn trying to figure out what constitutes research and what does not, and I think this is a really helpful piece of clarification.
Q
Tom Schumacher: Maybe I can give an example. One of the businesses we purchased is a business based in the UK called Digital Surgery. It uses in-body videos to try to improve the surgery process and create technologies to aid surgeons in prevention and care. One of the challenges has been the extent to which using surgery videos to create artificial intelligence and better outcomes for patients counts as research. Ultimately, it was often the case that a particular site or hospital would agree, but it created a lot of churn, activity and work back and forth to explain exactly what was to be done. I think this will make it much clearer and easier for a hospital to say, “We understand this is an appropriate research use”, and to be in a position to share that data according to all the protections that the GDPR provides around securing and de-identifying the data and so on.
Jonathan Sellors: I think our access test, which we apply to all our 35,000 users, is to ensure they are bona fide researchers conducting health-related research in the public interest. We quite often get asked whether the research they are planning to conduct is legitimate research. For example, a lot of genetic research, rather than being based on a particular hypothesis, is hypothesis-generating—they look at the data first and then decide what they want to investigate. This definition definitely helps clear up quite a few—not major, but minor—confusions that we have. They arise quite regularly, so I think it is a thoroughly helpful development to be able to point to something with this sort of clarity.
Q
Jonathan Sellors: The short answer would be yes. I was contacted by NHS England about the wording of some of the consent aspects, some of the research aspects and particularly some of the pseudonymisation aspects, because that is an important wall. Most research conducted is essentially on pseudonymised rather than identifiable data. The way it has been worded and clarified, because it makes an incremental improvement on what is already there in the GDPR, is very useful. I think it is a good job.
Tom Schumacher: Yes, I would say the same. NHS Transformation and the Department for Culture, Media and Sport, particularly Owen Rowland and Elisabeth Stafford, have been very willing to hear points of view from industry and very proactive in reaching out for our feedback. I feel like the result reflects that good co-ordination.
Q
Jonathan Sellors: Yes, I think it is reasonably clear.
What do you mean by that?
Jonathan Sellors: Like any lawyer, if I were asked to draft something, I would probably always look at it and say I could possibly improve it. However, I would actually look at this and say it is probably good enough.
Q
Jonathan Sellors: If I may, can I come back to you on that with a written response, when I have given it slightly further consideration? Would that be okay?
Q
Jonathan Sellors: I think that, with health-related research that is in the public interest, it is relatively straightforward to spot what it is. Most research is going to have some commercial application because most of the pharma, molecules and medical devices are going to be commercially devised and developed. I do not think that the fact that something has a commercial interest should count it out in any way; it is just about looking at what the predominant interest is.
Q
Jonathan Sellors: Right, thank you. I understand.
Tom Schumacher: I concur with what the previous speaker said. In the medical device industry, we really focus on what is considered more traditional research, which fits well within the refined research definition that the Bill contains.
Q
Jonathan Sellors: I do not think I am really the best qualified person to talk about the different Android and Apple operating systems, although we did a lot of covid-related work during the pandemic, which we were not restricted from doing.
Tom Schumacher: I would say that this comes up quite a lot for Medtronic in the broader medtech industry. I would say a couple of things. First, this is an implementation issue more than a Bill issue, but the harmonisation of technical standards is absolutely critical. One of the challenges that we, and I am sure NHS trusts, experience is variability in technical and IT security standards. One of the real opportunities to streamline is to harmonise those standards, so that each trust does not have to decide for itself which international standard to use and which local standard to use.
I would also say that there is a lot of work globally to try to reach international standards, and the more that there can be consistency in standards, the less bureaucracy there will be and the better the protection will be, particularly for medical device companies. We need to build those standards into our product portfolio and design requirements and have them approved by notified bodies, so it is important that the UK does not create a new and different set of standards but participates in setting great international standards.
Q
Jonathan Sellors: I think that it is absolutely right to be concerned about whether there will be issues with adequacy, but my evaluation, and all the analysis that I have read from third parties, particularly some third-party lawyers, suggests that the Bill does not or should not have any impact on the adequacy decision at all—broadly because it takes the sensible approach of taking the existing GDPR and then making incremental explanations of what certain things actually mean. There are various provisions of GDPR—for example, on genetic data and pseudonymisation—that are there in just one sentence. It is quite a complicated topic, so having clarification is thoroughly useful, and I do not think that that should have any impact on the adequacy side of it. I think it is a very important point.
Tom Schumacher: I agree that it is a critical point. I also feel as though the real value here is in clarifying what is already permitted in the European GDPR but doing it in a way that preserves adequacy, streamlines and makes it easier for all stakeholders to reach a quick and accurate decision. I think that adequacy will be critical. I just do not think that the language of the text today impacts the ability of it to be adequate.
Q
Jonathan Sellors: I think that data sharing, of one sort or another, absolutely underpins medical research. You need to be able to do it internationally as well; it is not purely a UK-centric activity. The key is in making sure that the data that you are using is properly de-identified, so that research can be conducted on patients, participants and resources in a way that does not then link back to their health data and other data.
Q
Jonathan Sellors: Let me put it this way: poor-quality research, undertaken in an unfortunate way, is always going to be a problem, but good-quality research, which has proper ethical approval and which is done on data that is suitably managed and collated, is an essential thing to be able to do.
Q
Jonathan Sellors: Approval by the relevant ethics committee.
Q
Jonathan Sellors: I do not think that it is a requirement of this Bill, but it is a requirement of pretty much most research that takes place in the UK.
Q
The Bill defines scientific research as anything that can “reasonably be described as scientific” research. You would see concerns, then, if data was to be shared for research that was carried out outside of ethics committee approvals. I do not want to put words into your mouth, but I am just trying to understand.
Jonathan Sellors: Sure. I think it depends on the nature of the data that you are trying to evaluate. In other words, if you are looking at aggregated or summary datasets, I do not think there is any particular issue, but when you are looking at individual-level data, that has to be suitably de-identified in order for research to be safely conducted.
Q
Jonathan Sellors: There is always a risk, but I think the way it is expressed in the Bill is actually quite measured. In other words, it takes a reasonable approach to what steps can constitute re-identification. There are certain police-related examples whereby samples are found at crime scenes. The individuals can be identified, certainly, if they are on the police database, but if they are not on a reference database, it is extremely difficult to re-identify them, other than with millions of pounds-worth of police work. For all practical purposes, the data is de-identified. Saying that something is completely de-identified is quite difficult.
Q
Jonathan Sellors: I definitely recognise that. That is one of our principal concerns, but usually the identifiers are the relatively simple ones. In other words, you can re-identify me quite easily from my seven-character postcode, my age and my gender. Obviously, when we release data, we make sure not to do that. Releasing quite a big part of my genetic sequence does not make me re-identifiable.
Currently.
Jonathan Sellors: Currently—I accept that.
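Mr Sellors’ point about simple identifiers can be illustrated with a toy linkage attack: a released dataset containing no names is matched to an identified one purely on postcode, age and gender. Everything below is invented data; it is a sketch of the technique, not of any real release.

```python
# Toy re-identification by quasi-identifiers. All records invented.

deidentified = [  # released "anonymous" research data
    {"postcode": "SW1A 1AA", "age": 52, "gender": "M", "condition": "diabetes"},
    {"postcode": "M1 4BT", "age": 34, "gender": "F", "condition": "asthma"},
]

identified = [  # a separate, publicly available identified dataset
    {"name": "A. Example", "postcode": "SW1A 1AA", "age": 52, "gender": "M"},
]

QUASI_IDS = ("postcode", "age", "gender")

def link(released, known):
    """Join the two datasets on the quasi-identifiers alone."""
    hits = []
    for r in released:
        key = tuple(r[k] for k in QUASI_IDS)
        for p in known:
            if tuple(p[k] for k in QUASI_IDS) == key:
                hits.append({"name": p["name"], "condition": r["condition"]})
    return hits

print(link(deidentified, identified))
# -> [{'name': 'A. Example', 'condition': 'diabetes'}]
```

This is why releases strip or coarsen exactly those fields: with the postcode generalised to a district, the join above would no longer produce a unique match.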
Tom Schumacher: I would say a couple of things. It is important to note that the Bill preserves the full array of safeguards in the GDPR around data minimisation, access controls and making sure that you have de-identified the data as much as possible for the purpose you are going to use it for. What our company is quite concerned about is that, without some elements of real-world data, we are not going to be able to eliminate the bias that we see in the system, we are not going to be able to personalise medicine, and we are not going to be able to get our products approved, because our regulating bodies are now mandating that the technology we use is tested against the different attributes that are relevant for that technology.
As an example, there are very few data points that we need for our digital surgery business, but we might need gender, weight and age. The Bill will allow customisation to say, “Okay, what are you going to do to make sure that only two or three data scientists see that data? How are you going to house it in a secure, separate environment? How are you going to make sure that you have security controls around that?” I think the Bill allows that flexibility to try to create personalised medicine, but I do not believe that the Bill opens up a new area of risk for re-identification provided that the GDPR safeguards remain.
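The customisation Mr Schumacher describes—an agreed minimum set of fields, visible only to a named handful of analysts—boils down to data minimisation plus an access check. A minimal sketch of that idea, with invented field names, analysts and records:

```python
# Sketch of data minimisation with a simple access check.
# Fields, analysts and the patient record are all invented.

APPROVED_FIELDS = {"gender", "weight_kg", "age"}    # the study's agreed minimum
APPROVED_ANALYSTS = {"analyst_1", "analyst_2"}      # the "two or three data scientists"

def minimise(record: dict) -> dict:
    """Drop every field the study was not approved to see."""
    return {k: v for k, v in record.items() if k in APPROVED_FIELDS}

def fetch_for_study(user: str, record: dict) -> dict:
    """Release the minimised record only to approved analysts."""
    if user not in APPROVED_ANALYSTS:
        raise PermissionError(f"{user} is not approved for this study")
    return minimise(record)

patient = {"name": "J. Example", "nhs_number": "0000000000",
           "gender": "F", "weight_kg": 70, "age": 48}

print(fetch_for_study("analyst_1", patient))
# -> {'gender': 'F', 'weight_kg': 70, 'age': 48}
```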
Q
Tom Schumacher: In isolation, that would be a risk, but in the full context of the interrelationship between the data owner and controller and the manufacturer, there would be a process by which you would define the legitimate use you are going to use that data for, and that would be something that you would document and would go on your system. I do not believe that using data for political purposes would constitute research in the way that you would think about it in this Bill. Certainly the UK ICO is well regarded for providing useful interpretation guidance. I think that that office would be able to issue appropriate guardrails to limit those sorts of abuses.
Jonathan Sellors: If you look at a scientific hypothesis, it might not be a scientific hypothesis that you like, but it is much better to have it out there in the public domain, where the data that underpins the research can be evaluated by everybody else, who can show whether it is sound and whether it is being conducted appropriately.
Q
Jonathan Sellors: There has to be some element of scientific flexibility, but scientists themselves have to be able to make a decision about what they wish to investigate. The main thing to ensure is that it is transparent—in other words, somebody else can see what they have done and the way in which they have done it, so that if it does come up with a conclusion that is fundamentally flawed, that can be properly challenged.
If there are no further questions, may I thank both of you gentlemen very much indeed for your time this afternoon and for giving us your evidence. It is hugely appreciated. We now move on to the sixth panel.
Examination of Witnesses
Harry Weber-Brown and Phillip Mind gave evidence.
Welcome, gentlemen. We will now hear from Harry Weber-Brown, chief engagement officer at ZILO, and Phillip Mind, director of digital technology and innovation at UK Finance. We have until 2.50 pm for this session. I now invite the witnesses to please introduce themselves to the Committee for the record, starting with Mr Weber-Brown.
Harry Weber-Brown: Thank you very much. My name is Harry Weber-Brown, chief engagement officer for ZILO Technology Ltd, which is a start-up based in London. I have previously worked for the Investing and Saving Alliance. I have much experience in both smart data, which is dealt with in part 3 of the Bill, and digital identity, which relates to digital verification services in part 2.
Phillip Mind: Good afternoon. I am Phillip Mind, director of digital technology and innovation at UK Finance, a trade body representing over 300 organisations in the bank and finance community. Like Harry, my expertise resides more in parts 2 and 3 of the Bill, although I have a few insights into part 1.
Q
Phillip Mind: The banking community is supportive of the Bill, which is enabling of a digital economy. The data protection reforms reduce compliance burdens on business, which is very welcome. The provisions on digital identity are enabling, and we see digital identity as an essential utility for customers in the future. The provisions on smart data extend an open data regime to other sectors. We already have an open banking regime, and we are keen for that to extend to other sectors. It offers real opportunities in terms of innovative products and services, but we would caution the Committee that there is significant cost and complexity in those measures.
Harry Weber-Brown: The Bill is key to retaining the UK’s place as a hub for technical innovation, and in particular for investment in fintech. It is critical also to make sure the UK remains a global leader in data portability. Building on the work that Phillip just mentioned on open banking, which has over 7 million users among both consumers and small and medium-sized enterprises, it is critical that we make sure we are ahead of the competition.
For the financial services sector, the provisions on ID help to reduce costs for things like onboarding and reduce fraud for things like authorised push payments. It also delivers a better customer experience, so you do not have to rummage around to find your passport every time you want to set up a new account or need to verify yourself to a financial service firm.
Smart data is an opportunity for us to extend ourselves as the world leader in open finance, building on the work of not only open banking but the pensions dashboard, which is yet to be launched but is another open finance scheme. The opportunity to widen up and give consumers more control in their ability to share data is critical for the customer, the economy and the financial services industry.
Q
Phillip Mind: In the banking industry we have open banking, which allows customers to choose and consent to allow an authorised third party provider access to their account to provide products and services—access to see the data. It also allows—again, with customer choice and consent—customers to allow a third party provider to make payments on their behalf. That has been hugely enabling: it has driven growth in all sorts of innovative products and services, and in fintech in the UK. As Harry mentioned, there are over 7 million active customers at the moment, but it does come with a cost; it is not a free good. Making that service available has involved cost and complexity.
In extending the provisions to other sectors through secondary legislation, it is really important that we are cognisant of the impacts and the unintended consequences. Many sectors have pre-existing data-sharing arrangements, many of which are commercial, and it is important that we understand the relative costs and benefits and how they fall among different participants in the market. My caution to the Committee and to Government is to go into those smart data schemes with eyes open.
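The consent-then-access pattern Mr Mind describes can be sketched in miniature: nothing is readable until the customer grants a scoped consent, and revoking it cuts access off. The classes, scopes and figures below are hypothetical; real open banking runs on standardised APIs and certified third-party providers.

```python
# Hypothetical sketch of consented third-party access to account data.

from dataclasses import dataclass, field

@dataclass
class Consent:
    customer: str
    third_party: str
    scopes: set = field(default_factory=set)  # e.g. {"read:balances"}
    revoked: bool = False

class Bank:
    def __init__(self):
        self.balances = {"alice": 1234.56}  # invented account data

    def grant(self, customer: str, third_party: str, scopes: set) -> Consent:
        """Record the customer's choice to let a third party in."""
        return Consent(customer, third_party, set(scopes))

    def read_balance(self, consent: Consent) -> float:
        """Third-party access works only while a valid consent stands."""
        if consent.revoked or "read:balances" not in consent.scopes:
            raise PermissionError("no valid consent for balance access")
        return self.balances[consent.customer]

bank = Bank()
consent = bank.grant("alice", "budgeting_app", {"read:balances"})
print(bank.read_balance(consent))  # 1234.56
consent.revoked = True             # the customer withdraws consent
# bank.read_balance(consent) would now raise PermissionError
```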
Q
Phillip Mind: Clauses 62 and 64 make provision for the Secretary of State and Treasury to consult on smart data schemes. We think that those provisions could be strengthened. We see a need for impact assessments, cost-benefit analysis and full consultation. The Bill already allows for a post-implementation review, and we would advise that too.
Harry Weber-Brown: I think the other one to call out is the pensions dashboard, which has been driven out of the Money and Pensions Service. Although it has not actually launched yet, it has brought the life assurance industry onside to develop free access to information. The consumer can see all their pensions holdings in a single place, which will then help them to make better financial decisions.
I think my former employer, the Investing and Saving Alliance, was working on an open savings, investments and pensions scheme. Obviously, that is not mandatory, but this is where the provision for secondary legislation is absolutely imperative to ensure that you get a wide range of firms utilising this. At the moment, it is optional, but firms are still lining up and wanting to use it. There is a commitment within the financial services industry to do this, but having the legislation in place—secondary legislation, in particular—will ensure that they all do it to the same standards, both technical and data, and have a trust framework that wraps around it. That is why it is so important to have smart data.
Q
Harry Weber-Brown: In part 2 or part 3 of the Bill? The digital verification services or smart data?
I will come on to digital verification. Let us focus on smart data, to begin with.
Harry Weber-Brown: On that, Australia is certainly one of the leaders. The consumer has a data right under legislation that enables them to call for information from across a variety of sectors, not just financial services, and to have their information shared in a structured format with a data consumer—the equivalent of a third-party provider in open banking. Things are afoot elsewhere: a lot of work is going on in the States, but less in Europe, interestingly. Legislation is coming through, but I think the big country to watch from our perspective is Australia and what has happened there. Theirs is a more far-reaching approach than ours. That is for the smart data side.
There is a risk that if we do not extend that data right to other financial services, the consumer has a very limited view of what they can actually share. They can share their bank account details and possibly their pensions data as well, but what about their savings and investments, certainly in non-pension type wrappers? Give the consumer a full, holistic view of all their holdings and their debt as well, so that they can see their balance, as it were, and make better financial decisions. That is why we think it is so important to have part 3 of the Bill go through and for secondary legislation to follow behind it.
There is a risk that if we do not do that, the consumer has a very fragmented view. Does that mean that overseas, where it is legislated for, the consumer would have a more holistic view of everything? Would that drive investment overseas, rather than into the UK? As Phillip said, open banking has really heralded a range of fintech providers being able to consume data and provide value-added services on top of that banking data. I think it rebalances the marketplace as well.
Phillip Mind: To build on Harry’s remarks, I think that the real opportunity is for the UK to build a flourishing fintech industry. We have that already; open banking is actually one of our exports. Our way of doing open banking—the standards and the trust framework—has been a successful export, and it has been deployed in other jurisdictions. The opportunity around open data is to maintain that competitiveness for UK fintech when it is trading abroad.
Most of the consequences of extending beyond open banking into other smart data schemes impact UK businesses and consumers. I do not necessarily see that there is a competitiveness issue; it is bounded within the domestic economy.
Q
Harry Weber-Brown: That is a very good question. I did quite a lot of consumer research in my previous capacity, and consumers are initially quite sceptical, asking “Why are you asking me for identity details and things?” You have to explain fully why you are doing that. Certainly having Government support and things like the trust framework and a certification regime to make sure that the consumer knows whom they are dealing with when they are passing over sensitive data will help to build the trust to ensure that consumers will utilise this.
The second part to that is what types of services are built on top of the identity system. If I have the identity verified to an AML—anti-money laundering—standard for financial services, I could use it for a whole suite of other types of activity. That could be the purchase of age-restricted products, or sharing data with my independent financial adviser; it could reduce fraud in push payments, and so on. There is a whole suite of different types of services; you would not be using it just for onboarding. I think the Government support of this under digital verification services, part 2 of the Bill, is critical to make sure it happens.
It is opt-in. We are not saying to people that they have to get an identity card, which obviously is not hugely popular; but if we can demonstrate the value of having a digital identity, with support and trust—with the trust framework and certification with Government—we will not necessarily need to run a full marketing campaign to make sure that consumers use this.
Look at other territories—for example, Norway with Vipps, or Sweden’s BankID. I think about 98% of the population now use ID in a digital format; it is very commonplace. It is really a question of looking at the use cases—examples of how the consumer could utilise this—and making sure they receive utility and value from the setting up and the utilisation of the ID. The ID by itself is not necessarily compelling enough; the point is what you can use it for.
Phillip Mind: Trust and acceptance are key issues, and the Bill lays the legislative foundations for that. We already assert our identity digitally when we open accounts, but we do so on a one-off basis. The challenge is to go from doing so on a one-off basis to creating a digital token that is safe and secure and that allows us to reuse that digital identity. For that to work, that token has to be widely accepted, and that is a really complex strategic challenge, but the Bill lays the foundations.
We will transact digitally more and more; that is for sure. At the moment, we have a consultation, from the Treasury and the Bank of England, on a central bank digital currency. Arguably, that would benefit hugely from a reusable digital identity, but we need to be able to create the token in the right way. It could be enabling for people who have access to a smartphone but do not have a passport or driving licence; it could also build inclusion, in terms of identity. So we are very supportive of a reusable digital identity, but it is a big challenge, and the challenge is gaining trust and acceptance.
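The “reusable token” both witnesses describe can be shown in shape, if not in substance: an identity provider attests to verified attributes once, and any relying party checks the attestation on reuse instead of re-checking documents. The sketch below uses a shared-secret HMAC purely for brevity; real schemes under the trust framework would use public-key signatures from certified providers, and every name here is invented.

```python
# Toy reusable identity token: issue once, verify on every reuse.
# HMAC with a shared secret stands in for a real provider signature.

import hashlib
import hmac
import json

PROVIDER_KEY = b"demo-secret-key"  # stand-in for a provider's signing key

def issue(attributes: dict) -> dict:
    """The identity provider attests to attributes it has verified."""
    payload = json.dumps(attributes, sort_keys=True)
    sig = hmac.new(PROVIDER_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return {"payload": payload, "signature": sig}

def verify(token: dict) -> dict:
    """A relying party checks the attestation before trusting it."""
    expected = hmac.new(PROVIDER_KEY, token["payload"].encode(),
                        hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, token["signature"]):
        raise ValueError("attestation does not verify")
    return json.loads(token["payload"])

token = issue({"over_18": True, "aml_verified": True})  # checked once
print(verify(token))  # reused without showing a passport again
```

One design benefit: the token can carry only the attribute actually needed, such as over_18, rather than a full document’s worth of personal data.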
Q
Harry Weber-Brown: Financial services obviously rely heavily on data to be able to fashion their products accordingly and make them personal, so I think it is critical to have a smart data regime where everything is shared in a single format through what is known as an API—an application programming interface—which is a common way of securely sharing data.
Some of the other use cases from smart data that would benefit business would be things like sharing data around fact find. For example, if someone wants to instruct an independent financial adviser, could they not use this as a way of speeding up the process, rather than having to wait on letters of authority, which are written and take time? Similarly, with pension providers, if I wanted to move from one pension to another or to consolidate things, could we use the smart data to get an illustration of what impact that might have, so that before I ported it over I could see that?
For big financial services firms—well, for all of them—efficiencies are delivered because, as my colleague said, we are using digital as opposed to having to rely on manual processing. As long as the safeguards are put in place, that spawns a whole array of different types of use case, such as with regulatory reporting. If I need to report things to the regulator, could I use smart data provision to do that? That would benefit businesses. A lot of the financial services industry still relies on reporting on Excel spreadsheets and CSV files, so if we can digitise that, it would certainly make it a much more efficient economy.
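The “single format” point is the crux: if every provider exposes holdings in one agreed schema, a consumer-facing service can aggregate them into the holistic view described earlier. The schema, provider names and figures below are all invented for illustration.

```python
# Sketch of aggregating holdings that share one hypothetical schema.

from typing import TypedDict

class Holding(TypedDict):
    provider: str
    product: str      # e.g. "pension", "isa", "current_account"
    balance: float

def fetch_holdings(provider: str) -> list[Holding]:
    """Stand-in for a call to a provider's smart data API."""
    invented = {
        "BankCo": [{"provider": "BankCo", "product": "current_account",
                    "balance": 950.00}],
        "PensionCo": [{"provider": "PensionCo", "product": "pension",
                       "balance": 41200.00}],
    }
    return invented.get(provider, [])

def holistic_view(providers: list[str]) -> float:
    """Print each holding and return the combined balance."""
    total = 0.0
    for p in providers:
        for h in fetch_holdings(p):
            print(f"{h['provider']:10} {h['product']:16} £{h['balance']:,.2f}")
            total += h["balance"]
    return total

print(f"Total: £{holistic_view(['BankCo', 'PensionCo']):,.2f}")
```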
Q
Phillip Mind: A digital identity gives customers more control. One of the issues we face at the moment when we present a passport or driving licence is that we cannot minimise the data on it: the whole document is disclosed even when only a single attribute, such as age, is needed. There is a data minimisation opportunity and benefit.
For businesses and customers, too, identity is a key issue when we transact digitally. There are risks around profiling, but there are real opportunities around anti-fraud as well. Being absolutely clear about who we are transacting with and being able to prove incontrovertibly who we are through a safe and secure token will deliver huge benefits to the economy.
We talked in the previous session about the undoubted benefits, which you have set out clearly. Equally, however, consumers will still want to know what sort of data about them is being used and who has access to it. For example, if a video games maker is profiling the attitudes of players to risk, in order to stimulate them with risk-and-reward opportunities within a game like Fortnite, consumers might understand how that makes their gameplay more interesting. They might consent to that, but they might not necessarily want a financial services provider to have access to that information, because it could create a picture of them that is not flattering.
Harry Weber-Brown: That is a perfectly good challenge. There is a growing part of the industry around consent dashboards. The idea is to put much more control in the hands of the consumer, so that they can see where they have given consent to share data and what data has been shared, while also having the right of revocation and so on. There are technical mechanisms to ensure that consumers are much more empowered to control their data. Certainly the legislation supports that, but technical implementation will sit behind it to ensure that the GDPR is abided by and that smart data facilitates better services for consumers. The technology is the answer, but the smart data will open up the opportunity to make sure that the consumer is protected, while with things like consent dashboards they can take better control of where their data is being shared.
Phillip Mind: The interesting thing about digital identity is that it creates a tether. In the future, you will be able to tether digitalised tokens such as securities or deeds to an identity in a safe way, but you could also tether consent to a digital identity, giving a customer or citizen a more holistic view of what they have consented to and where. As Harry says, for those who have real data literacy issues, we will see intermediaries offering services around consent. Those services exist in other jurisdictions.
I think the Estonian digital ID model works in a very similar way.
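For readers unfamiliar with the consent-dashboard idea described above, the sketch below shows the core mechanics: each grant of consent is recorded against a digital identity, listed back to the user in one place, and honoured on revocation. All names and record shapes are hypothetical; real schemes add authentication, audit and cryptographic protections.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentRecord:
    data_scope: str                    # e.g. "pension-balances"
    recipient: str                     # who the data is shared with
    granted_at: datetime
    revoked_at: Optional[datetime] = None

@dataclass
class ConsentDashboard:
    identity_id: str                   # the digital identity the consents are tethered to
    records: list = field(default_factory=list)

    def grant(self, data_scope: str, recipient: str) -> None:
        self.records.append(ConsentRecord(data_scope, recipient, datetime.now(timezone.utc)))

    def revoke(self, data_scope: str, recipient: str) -> None:
        for r in self.records:
            if r.data_scope == data_scope and r.recipient == recipient and r.revoked_at is None:
                r.revoked_at = datetime.now(timezone.utc)

    def active(self) -> list:
        """The holistic view: everything still consented to, in one place."""
        return [r for r in self.records if r.revoked_at is None]

dash = ConsentDashboard("user-123")
dash.grant("pension-balances", "Example IFA Ltd")
dash.grant("gameplay-profile", "Example Games Inc")
dash.revoke("gameplay-profile", "Example Games Inc")
print([r.data_scope for r in dash.active()])  # ['pension-balances']
```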
Q
Harry Weber-Brown: Part 2 of the Bill sets out the trust framework, which was being developed by the then Department for Digital, Culture, Media and Sport and now comes under the Department for Science, Innovation and Technology. It will give certainty to the marketplace that any firm that wishes to store identity data—what is commonly known as an identity provider—will have to go through a certification regime. It will have to be certified and entered on a register, which means that as a consumer I will know that I can trust that organisation, because it will be following the trust framework and the policies that sit within it. That is critical.
Similarly, if we are setting up schemes with smart data we will need to make sure that the consumer is protected. That will come through in secondary legislation and the devil will be in the detail of the policies underpinning it, in a similar way to open banking and the pensions dashboard.
Further to the previous session, the other thing I would say is that we are talking on behalf of financial services, but parts 2 and 3 of the Bill also refer to other sectors: they apply equally to health, education and so on. If as a consumer I want to take more control of my data, I will want to be able to use it across multiple services and get a much more holistic view not just of my finances, but of my health information and so on.
One area that is particularly developing at the moment is the concept of self-sovereign identity, which enables me as a consumer to control my identity and take the identity provider out of the equation. I do not want to get too technical, but it involves storing my information on a blockchain and sharing my data credentials only when I need to do so—obviously it follows data minimisation. There are evolving schemes that we need to ensure the Bill caters for.
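A brief sketch of the data-minimisation idea behind self-sovereign identity: the holder keeps the full credential and discloses only a derived claim, such as "over 18", rather than the underlying document. Real schemes use cryptographic proofs so the claim can be verified without trusting the holder; this toy version, with invented data, shows only the shape of the idea.

```python
from datetime import date

# A credential as issued: the holder keeps the full attributes privately.
credential = {"name": "A. Example", "date_of_birth": date(1990, 5, 1)}

def derive_over_18(dob: date, today: date) -> bool:
    """Derive the single fact a retailer needs, without revealing the date of birth."""
    age = today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))
    return age >= 18

# Selective disclosure: share only the derived claim, not the document.
claim = {"over_18": derive_over_18(credential["date_of_birth"], date.today())}
print(claim)  # {'over_18': True}: name and birth date stay with the holder
```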
Q
You mentioned digital verification services. Briefly, can you help the Committee to understand who would be providing those services and who would be paying for them? You gave the example of tethering my property or other ownership. Who would be paying in that case? Would I be paying for the rest of my life to keep that data where it is? How do you see it working?
Phillip Mind: Who will provide the services? There is already a growing list of verified providers. There is a current market in one-off digital identity services, and I think many of those providers would step into the reusable digital identity market.
What is the commercial model? That is a really good question, and frankly at this point I do not have an answer. That will evolve, but within the frameworks that are set up—trust schemes, in the jargon—there will be those who provide digital identity services and those organisations that consume them, which could be retailers, financial services providers or banks. It is likely that the relying parties, the organisations that consume the identity services, would pay the providers.
Harry Weber-Brown: But not the individual consumers. If you wanted to open a bank account, and the bank was relying on identity checks provided by a fintech, the bank would pay the fintech to undertake those services.
We have time for a very quick question from Rupa Huq, with very quick answers.
Q
Phillip Mind: We represent more than 300 organisations in the banking and finance community. Some are big banks and some are quite small fintechs, so there is quite a spectrum.
Q
You have 30 seconds to answer.
Phillip Mind: That is a big challenge. It is really important that people are not left behind and that they have the ability to create a kind of digital identity. As a society, we will have to work very hard to enable that. That is a responsibility that falls not on banks, but on other organisations that will help citizens to create these identities.
Thank you very much indeed for your evidence this afternoon and for giving us the benefit of your time. We appreciate it.
Examination of Witness
Keith Rosser gave evidence.
Welcome, Mr Rosser. We have just 15 minutes, until 3.05 pm, for this session. Would you kindly introduce yourself to the Committee for the record?
Keith Rosser: My name is Keith Rosser. I am the chair of the Better Hiring Institute.
Q
Keith Rosser: Employers have been making hiring decisions using digital identity since 1 October, so we are a live case study. The biggest impact so far has been on the speed at which employers are able to hire staff and on the disconnection between where people live and the location of their job. For example, people in a digital identity scheme could apply for work, get a job and validate who they are without ever necessarily having to go and meet the employer. That is really important across the regions: we are opening up job opportunities across the UK, from St Austell to Glasgow, including in urban areas such as West Bromwich and Barnsley, because people are no longer tied to where the employer is. It has had a profound effect already.
We recently looked at a study of 70,000 hires or people going through a hiring process, and 83%—some 58,000—opted to take the digital identity route. They did it in an average time of three minutes and 30 seconds. If we compare that with having to meet an employer and go through a process to provide your physical documents, there is a saving of around a week. If we think about making UK hiring the fastest globally, which is our ambition, people can start work a week earlier and pay taxes earlier, and we are cutting waiting lists and workloads. There is a huge positive impact.
In terms of employers making those hiring decisions, technology is so much better than people at identifying whether a document is genuine and the person is who they say they are. In that case study, we found that 200 of the 70,000 people going through the process had fake documents or fraudulently obtained genuine documents. The question is, would the human eye have spotted that prior to the implementation of digital identity? I am certain that it would not have done. Digital identity is really driving the potential for UK hiring to be a shining example globally.
Q
Keith Rosser: From that 70,000 example, we have not seen evidence yet that public trust has been negatively impacted. There are some very important provisions in the Bill that go a long way towards assuring that. One is the creation of a governance body, which we think is hugely important: there has to be monitoring of standards within the market. The Bill also introduces the idea of certifying companies in the market. That is key, because in this market right now 30% of DVSs—nearly one in three companies—are not certified. The provision to introduce certification is another big, important move forward.
We also found, through a survey, that we had about 25% fewer objections when a user, company or employer was working with a certified company. Those are two really important points. In terms of the provision on improving the fraud response, we think there is a real opportunity to improve what DVSs do to tackle fraud, which I will probably talk about later.
Q
Keith Rosser: I have every reason to believe that organisations not certified will not be meeting anywhere near the standards that they should be meeting under a certified scheme. That appears really clear. They certainly will not be doing as much as they need to do to tackle fraud.
My caveat is that across the entire market, even the certified market, there is a real need to make sure that companies are doing far more to tackle fraud, share data and work with Government. I would say that uncertified is certainly a greater risk, but even with certified companies we must do more to make sure that they are pushed to meet the highest possible standards.
Q
Keith Rosser: Yes. The requirement on DVSs to tackle fraud should be higher than it currently is.
Q
Keith Rosser: Absolutely. I will give a quick example relating to the Online Safety Bill and hiring, which I am talking about. If you look at people getting work online by applying through job boards or platforms, that is an uncertified, unregulated space. Ofcom recently did research, ahead of the Online Safety Bill, that found that 30% of UK adults have experienced employment scams when applying for work online, which has a major impact on access to and participation in the labour market, for many reasons.
Turning the question the other way around, we can also use that example to show that where we do have uncertified spaces, the risks are huge, and we are seeing the evidence of that. Specifically, yes, I would expect the governance body or the certification regime, or both, to really put a requirement on DVSs to do all the things you said—to have better upstream processes and better technology.
Also, I think there is a big missing piece, given that we have been live with this in hiring for eight months: providing better information to the public. At the moment, if I am a member of the public applying for a job and I need to use my digital identity, there is no information for me to look at, unless the employer—the end user—provides me with something up front. Many do not, so I go through the process without any information about what I am doing. It is a real missed opportunity so far, but we can now put that right by making sure that DVSs provide at least basic information to the public about what to do, what not to do, what questions to ask and where to get help.
Q
Keith Rosser: Those are several really good questions. I will take the location example from the other perspective first. At the moment, Home Office policy has not caught up with digital identity, and we are addressing that; there is a real opportunity to put it right. As things stand, one in five work seekers cannot use digital identity to get a job, because they do not have an in-date British or Irish passport or a visa. Those people have to visit the premises of the employer face to face to show their documents, or post their original documents across the UK.
This has really created a second-class work seeker. There are real dangers here, such as an employer deciding to choose person one because they can hire them a week faster than person two. There is a real issue with this location problem. Digital identity could sever the tie to location and allow people more opportunities to work remotely across the UK.
There were really good questions about other information. The Bill has a provision for other data sharing. Again, there is the potential and the opportunity here to make UK hiring the fastest globally by linking other datasets such as HMRC payroll data. Rather than looking at a CV and wondering whether the person really worked in those places, the HMRC data could just confirm that they were employed by those companies.
There is a real opportunity to speed up verification but, as I want to acknowledge and as you have referred to, there is certainly also a risk. Part of our mission is to make UK hiring fairer, not just faster and safer. I want to caution against moving to a degree of artificial intelligence, algorithm-based hiring in which someone is never actually in front of a human, whether by Teams video or in person, and a robot is essentially assessing their suitability for a job. We have those risks, and would have them anyway without this Bill. It is really important that, as we go forward, we build in provisions somewhere to ensure that hiring remains, in some respects, a human-on-human activity, not a completely AI-based process.
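A toy sketch of the dataset-linking idea mentioned above: checking the employers claimed on a CV against an authoritative payroll history. Everything here is hypothetical; a real scheme would involve the candidate's consent and an authorised interface to the payroll data.

```python
# Hypothetical payroll history, as might be confirmed by an authorised source.
payroll_history = {"Acme Ltd", "Globex plc", "Initech Ltd"}

# Employers claimed on the candidate's CV.
cv_claims = ["Acme Ltd", "Hooli UK"]

for employer in cv_claims:
    status = "confirmed" if employer in payroll_history else "unverified"
    print(f"{employer}: {status}")
```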
Mr Rosser, thank you very much indeed for your evidence this afternoon. We are grateful for your time, sir.
Examination of Witnesses
Helen Hitching and Aimee Reed gave evidence.
Welcome, ladies. We have until 3.30 pm for this session. Will the witnesses please be kind enough to introduce themselves to the Committee for the record? Let us start with Helen Hitching.
Helen Hitching: Good afternoon. I am Helen Hitching, Chief Data Officer for the National Crime Agency, and this is my first time in front of a Committee.
Welcome and thank you. Aimee Reed?
Aimee Reed: Hello, everybody. This is also my first appearance in front of a Bill Committee. I am the Director of Data at the Metropolitan Police Service. For my sins, I also volunteer to lead all 43 forces on data; I am chair of the national police data board. I am here today in that capacity as well.
Q
Aimee Reed: It is a big requirement across all 43 forces, largely because, as I am sure you are aware, we are operating on systems of various ages. Many of the technology systems across the policing sector do not have the capacity to log section 62 requirements, so police officers are having to record extra justification in spreadsheets alongside the searches and releases of information that they carry out. So the requirement is a considerable burden across all the forces.
Q
How will the listing of
“detecting, investigating or preventing crime”,
to quote the new definition, as a recognised legitimate interest aid the tackling of serious crime in the UK?
Helen Hitching: Sorry—could you repeat that?
Sure. My understanding of the legislation in front of us is that if the Bill becomes law,
“detecting, investigating or preventing crime”
will be listed as a recognised legitimate interest and therefore be subject to separate, or slightly amended, data rules. How will that change help tackle serious crime in the UK?
Helen Hitching: I think it will bring a level of simplicity across the data protection environment and make sure that we can share data with our policing colleagues and other services in a more appropriate way. It will make the whole environment less complex.
Q
Helen Hitching: Yes, it will aid it. Again, it brings in the ability to put the data protection framework on the same level, so we can share data in an easier fashion and make it less complex.
Q
Helen Hitching: The agency does not believe that those safeguards will be lowered. We will still not be able to share data internationally with countries that do not meet the standards met by the UK. It will provide greater clarity about which regimes should be used and at which point. The standards will not reduce.
Q
Helen Hitching: The agency has had to undertake a test to make sure that there is adequate or, essentially, equivalent protection. That standard is now changing to “not materially lower”, so it will be a lot easier to understand where those protection levels are the same as or not materially lower than the UK’s. It will be simplified a lot.
Q
Aimee Reed: Policing thinks that this will significantly simplify things. It will not reduce the level of oversight and scrutiny placed upon us, which is the right thing. Given that simplicity across the regimes we are under, we are very supportive of the change.
Helen Hitching: Likewise, we are supportive and welcome the simplification. We do note, however, that the Biometrics Commissioner currently has a keen focus on developing technology in a legal manner and consults with the public. We would ask that there remains a focus on that oversight of biometrics, to assure the public that that work remains a priority once the regulation of biometrics transfers to the Information Commissioner’s Office and to make sure that that focus is retained.
Q
Aimee Reed: On balance, it will make things easier. We are retaining the very different sections of the Act under which different organisations operate, and the sections that look to improve joint working across part 3 and part 4 agencies are very welcome. That does not simplify the relationship between bodies under, say, part 2 and part 3, albeit data sharing there is entirely possible. In essence, it is going to get simpler and easier to share data, but without losing any of the safeguards.
Q
Aimee Reed: It is not as easy as we would like it to be, and provision is not made in the Bill to make that easier. There are some discussions about it going into the Online Safety Bill and other areas. It could be easier. We would push harder in the future, but at the moment, getting parity across the other areas and around national security is a focus that we welcome.
Helen Hitching: I want to pick up the point that safeguards are not reducing: it is key for the agency that our safeguards are not being lowered because of this.
Q
Aimee Reed: I will answer that in respect of where we are now in national policing. It would be of considerable benefit if the guidance was clearer that we could share information without having to redact it, certainly pre-charge, to enable better and easier charging decisions—to be honest—within the Crown Prosecution Service. It would also reduce the current burden on officers: you can think about the volume of data they have to hand over, and it can be video, audio, transcripts—it is not just witness statements, as it used to be 20 or 30 years ago. Reducing that burden would be significant for frontline officers and unleash them to be able to do other things.
Q
Aimee Reed: It certainly would. It is not that we cannot do that now; I just think the guidance could be clearer. It would put it into sharper relief if we could release that burden from policing to the CPS and the CPS felt confident that that was within the rules.
Helen Hitching: The agency agrees with that—there would be the same impact.
Q
Aimee Reed: It is not so much about specific datasets; it is about synchronisation and the speed with which you can exchange data that enables you to make better decisions. Because the Data Protection Act is split into three parts, and law enforcement quite rightly has a section all of its own, you cannot utilise data analytics across each of the parts. Does that make sense? If we wanted to do something with Driver and Vehicle Licensing Agency data and automatic number plate recognition data, we could not join together those two large datasets to enable mass analysis, because there would be privacy rights considerations. If we want to search datasets from other parts of the Act, we have to do that in quite a convoluted administrative way, even for data that we can share within law enforcement. It is more about the speed of exchange.
Q
Aimee Reed: It is more with our local partners. I am sure that our partners would say they are equally frustrated by the speed at which they can get large datasets from the police to enable them to make better decisions in their local authorities. That is just how the Act was constructed, and it will remain so. The recent ICO guidance on sharing has made that simpler, but this part of the Bill will not make that synchronisation available to us.
Q
Aimee Reed: It is about getting right the balance between what we do with people’s personal data and how the public would perceive the use of that data. If we just had a huge pot where we put everybody’s data, there would be real concerns about that. I am not suggesting for a second that the police want a huge pot of everybody’s data, but that is where you have to get the balance right between knowing what you have and sharing it for the right purpose and for the reason you collected it in the first place.
Q
Helen Hitching: Sorry—could you repeat that?
Has the balance between sharing and the regulation of biometric data, particularly facial recognition data, been struck in the right way?
Helen Hitching: I do not think facial recognition data is captured.
Aimee Reed: On facial recognition, given that we have deployed it—very high profile—I think that the balance is right. We have learned a lot from the South Wales judgment and from our own technical deployments. The Bill will also highlight how other biometric data should be managed, creating parity and an environment where biometric data that we do not yet have access to or use of is future-proofed in the legislation. That is really welcome.
Q
Helen Hitching: It is difficult for the agency to comment on another organisation’s resources and capabilities. That question should probably be posed directly to them. The Information Commissioner’s Office already deploys resources on issues related to law enforcement data processing, including the publication of guidance. From a biometrics perspective, the casework is moving to the IPC, so from a resourcing perspective I think it would have adequate casework provision and expertise.
Aimee Reed: I echo the comments about expertise, particularly of the Investigatory Powers Commissioner. I think that the expertise exists but, like Helen, whether it has enough resources to cope with the casework I presume is a demand assessment that it will do in response to the Bill.
Q
Aimee Reed: That is a very topical question today. The first thing to say is that I am not sure I agree that this is a large expansion of our access to personal data; I think it is a simplification of the understanding of what we can do as a law enforcement body. All the same safeguards and all the same clear water will be in place between the different parts of the Act.
We did indeed get a “limited” rating on records management, but as I am sure you are aware, we were assessed on three areas, and we got the second highest grading in the other two: the governance and accountability of our data management, and our information risk management. They came out higher.
What have we done since 2021? We have done quite a lot to improve physical and digital records management, with greater focus on understanding what data we hold and whether we should still hold it, starting a review, retention and deletion regime. We now have an information asset register and a ROPA—record of processing activities. The previous commissioner, Cressida Dick, invested a significant amount in data management and a data office, the first in UK policing. The new commissioner, as I am sure you have seen, is very committed to putting data at the heart of his mission, too. We have already done quite a lot.
The Bill will simplify how we are able to talk to the public about what we are doing with their data, while also reassuring them about how we use it. We are in a very different place from where we were 12 months ago; in another 12 months, it will be even more significantly improved. We have just worked with the Open Data Institute to improve how open we will be with our data to the public and partners in future, giving more to enable them to hold us to account. I am already confident that we would not get a rating like that again in records management, just based on the year’s review we have had from the ICO about where we have got to.
Q
Aimee Reed: I wish I had authority across them. I represent them—that is a better way of describing what I do. Am I confident that law enforcement in general has the right investment in this space, across all forces? No, I am not. That is what I am working hard to build with Chief Constable Jo Farrell, who leads in this area for all forces on the DDaT approach. Am I confident that forces recognise that investment in this space is necessary? Absolutely.
Q
Aimee Reed: In line with our own DDaT framework, we are working with the Home Office and other ministerial bodies on what good looks like and how much is enough. I am not sure that anybody has the answer to that question yet, but we are certainly working on it with the Home Office.
Ladies, thank you very much indeed for your time this afternoon. We will let you get back to your crime fighting.
Examination of Witnesses
Andrew Pakes and Mary Towers gave evidence.
We now come to our ninth panel. We welcome Andrew Pakes, who is director of communications and research at Prospect, and Mary Towers, who is a policy officer at the Trades Union Congress. We have until 3.55 pm for this session. I invite the witnesses to introduce themselves to the Committee for the record—ladies first.
Mary Towers: Hi, and thanks very much for inviting the TUC to give evidence today. My name is Mary Towers. I am an employment rights policy officer at the TUC, and I have been leading a project at the TUC looking at the use of AI in the employment relationship for the past couple of years.
Andrew Pakes: Hello, everyone. Thank you for inviting Prospect to give evidence today. My name is Andrew Pakes. I am one of the deputy general secretaries and the research lead for Prospect union, which represents scientific, technical and professional workers. I am also a member of the OECD’s AI expert panel, representing trade unions.
Q
Andrew Pakes: We were already seeing a huge change in the use of digital technology prior to the pandemic. The pandemic itself, not least through all the means that have kept many of us working from home, has transformed that. Our approach as a trade union is to embrace technology. We believe that our economy and the jobs our members do can be made better and more productive through the good deployment of technology to improve jobs.
We also think there is a downside to it all; everything that needs to be weighed and balanced is in that. Alongside the advance in innovation and technology that has brought benefits to the UK, we have seen a rise in the darker or less savoury side: namely, the rise of surveillance software; the ability of software to follow us, including while working from home, to micromanage us and to track people; and the use of technology in performance management—so-called people analytics or HR management—which is largely an unregulated area.
If you ask me which legislation this should sit in, I would probably say an employment-type Bill, but this is the legislation we have and the Government’s choice. We would definitely like to see checks and balances at least retained in the new legislation compared with GDPR, but maybe they should be enhanced to ensure that there is some form of social partnership and that working people have a say over how technology is introduced and implemented in their workspaces.
Q
Andrew Pakes: There is increasing evidence that while technology has allowed many of us to remain connected to our workspaces—many of us can now take our work anywhere—the downside is that our work can follow us everywhere. It is about the balance of digital disconnection and the ability to switch off from work. I am probably preaching to the wrong crowd, because MPs are constantly on their phones and other technology, but many of us are able to put that away, or should do, because we are contracted workers and have a different relationship with our workplace in terms of how that balance is struck. We very much focus on wellbeing and on information and consultation, ensuring that people are aware of the information that is collected on us.
One of the troubling factors that we and the TUC have picked up is that consistently, in opinion polls and research, working people do not have confidence in or knowledge of what level of data is being collected and used on them. When we see the increasing power of technology through AI and automated decisions, anxiety in the workplace is best allayed by transparency, in the first place, and, we would obviously argue, by a level of social partnership and negotiation over how technology is introduced.
Q
Andrew Pakes: Absolutely. What strikes me about the legislation you are considering is that just about all our major competitors—who are more productive and more advanced, often in innovation, including the United States—are choosing a path of greater scrutiny and accountability for AI and automated decision making. There is a concern that in this legislation we are taking an alternative path that makes us stand out in the international economy, which is about diluting existing protections we have within GDPR to a lower level. That raises concerns.
We have particular concerns about automated technology, but also about the clauses reducing the powers around data protection impact assessments. We think the risk is that the legislation could open the back door to an increase in dodgy surveillance and other forms of software coming into the UK market. I am worried about that for two reasons: first, because of the impact it has on individual workers; and secondly, because most of this technology—we have been part of a project that has tracked over 500 different surveillance software products currently on the international market—is designed largely for a US or Chinese market, with little knowledge of how it is being used.
What we know from the existing DPIA arrangements is that there is a brake in the current rules: they ensure that employers consult, and check where their products are taking their data from and what they have stored. Diluting that risks leaving us unsure where that data is being used and unsure of the power of this technology, and working people then end up with a worse deal than they currently have.
Q
Mary Towers: On the contrary, we would say that the Bill in fact reduces the collective rights of workers, particularly in relation to data protection impact assessments. As Andrew has mentioned, at the moment the right to a data protection impact assessment involves an obligation on an employer to consult workers or their representatives. That is an absolutely key tool for trade unions to ensure that worker voice is represented when new technologies are introduced at work. Also missing from the Bill is the ability of trade unions to act as representatives for data subjects in a collective way. We say that that could be added and would be an important role for unions to take on.
Another aspect missing from the Bill, which we say is a hugely missed opportunity, is a right for workers to their data that matches the right employers have over worker data. Once workers had that right, they could collectivise their own data, which would enable them, for example, to pick up on discriminatory patterns at work, or on problems with equal pay or the gender pay gap. We say that that right to collectivise data and redress the imbalance of power over data at work is really important.
The Bill misses entirely the opportunity to introduce those kinds of concepts, which are actually vital in the modern workplace, where data is everything. Data is about control; data is about influence; data is the route that workers have to establish fair conditions at work. Without that influence and control, there is a risk that only one set of interests is represented through the use of technology at work, and that technology at work, rather than being used to improve the world of work, is used to intensify work to an unsustainable level.
Q
Mary Towers: Yes. This is something that Andrew’s union, Prospect, has been really active in. It has produced some absolutely brilliant guidance that looks in detail at the importance of the process of data protection impact assessments and rolled out training for its trade union reps. Again, several of our other affiliates have undertaken that really important work, which is then being rolled out into the workplace to enable reps to make good use of that process.
I will, however, add the caveat that I understand from our affiliates that there is a very low level of awareness among employers about that obligation, about the importance of that process and about exactly what it involves. So a really important piece of awareness-raising work needs to be done there. We say it is vital to build on the existing rights in the UK GDPR, not dilute or remove them.
Q
Andrew Pakes: We would assert that high risk under the GDPR is addressed in, I think, recital 39; I will correct that if I have picked the wrong one. It talks about high risk as decisions that can have a material or non-material impact on people. We now have software, algorithms and automated decisions that can hire and fire us—we have examples of that—and can decide who deserves a promotion or who should be disciplined; that information can be used to track individuals and to decide whether someone is a good or bad worker. We would assert that that is high risk. Anything that can affect your standing in your workspace or your contractual relationship, which is essentially what employment is, or that has an impact on the trust and confidence the employer has in you and, equally, your trust and confidence in the employer, meets a very clear definition of high risk.
What is important about the existing UK GDPR is that it recognises the nature of high risk but, secondly, it recognises that data subjects themselves must be consulted and involved, either directly or, where that is not practicable, through their representatives. Our worry is that the legislation as tabled dilutes that and opens up the risk of bad practice.
Q
Mary Towers: The right to make a data subject access request is, like the right to a DPIA, an absolutely crucial tool for trade unions in establishing transparency over how data is being used. It provides a route for workers and unions to get information about what is going on in the workplace, how technologies operate and how they are operating in relation to individuals. It is a vital tool for trade unions.
What we are concerned about is that the new test specified in the Bill will give employers very broad discretion to decide when they do not have to comply with a data subject access request. The term “vexatious or excessive” is a potential barrier to exercising the right of access, and gives employers a lot of scope to say, for example, “Well, look, you have made a request several times. Now we are going to say no.” However, there may be perfectly valid reasons why a worker might make several data subject access requests in a row: one set of information that is revealed may lead a worker to conclude that they need to make a different type of access request.
We say that it is really vital to preserve and protect the right for workers to access information. Transparency as a principle goes, again, to really important issues. For example, if a technology is operating in a discriminatory way at work, how does a worker get information about that technology and about how the algorithm operates? Data subject access requests are a key way of doing that.
Q
Andrew Pakes: “If we get this right” is doing a lot of heavy lifting there; I will leave it to Members to decide the balance. That should be the goal. There is a wonderful phrase from the Swedish trade union movement that I have cited before: “Workers should not be scared of the new machines; they should be scared of the old ones.” There are no jobs, there is no prosperity and there is no future for the kind of society that our members want Britain to be that does not involve innovation and the use of new technology.
The speed at which technology is now changing, and its power compared with previous periods of economic change, make us believe that there has to be a good, robust discussion about the checks and balances in the process. We have seen in wider society—whether through A-level results, the Post Office or other things—that the detriment to the individuals impacted is significant if legislators get that balance wrong. I agree with the big principle and I will leave you to debate it, but we would certainly urge that the checks and balances be even-handed, not one-sided.
Mary Towers: Why does respect for fundamental rights have to be in direct conflict with growth and innovation? There is not necessarily any conflict there. Indeed, in a workplace where people are respected, have dignity at work and are working in a healthy way, that can only be beneficial for productivity and growth.
Q
Andrew Pakes: That is the first base. The power of technology is changing so quickly, and the informal conversations we have every day with employers suggest that many of them are wrestling with the same questions that we are. If we get this legislation right, it is a win-win when it comes to the question of how we introduce technology in workspaces.
You are right to identify the changing nature of work. We would also identify people analytics, or the use of digital technology to manage people. How we get that right is about the balance: how do you do it without micromanaging, without invading privacy, without using technology to make decisions without—this is a horrible phrase, but it is essentially about accountability—humans in the loop? Good legislation in this area should promote innovation, but it should also have due regard to balancing how you manage risks and reduce harms. That is the element that we want to make sure comes through in the legislation in its final form.
Q
Andrew Pakes: Absolutely. Let me give you a quick example of one piece of technology that we have negotiated in some areas: GPS tracking. It might be old technology, compared with many things that you are looking at. We represent frontline workers who often work alone, outside, or in spaces where their work could be risky. If those people cannot answer their radio or phone, it is in the legitimate interests of all of us to see where they are, in case they have had an accident or are in a dangerous situation. We can see a purpose to that technology. In negotiation with employers, we have often said, “This is good technology for keeping people safe, but we are not happy with it being used in performance reviews.” We are not happy with people saying, “I am sorry, Mr Collins, but you seem to spend a lot of time in the same café each lunch time.”
The issue is not the technology, but its application. Technology that is used to increase safety is very good, but the risk is that it will be used to performance-manage people; employers may say, “You are not doing enough visits,” “You aren’t working fast enough,” or, “You don’t drive fast enough between jobs.” We need balance and control, as opposed to ruling out technology that can keep people safe and well.
Q
Andrew Pakes: From my perspective, yes.
Mary Towers: The TUC has red lines relating to the use of these types of technologies. One is that we simply should not have technologies at work that are not transparent and that operate in a way that people do not understand. The principle of explainability is really important to us. People need to understand when the technologies are operating, and how they operate in relation to them. On top of that, it is absolutely vital that discriminatory data processing does not take place. The example that you gave from the gig economy is potentially of a discriminatory pay calculation—of an algorithm that might be calculating different rates of pay for individuals who are carrying out exactly the same work. The algorithm is potentially replicating existing inequalities in pay that are rooted in gender or race.
Q
Mary Towers: Yes. Drivers are a good example. People drive a certain distance to pick people up or deliver items. Even when the driving time is exactly the same, people may be paid different rates, because the algorithm will have worked out how long certain groups of people are likely to wait before they accept a gig, for example. I emphasise that, in our view, those sorts of issues are not restricted to the gig economy; they spread way beyond it, into what one might consider to be the far more traditional professions. That is where our red lines are. They relate to transparency, explainability, non-discrimination and, critically, worker and union involvement at each stage of the AI value chain, including in the development of that type of app—you mentioned development. Unless the worker voice is heard at development stage, the likelihood is that worker concerns, needs and interests will not be met by the technology. It is a vital principle to us that there be involvement of workers and unions at each stage of the AI value chain—in development, application and use.
Q
The Minister talked about the need for growth, which has been sadly lacking in our economy for the last 13 years. Obviously, technology can make huge improvements to productivity for those in the workforce. Mr Pakes, as someone whose members are involved in technology, scientific and IT organisations, I wonder whether you would agree with this, which comes from my experience in the diffusion of technology. Is it possible to get the best from technology in an organisation or company without the people who will be using it, or the people on whom it will be used, being an active part of that diffusion of technology, and understanding and participating in its use?
Andrew Pakes: Absolutely. That has always been how productivity has improved or changed the shop floor, in effect. If you are asking, “What problems are you using technology to solve?”, that question may well be better asked by the people delivering the product or service than by the vendor selling the software, whether old or new technology. I encourage the Committee to look at the strong evidence among our competitors who rate higher than the UK in productivity and innovation, where higher levels of automation in the economy are matched by higher levels of worker participation. Unions are the most common form, but it can often be works councils or, in small businesses, co-design and collaboration. We see that in the social partnership model: the doers, who identify and solve problems, are the people who make it work.
We have good examples. We represent members in the nuclear sector who are involved in fusion, small modular reactors or other technology, where the employer-union relationship is critical to the UK’s intellectual property and the drive to make those successful industries. In the motor industry and other places where the UK has been successful, we can see that that sense of social partnership has been there. We have examples around using AI or the monitoring of conversations or voices. Again, I mentioned GPS tracking, but in safety-critical environments, where our members want to be kept safe, they know that technology can help them. Having that conversation between the workforce and the employer can come up with a solution that is not only good for our members, because they stay safe and understand what the safety regime is, but good for the employer, because days are not lost through illness or accidents. For me, that sense of using legislation like this to underpin good work conversations in the data setting is what the mission of this Bill should be about.
Q
Andrew Pakes: We think there should be a higher bar, because of the contractual nature. Whether it is self-employed workers contracting for a piece of work or an employment relationship, there is a fundamental difference in our view between my individual choice to go online and enter my data into a shop, because I want to be kept appraised of when the latest product is coming out—it is my free choice to do that—and my being able to consent in an employment relationship about how my data is used. As Mary said, the foundation stone has to be transparency on information in the first place. Beyond that, there should be negotiation to understand how that data is used.
The critical point for us is that most companies in the UK are not of a size where they will be developing their own AI products—very few will be; we can probably name a couple of them. Most companies using automated decisions or AI will be purchasing that from a global marketplace. We hope many of them will be within certain settings, but we know that the leaders in this tend to be the Chinese market and the US market, where they have different standards and a range of other things. Ensuring that we have UK legislation that protects that level of consent and that redresses that power balance between workers and employers is a critical foundation to ensuring that we get this right at an enterprise level.
Q
Andrew Pakes: We would like to see more. We are worried that the current legislation, because of things such as the changes to DPIAs, drops the level of standards, which means that the UK could end up trading on a lower standard than other countries. That worries us.
Mary Towers: We are also concerned about the change to the test for international data transfers, which might make the requirements less restrictive. There is a change from adequacy to a more risk-based assessment process in terms of international data transfers. Again, we have very similar concerns to Andrew about the use of technologies rooted in international companies and the inevitable international transfers of data, and workers essentially losing control over and knowledge of what is happening with their data beyond the workplace.
I would also like to make a point about the importance of transparency of source code, and of ensuring that international trade deals do not restrict that transparency, meaning that workers cannot access information about source code once data and AI-powered tools are rooted in other countries.
Q
Mary Towers: I will give my statistics very quickly. Our polling revealed that approximately 60% of workers perceived that some form of monitoring was taking place in their workplace. The CEO of IBM told Bloomberg last week that 30% of non-customer facing roles, including HR functions, could be replaced by AI and automation in the next five years.
A recent report from the European Commission’s Joint Research Centre—the “Science for Policy” report on the platformisation of work—found that 20% of German people and 35% of Spanish people are currently subject to algorithmic management systems. Although that is obviously not UK-based, it gives a very recent insight into the extent of algorithmic management across Europe.
Andrew Pakes: And that matches our data. Around a third of our members say that they are subject to some form of digital monitoring or tracking. That has grown, particularly with the rise of hybrid and flexible working, which we are in favour of. This is a problem we wish to solve, rather than something to stop, in terms of getting it right.
Over the past two years, we have increasingly seen people being performance managed or disciplined based on data collected from them, whether that is from checking in and out of buildings, their use of emails, or not being in the right place based on tracking software. None of the balances we want should restrict the legitimate right of managers to manage, but there needs to be a balance within that. We know that using this software incorrectly can micromanage people in a way that is bad for their wellbeing.
The big international example, which I will give very quickly, is a product like Microsoft’s—a global product that employers will buy. My work computer has Office 365 on it; employers get it on day one. The trouble with these big products is that, over time, they add new products and services. Microsoft, for example, brought in a productivity score, which could tell managers how productive and busy their teams were. They rowed back on that, but we know that with these big, global software products—this is the point of DPIAs—it is not just a matter of consultation on day one.
The importance of DPIAs is that they stipulate that there must be regular reviews, because we know that the power of this technology transforms quickly. The danger is that we make life miserable for people who are good, productive workers and cause more problems for employers. It would be better for all of us to solve it through good legislation than to arm up the lawyers and solve it through the courts.
I am afraid that we are subject to chronological monitoring, so we must bring this session to an end. I thank our two representatives very much indeed for their evidence this afternoon; we are grateful for your time. We will now move on to our 10th panel.
Examination of Witnesses
Alexandra Sinclair, Ms Laura Irvine and Jacob Smith gave evidence.
Welcome to the witnesses in our 10th panel. Thank you for your time this afternoon. We will hear from Alexandra Sinclair, a research fellow at the Public Law Project; Laura Irvine, via Zoom, the convener of the privacy law sub-committee at the Law Society of Scotland; and Jacob Smith, the UK accountability team leader at Rights and Security International. We have until 4.25 pm for this session. Would the witnesses please be kind enough to introduce themselves to the Committee for the record, starting with those in the room?
Alexandra Sinclair: Thank you to the Committee for inviting me. My name is Alexandra Sinclair and I am a research fellow at the Public Law Project. The Public Law Project is an access to justice charity. We help people to seek redress for unfair or unlawful decisions made by public authorities. I am also a doctoral researcher at the London School of Economics where my research focuses on automated decision making.
Jacob Smith: My name is Jacob Smith. I am the UK accountability team leader at Rights and Security International, a London-based charity aimed at the intersection between national security and human rights, which tries to ensure that when Governments take pledges in the name of national security, they comply with human rights. I am also an associate lecturer in international law, privacy and data governance at the University of Surrey.
Ms Irvine: I am Laura Irvine. I am the convener of the privacy law sub-committee at the Law Society of Scotland. My day job is head of regulatory law at Davidson Chalmers Stewart—a Scotland-based law firm. I have been working in the field of data protection law for the past 10 years, so pre-GDPR and obviously, more recently, in a post-GDPR world.
Q
Alexandra Sinclair: Thank you for the question. In order for the public to have trust and buy-in to these systems overall, so that they can benefit from them, they have to believe that their data is being used fairly and lawfully. That requires knowing which criteria are being used when making a decision, whether those criteria are relevant, and whether they are discriminatory or not. The first step to accountability is always transparency. You can know a decision is fair or lawful only if you know how the decision was made in the first place.
Q
Alexandra Sinclair: Currently the Government have their algorithmic reporting transparency standard—I think I have got that right; they keep changing the acronym. Currently on that system there are about six reports of the use of automated decision-making technology in government. The Public Law Project decided to create a parallel register of the evidence that we could find for automated decision making in government. Our register includes over 40 systems in use right now that involve partly automated decisions about people. It would be great if the Government themselves were providing that information.
Q
“There are clear benefits to organisations, individuals and society in explaining algorithmic decision-making”
in the public sector. Do you think that measures in the Bill achieve that? Do they unlock benefits and explain the Government’s algorithmic decision making to the public?
Alexandra Sinclair: No, and I think they do not do that for three reasons, if I have the time to get into this. The changes to subject access requests, to data protection impact assessments and to the prohibition on article 22 are the key issues that we see. The reason why we are particularly worried about subject access requests and data protection impact assessments is that they are the transparency provisions. They are how you find out information about what is happening. A subject access request is how you realise any other right in the Bill. You can only figure out if an error has been made about your data, or object to your data, if you know how your data is being used in the first place.
What we are worried about with the Bill is that currently you have an almost presumptive right to your data under a subject access request, but the Bill changes the standard from the current “manifestly unfounded or excessive” to “vexatious or excessive”. It also sets out a whole load of factors that data controllers are allowed to take into account when declining your request for your own data. Furthermore, under the proposal in the Bill they do not have to give you the reason why they declined your request. We think that is really problematic for individuals: there is an information asymmetry, and it is going to be really difficult for you to prove that your request was not vexatious or excessive if you do not even know why it was denied in the first place.
If we think about some examples that we have been talking about in Committee today: in a lot of the Uber and Ola litigation, where individuals were able to show that their employment rights had been infringed, they found out about that through subject access requests. Another example is the Metropolitan police’s gangs matrix. The Information Commissioner’s Office did a review of that matrix and found that the system did not even clearly distinguish between victims and perpetrators of crime; the only way for individuals to access the matrix and check whether the information held on them is accurate is through a subject access request. That is our first concern with the Bill.
Our second concern is the changes to data protection impact assessments. The first thing to note is that they already apply only in high-risk processing situations, so we do not think they are an undue or onerous burden on data controllers, because they are already confined in scope. What a data protection impact assessment does—this is what we think is beneficial about it—is not to act as a brake on processing, but to force data controllers to think through the consequences of processing operations. It asks data controllers to think, “Where is that data coming from? What is the data source? Where is that data being trained? For what purpose is that data being used?” The new proposal in the Bill significantly waters down those obligations and means that, essentially, the only requirement is to account for the purposes of the processing. So instead of explaining how the data is being used, you are only required to state the purpose.
We think that has two problems. First, data controllers will not be thinking through all the harms and consequences before they deploy a system. Secondly, if individuals affected by those systems want to get information about how their data was processed and what happened, there will be a lot less information on that impact assessment for them to assess the lawfulness of that processing.
My final critique of the Bill is this. We would say that the UK is world-leading in terms of article 22—other states are certainly looking to the UK—and it is a strange time to be looking to roll back protections. I do not know whether Committee members have heard about the Robodebt scandal that Australia recently experienced, on which there is a royal commission at the moment. In that case, a solely automated debt discrepancy system ended up making over 500,000 incorrect decisions, telling people that they had committed benefit fraud when they had not. Australia is having to pay millions of dollars in compensation to those individuals and to deal with the human cost of that decision. The conversation in Australia right now is, “Maybe we should have article 22. Maybe this wouldn’t have happened if we had had a prohibition on solely automated decision making.” When other states are looking to beef up their AI protections, we should think carefully before rolling ours back.
Q
Jacob, what measures do you think should be in place to ensure that data protection legislation balances the need to protect national security with the need to uphold human rights? Does the Bill strike the right balance?
Jacob Smith: Thanks for the question. To take the second part first, we argue that the Bill does not strike the right balance between protecting national security and upholding data and privacy rights. We have three main concerns with how the Bill sets out that balance at the moment, and they come from clauses 24 to 26.
We have this altered regime of national security certificates for when law enforcement is taking measures in the name of national security, and we have this new regime of designation notices. When law enforcement and the security services are collaborating, the notices allow the law enforcement body working in that collaboration to benefit from the more relaxed rules that generally apply only to the intelligence services.
From our perspective, there are three main concerns. First, we are not quite sure why these amendments are necessary. Under human rights law, for an interference with somebody’s data or privacy rights to be lawful, it needs to be necessary, and that is quite a high standard. It is not something akin to it being more convenient for us to have access to this data, or more efficient for us to have access to this data; it has to meet a high standard of strict necessity. Looking through the Second Reading debate, the impact assessment and the European convention on human rights analysis, there is no reference to anything that would be akin to necessity. It is all, “It would be easier for law enforcement to have these extra powers. It would be easier if law enforcement were potentially able to use people’s personal data in more ways than they are at the moment.” But that is not the necessity standard.
The second concern is the lack of safeguards in the Bill. Another thing that human rights law—particularly article 8 of the ECHR—focuses on is the necessity of having additional safeguards to prevent the misuse of legislation that allows public bodies to interfere with people’s privacy rights. As the Bill stands, the safeguards where national security certificates and designation notices are in place are very weak. There is an opportunity, at least on the face of the Bill, for both those measures to be challenged before the courts. However, the issue is that the Secretary of State has almost a monopoly over deciding whether those notices and certificates get published. So although on the face of the Bill an individual may be able to challenge a national security certificate or a designation notice that has impacted them in some way, in practice they will not be able to do that if they do not know that it exists.
Finally, one encompassing issue is the expansive powers for the Secretary of State. One thing that we advocate is increased independent oversight. In the Bill, the Secretary of State has an extremely broad role in authorising law enforcement bodies to process personal data in ways that would otherwise be unlawful and that go further than the existing regimes under the Data Protection Act 2018. Those are our three broad concerns in that regard. Ultimately, we do not see that the right balance has been struck.
Q
Ms Irvine: We have concerns about the proposed changes and their potential impact on the independence of the Information Commissioner. I was able to listen to John Edwards speaking this morning, and I noted that he did not share those concerns, which I find surprising. The ICO is tasked with producing statutory codes of conduct, which are incredibly useful for my clients and for anyone working in this sector. The fact that the Secretary of State can, in effect, overrule these is concerning, and it must be seen as a limit on the Information Commissioner’s independence.
That leads to a concern that we have in relation to the adequacy decision that is in place between the EU and the United Kingdom. Article 52 of the GDPR states very clearly that a supervisory authority must have complete independence. The provisions relating to the independence of the Commissioner are therefore of concern to us—the potential for interference by the Secretary of State in law is enough to undermine that independence.
Alexandra Sinclair: We would just say that it is not typical for an independent regulator to have its strategic objectives set by a Minister, particularly without any requirement to consult. We consider that the ICO, as the subject matter expert, is probably best placed to set those priorities.
Jacob Smith: From our perspective, the only thing to add is that one way to improve the clauses on national security certificates and designation notices would be to give the ICO an increased role in oversight and monitoring, for instance. Obviously, if there are concerns about its independence, we would want to consider other mechanisms.
Q
Ms Irvine: Certainly. There are terms that have been used in data protection law since the 1984 Act. They were used again in the 1998 Act, echoed in the GDPR and included in all the guidance that has come from the Information Commissioner’s Office over the years. In addition, there is case law that has interpreted many of those terms. Some of the proposed changes in the Bill introduce unexpected and unusual terms that will require interpretation. Even when we have guidance from the Information Commissioner, that guidance is sometimes not as helpful as interpretation by tribunals and courts, which is pretty sparse in this sector. The number of cases coming through the courts is limited—albeit that there is a lot more activity in the sector than there used to be. Introducing new terms simply presents a lot more questions and uncertainty.
For my business clients, that is a great difficulty, and I certainly spend a lot of time advising clients on how I believe a particular phrase will be interpreted, because I have knowledge of how data protection law works in general. That is based on my experience of acting for businesses and organisations, particularly in the third sector. Smaller bodies will often be hampered by a lack of knowledge and expertise, and that is the difficulty of introducing into legislation brand-new terms that are not familiar to practitioners, far less to the organisations asked to implement the changes.
Q
Ms Irvine: I expect that you have heard a lot of warnings about safety. I echo what Alexandra said earlier about the removal of the right not to be subject to automated decisions taken by organisations. That concerned us, in a society where such decision making is happening more and more. The particular example that we gave came from a study carried out by the Equality and Human Rights Commission, which looked particularly at decision making in local authorities; at how AI and algorithms were being used to take decisions without enough transparency; and at whether individuals had the right to challenge those decisions—a right that stems from the transparency that is built in. The challenge for any organisation using automated decision making—particularly in the public sector, I would submit, where the impact can be extremely significant, especially if we are talking about benefits—is making sure that the organisation understands what the technology is doing, explains it to individuals and gives them the right to object.
The changes in the Bill relax the restrictions on automated decision making and allow that to happen almost as a default, with safeguards as an add-on, whereas article 22 as currently drafted provides a right not to have automated decisions taken about an individual unless certain circumstances apply. To echo what Alexandra said, when more and more decisions are being made automatically without a human intervening, and certainly without a human intervening at the appropriate stage to prevent damage or harm to individuals, it would absolutely seem like the wrong time to make these changes and relaxations to the regime.
You have all been superstars in our 10th panel. Thank you very much indeed for the evidence you have given this afternoon. We will now move on to the next panel.
Examination of Witness
Alex Lawrence-Archer gave evidence.
We now come to our 11th and final panel. We are pleased to welcome Alex Lawrence-Archer, who is a solicitor for AWO. We have until 4.40 pm for this session. Alex, will you please introduce yourself to the Committee for the record?
Alex Lawrence-Archer: Hi, I am Alex Lawrence-Archer. I am a solicitor and I litigate data rights cases at AWO. We were also instructed by Reset to help it to formulate its written evidence to the Committee, which hopefully you have received in the last couple of days.
Q
Alex Lawrence-Archer: There is a group of changes in the Bill that, perhaps in ways that were unintended or at least not fully thought through, quite seriously undermine the protection of individuals’ privacy and data rights. A few of the most concerning are the change to the definition of personal data, recognised legitimate interests, purpose limitation and the changes to the test for the exercise of data subject rights—I could go on. You will have heard about many of those today. It amounts to an undermining of data rights that seems out of proportion to the relatively modest gains in reduced bureaucracy for data controllers.
Q
Alex Lawrence-Archer: It is quite difficult to predict, because it is complicated, but it is foundational to the regime of data protection. One of the issues is that in seeking to relieve data controllers of certain bureaucratic requirements, we are tinkering with these really foundational concepts such as lawful basis and the definition of personal data.
Two things could happen, I think. Some quite bad-faith arguments could be run to take quite a lot of processing outside the scope of the data protection regime. Although I doubt that those arguments would succeed, there is an additional issue; it is quite complicated to explain, but I will try. If it is unlikely but possible that an individual might be re-identified from a pseudonymised dataset—it could happen if there were a hack, say, but it is unlikely—that processing would not, as the Bill is drafted, benefit from the protection of the regime. The data would not be considered personal data, as it would not be likely that the individual could be identified from the dataset. That is a real problem, because pseudonymisation is very common with large datasets. There are real risks there that would not be dealt with.
Q
Alex Lawrence-Archer: Under the current regime, that is a bit like asking, “How long is a piece of string?” It can take quite a long time. There are certain practices that the ICO follows, such as requiring individuals to complain to the controller first. Some controllers are good and quick, but some are not. You might have a lot of back and forth about data access at the beginning with one controller, while another might hand over your data really quickly. However, you could be looking at anything up to, say, 10 to 12 months.
Q
Alex Lawrence-Archer: Yes. You have heard from lots of people about the changes to the standard to be applied when any of the rights in chapter 3 are exercised by a data subject, and that includes the right of access. I think it is very likely that many more exercises of the right of access will be refused, at least initially. I think there will be many more complaints about the right of access and there is likely to be satellite litigation about those complaints as well, because you cannot proceed in finding out what has gone on with your data and rectify a problem unless you have access to the copies of it.
So, what you might find in many cases is a two-stage process whereby, first, you must resolve a complaint, maybe even a court case, about your right to access the data and then, and only then, can you figure out what has actually been going on with it and resolve the underlying unlawfulness in the processing. Effectively, therefore, it is a doubling of the process for the individual.
Q
Alex Lawrence-Archer: The new definitions, particularly the list of factors to be taken into consideration in determining whether the test is met, provide a lot of breathing room for controllers, whether or not they have good intentions, to make arguments that they do not need to comply with the right of access. If you are looking not to comply or if you have an incentive not to, as many controllers do, that does not necessarily mean that you are acting in bad faith; you might just not want to hand over the data and think that you are entitled not to do so. If you are looking not to comply, you will look at the Act and see lots of hooks that you can hang arguments on. Ultimately, that will come back to individuals who are just trying to exercise their rights and who will be engaged in big arguments with big companies and their lawyers.
Q
Alex Lawrence-Archer: The age-appropriate design code was a real success for the UK in terms of its regulation and its reputation internationally. It clarified the rights that children have in relation to the processing of their personal data. However, those rights are only helpful if you know what is happening to your personal data and if, when you find out, you can exercise your rights in relation to that processing.
As I have said, what the Bill does—again, perhaps inadvertently—is undermine in a whole host of ways your ability to know what is happening with your personal data and to do something about it when you find out that things have gone wrong. It seems to me that, on the back of a notable success in the AADC, this Bill now moves us in rather a different direction in terms of the protection of personal data.
Looking even longer term, there could be some more nuanced changes if and when the AADC comes to be amended or redrafted, because of the role of the ICO, the factors that it has to take into account and the questions about its independence, which again you have already heard about. So you could, in the long term, see a new version of the AADC that is potentially more business-friendly because of this Bill.
Q
Alex Lawrence-Archer: There are a bunch of different ways in which companies will take advantage of the new grey areas that the Bill opens up to carry out processing with less transparency and less respect for the rights of the people whose data they are processing. If we take just the definition of research, for example, it will be much easier for a large platform that already has lots of personal data to carry out research. The GDPR already provides for a lot of exemptions when you are carrying out research; the Bill dramatically expands the definition. If you are a Google or a YouTube, then yes, you are much freer to carry out processing that you consider to be research without necessarily being transparent about it to the users affected—those whose data it concerns.
Q
Alex Lawrence-Archer: We need to distinguish between two things: one is the introduction of some examples of what may be legitimate interests, which is not a particular concern because they replicate what is already in a recital; the other, of much greater concern, is the introduction of recognised legitimate interests. That is quite a radical departure from legitimate interests under the current regime. The Bill may well mislead people, because it uses the language of legitimate interests, but the new concept works in a very different way.
If you have a legitimate interest under the current regime, you must balance your interests against those of data subjects, and that is not something that is required if you can rely on a recognised legitimate interest under the new regime. The recognised legitimate interests are very broad—prevention of crime, for example, does not mean that that has to be done by the police. That is about opening up such processing for any kind of controller, which could be your neighbour or local corner shop, who can rely on that recognised legitimate interest with no requirement to consider the data subject’s interest at all. That is a radical departure, because the concept of balancing the interests of the data subject and of the controller is absolutely fundamental to our current regime.
Q
Alex Lawrence-Archer: I do not want to overstate the case. You must be able to demonstrate that the processing is necessary for a recognised legitimate interest; it has got to make sense—but you do not have to consider anyone else’s interests.
For example, in some recent cases, neighbours were operating CCTV that captured lots of the personal data of their neighbours. An important argument to show that that was unlawful was that yes, the processing was necessary for the detection of crime—that is what the CCTV was for—but the interests of the neighbours, views of whose gardens and front windows were being captured, overrode the legitimate interests of the controller. That is how it works under the current regime. Under the new regime, you would not have to consider the interests of the neighbours in the use of that CCTV system. You would be able to rely on the recognised legitimate interest.
Q
Alex Lawrence-Archer: Yes.
Q
Alex Lawrence-Archer: I think the Bill is quite big tech-friendly, and the way that it deals with research illustrates that well. One of the objectives of the Bill is obviously to boost the use of personal data for academic research, which is a really laudable objective. However, the main change—in fact, the only change I can think of off the top of my head—that it makes is to broaden the definition of research. That helps people who already have lots of personal data they might do research with; it does not help you if you do not have personal data. That is one of the major barriers for academics at the moment: they cannot get access to the data they need.
The Bill does nothing to incentivise or compel data controllers such as online platforms to actually share data and get it moving around the system for the purposes of academic research. That is in stark contrast to the approach being taken elsewhere: the EU is starting to grapple with the issue, in one particular domain of research, through article 40 of the Digital Services Act. There is a sense that we are falling behind a little on that key barrier to academic research with personal data.
Q
Alex Lawrence-Archer: I certainly recognise that the requirements of the GDPR place compliance burdens on businesses of all sizes. I am sceptical that the right balance is being struck between ameliorating those burdens and the costs and challenges that ordinary people will face in knowing how they are being profiled and tracked by companies, and in resolving things when they have gone wrong. I am sceptical as well that there will be major benefits to the many businesses that will continue to need to do business in Europe; they will need either dual compliance or simply to continue to comply with the EU GDPR. You can see the Bill benefiting the largest companies, which can start to segment their users. We have already seen that with Meta, which moved its users on to US controllership, for example. I would see it as more beneficial to those large companies, which can navigate that, than to, say, SMEs.
Mr Lawrence-Archer, thank you very much for your time this afternoon.
That brings us to the end of our 11th panel. As an impartial participant in these proceedings—we have had over four and a half hours of evidence from 23 witnesses—I would say it has been an absolute masterclass in all the most topical issues in data protection and digital information. Members might not realise it, but that is what we have had today.
Ordered, That further consideration be now adjourned. —(Steve Double.)
(1 year, 6 months ago)
Public Bill Committees
I have a few preliminary announcements that Mr Speaker would like me to make. Hansard colleagues would be grateful if Members emailed their speaking notes to hansardnotes@parliament.uk. Please switch electronic devices to silent mode. Tea and coffee are not allowed during sittings.
The selection list for today’s sitting, which is available in the room, shows how the selected amendments have been grouped for debate. Grouped amendments are generally on the same or a similar issue. Please note that decisions on amendments will take place not in the order in which they are debated, but in the order in which they appear on the amendment paper. The selection and grouping list shows the order of debates. Decisions on each amendment will be taken when we come to the clause to which the amendment relates.
The Member who has put their name to the lead amendment in a group will be called first. Other Members will then be free to catch my eye to speak on all or any of the amendments within that group. A Member may speak more than once in a single debate. At the end of a debate on a group of amendments, I shall again call the Member who moved the lead amendment. Before they sit down, they will need to indicate to me whether they wish to withdraw the amendment or to seek a decision. If any Member wishes to press any other amendment in a group to a vote, they will need to let me know.
Clause 1
Information relating to an identifiable living individual
Question proposed, That the clause stand part of the Bill.
It is a pleasure to serve under your chairmanship, Mr Hollobone. May I thank all hon. Members for volunteering to serve on the Committee? When I spoke on Second Reading, I expressed my enthusiastic support for the Bill—just as well, really. I did not necessarily expect to be leading on it in Committee, but I believe it is a very important Bill. It is complex and will require quite a lot of scrutiny, but it will create a framework of real benefit to the UK, by facilitating the exchange of data and allowing us to take the maximum advantage of emerging technologies. I look forward to our debates over the next few days.
Clause 1 will create a test in legislation to help organisations to understand whether the data that they are processing is personal or anonymous. This is important, because personal data is subject to data protection rules but anonymous data is not. If organisations can be confident that the data they are processing is anonymous, they will be able to use it for important activities such as research and product development without concern about the potential impact on individuals’ personal data.
The new test will require data controllers considering whether data is personal or anonymous to consider two scenarios. The first is where a living individual can be identified by somebody within the data controller or processor’s own organisation using reasonable means at any point at which the data is being processed, from the initial point of collection, through its use and storage, to its eventual deletion or onward transmission. The second scenario is where the data controller or processor knows or should reasonably know that somebody outside the organisation is likely to obtain the information and to be able to re-identify individuals from it using reasonable means. That could be a research partner or a business client with whom the data controller intends to share the data, or an outside organisation that obtains the data as a result of the data controller not putting adequate security measures in place.
What would be considered “reasonable means” in any given case takes into account, among other things, the time, effort and cost of identifying the individual, as well as the technology available during the time the processing occurs. We hope that the clarity the test provides will give organisations greater confidence about using anonymous data for a range of purposes, from marketing to medical research. I commend the clause to the Committee.
It is a pleasure to serve under your chairship, Mr Hollobone. I echo the Minister’s thanks to everyone serving on the Bill Committee; it is indeed a privilege to be here representing His Majesty’s loyal Opposition. I look forward to doing our constitutional duty as we scrutinise the Bill today and in the coming sittings.
The definition of personal data is critical, not only to this entire piece of legislation, but to the data protection regime more widely. That is because the definition of what counts as personal data sets the parameters on who will benefit from protections and safeguards set out by the legislation, and, looking at it from the other side, the various protections will not apply when data is not classed as personal. It is therefore important that the definition should be clear for both controllers and data subjects, so that everyone understands where regulations and, by extension, rights do and do not apply.
The Bill defines personal data as that where a data subject can be identified by a controller or processor, or anyone likely to obtain the information,
“by reasonable means at the time of processing”.
According to the Bill, “reasonable means” take into account the time, effort, costs, technology and resources available to the person. The addition of “reasonable” to the definition has caused major concern among civil society groups, which are worried that it will introduce an element of subjectivity from the perspective of the controller when determining whether data is personal or not. Indeed, although recital 26 of the General Data Protection Regulation also refers to reasonable means—making this, in some ways, more of a formal change than a practical one—there must still be clear parameters on how controllers or processors are to make that judgment. Without those, there may be a danger of controllers and processors avoiding the requirement to comply with rules around personal data by simply claiming they do not have the means to identify living individuals within their resources.
Has the Department undertaken an impact assessment to determine whether the definition could, first, increase subjectivity in what counts as personal data, or secondly, reduce the amount of data classified as personal data? If an assessment identifies such a risk, what steps will the Department take to mitigate that and ensure that citizens are able to exercise their rights as they can under the current definition?
Other stakeholders have raised concerns that the phrase
“at the time of the processing”
in the definition might imply that there is no continuous obligation to consider whether data is personal. Indeed, under the current definition, where personal data is
“any information that relates to an identified or identifiable living individual”,
there is an implied obligation to consider whether an individual is identifiable on an ongoing basis. Rather than assessing the identifiability of a dataset at a fixed point, the controller or processor must keep the categorisation of data that it holds under careful review, taking into account technological developments, such as sophisticated new artificial intelligence or cross-referencing tools. Inserting the phrase
“at the time of the processing”
into this definition has prompted the likes of Which? to express concern that some processors may feel that they are no longer bound by this continuous obligation. That would be particularly worrying given the potential subjectivity of the new definition. If whether an individual is identifiable is based on “reasonable means”, including one’s resources and technology, it is perfectly feasible that, with a change of resources or technology, it could become reasonable to identify a person when once it was not.
My hon. Friend is making an excellent speech. Does she agree that the absence of regard for the rate of technological change, particularly the rise of artificial intelligence—datasets are now being processed at phenomenal speeds—is potentially negligent on the part of the Government?
My hon. Friend makes an important point, which I will come to later.
In these circumstances, it is crucial that if a person is identifiable through data at any time in the future, the data is legally treated as personal so that the relevant safeguards and rights that GDPR was designed to ensure still apply.
When arguing for increased Secretary of State powers across the Bill, Ministers have frequently cited the need to future-proof the legislation. Given that, we must also consider the need to future-proof the definition of data so that technological advances do not render it useless. Does the new definition involve a continuous obligation to assess whether data is personal? Will guidance be offered to inform both controllers and data subjects on the application of this definition, so that both sides can be clear on how it will work in practice? As 5Rights has pointed out, that could avoid clogging up the regulator’s time with claims about what counts as personal data in many individual cases.
Finally, when determining whether data is personal, it is also vital that controllers take into account how a determined stalker or malicious actor might find and use their data. It is therefore good to see the change made since the first iteration of the Data Protection and Digital Information Bill, to clarify that
“obtaining the information as a result of the processing”
also includes information obtained as a result of inaction by a controller or processor—for example, as the result of a failure to put in place appropriate measures to prevent or reduce the risk of hacking.
Overall, it is important that we give both controllers and data subjects clarity about which data is covered by which protections, and when. I look forward to hearing from the Minister about the concerns that have been raised, which could affect the definition’s ability to allow for that clarity.
I agree absolutely with the hon. Lady that the definition of personal data is central to the regime that we are putting in place. She is absolutely right that we need to be very clear and to provide organisations with clarity about what is within the definition of personal data and what is rightly considered to be anonymous. She asks whether the provision will lead to a reduction in the current level of protection. We do not believe that it will.
Clause 1 builds on the strong foundations used in GDPR recital 26 to clarify when data can be categorised as truly anonymous without creating undue risks. The aim of the provision in the Bill is to clarify when information should be considered to be personal data by including a test for identifiability in the legislation. That improved clarity will help organisations to determine when data can be considered truly anonymous and therefore pose almost no risk to the data subject.
The hon. Lady asked whether
“at the time of the processing”
extends into the future, and the answer is yes. The definition of data processing in the legislation is very broad and includes a lot of processing activities other than just the collection of data, such as alteration, retrieval, storage and disclosure by transmission, to name just a few. The phrase
“at the time of the processing”
could therefore cover a long period, depending on the nature and purpose of the processing. The test would need to be applied afresh for each new act of processing. That means that if, at any point in the life cycle of the processing, the data could be re-identified by someone using reasonable means, it could not then legally be considered anonymous. That includes where the data is transferred abroad to other regimes.
The clause makes it clear that a controller will have to consider the likelihood of re-identification at all stages of the processing activity. If a data controller held a dataset for several years, they would need to be mindful of the technologies available during that time that might be used to re-identify it. As the hon. Lady said, technology is advancing very fast and could well change over time from the point at which the data is first collected.
I appreciate the Minister’s clarification. He has just said that the test of identification would apply when sharing the data with another authority. However, once that has been done, the test no longer applies. Does he accept that it is possible for data to be shared that could not by this test reasonably be identified but that, over time, in a different authority, could reasonably be identified, without the data subject having any redress?
If data is shared and then held by a new controller, it will still be subject to the same protections even though it has been transferred from the original controller. It is important that protection can continue to be applied no matter how technology evolves over the course of time; the data will still be subject to the same protection and, of course, that will still be enforceable through the Information Commissioner.
Would it be subject to the same protection if it was transferred abroad?
Again, yes, it will. It will be transferred abroad only if we are satisfied that the recipient will impose the same level of protection that we regard as necessary in this country.
Question put and agreed to.
Clause 1 accordingly ordered to stand part of the Bill.
Clause 2
Meaning of research and statistical purposes
I beg to move amendment 66, in clause 2, page 4, line 8, at end insert—
“(c) do not include processing of personal data relating to children for research carried out as a commercial activity.”
This amendment would exempt children’s data from being used for commercial purposes under the definition of scientific purposes in this clause.
With this it will be convenient to discuss:
Amendment 65, in clause 2, page 4, line 21, at end insert—
“7. The Commissioner must prepare a code of practice under section 124A of the Data Protection Act 2018 on the interpretation of references in this Regulation to “scientific research”.
8. The code of practice prepared under paragraph 7 must include examples of the kinds of research purposes, fields, controllers, and ethical standards that are to be considered as being scientific, and those that are excluded from being so considered.”
This amendment would require a statutory code of practice from the ICO on how the definition of scientific research in this clause is to be interpreted.
Clause stand part.
Fuelling safe scientific research through data will be vital to support the UK’s ambition to become a science superpower. We understand that, as in many areas of data protection law, a lack of clarity about what counts as processing for scientific purposes causes organisations to take a risk-averse approach to conducting research. An understanding of exactly what is included would therefore give organisations the confidence they need to conduct vital processing that will allow for the scientific discoveries and benefits of the future.
Unfortunately, the clause makes the same mistake as the Bill does in general by focusing on easing regulations on those who hold data, rather than looking at how data can be harnessed for the greater good. It misses the opportunity to unlock the benefits of safely redistributing and sharing data. Indeed, none of the clauses on processing for research purposes makes any attempt to explore options to incentivise controllers to share their data with independent researchers. Similarly, the Bill does not explore how the likes of data trusts or co-operatives, which pool data resources in the interests of a larger group of beneficiaries or organisations, could create a stronger environment for research. Instead, it leaves those who already collect and hold data to benefit from the regime by processing for their own research purposes, while those who might hope to collaborate or to use alternative datasets are no better off.
By failing to think about the safe sharing of data to fuel scientific research, the Government limit the progress the UK could make as a powerhouse of science innovation. The Bill leaves only those organisations with large amounts of data able to contribute to such progress, entrenching existing power structures and neglecting the talent held in the smaller independent organisations that would otherwise be able to conduct research for the public good.
Turning to amendment 65, it has always been written into the GDPR, in recital 159, that processing for scientific purposes should be interpreted broadly. It is therefore understandable why Ministers provided a broad definition in the Bill that allows for those conducting genuine scientific research to have absolute confidence that their processing falls under this umbrella, preventing a risk-averse environment. However, stakeholders, including Reset.tech and the Ada Lovelace Institute, have expressed worries that clause 2 goes a little too far, essentially providing a blank cheque for private companies to self-identify as conducting scientific research as a guise for processing personal information for any purpose they choose.
All that must be understood in combination with clause 9, which gives organisations an exemption from purpose limitation, allowing them to reuse data as long as it is for scientific purposes, as defined in clause 2. Indeed, though the Bill contains a few clarifications of what the definition in clause 2 includes, such as publicly and privately funded processing, commercial or non-commercial processing and processing for the likes of technological development, fundamental research, or applied research, I am keen to hear from the Minister about what specific purposes would actually be ruled out under the letter of the current definition. For example, as the Ada Lovelace Institute asked, would pseudoscientific applications, such as polygraphy or experimental AI claiming to predict an individual’s religion, politics or sexuality, be categorically ruled out under the current definition?
Though it may not be the intention in the clause to enable malicious or pseudoscientific processing under the definition of science, we must ensure that the definition is not open to exploitation, or so broad that any controller could reasonably identify their processing as falling under it. Regulator guidance would be in a prime position to do that. By providing context as to what must be considered for something to be reasonably classified as scientific—for example, the purpose of the research, the field of research, the type of controller carrying it out, or the methodological and ethical standards used—controllers using the definition legitimately will feel even more assured, and malicious processing will be explicitly excluded from the application of the definition. Amendment 65 would do nothing to stop genuinely scientific research from benefiting from the changes in this Bill and would provide further clarity around how the definition can be legitimately relied upon.
I wish to pose a couple of questions, after two thoughtful and well-presented amendments from those on the Opposition Front Bench. With regard to children and the use of apps such as TikTok, what assurance will the Government seek to ensure that companies that process and store data abroad are abiding by the principles of our domestic legislation? I mention TikTok directly because it stores data from UK users, including children, in Singapore, and it has made clear in evidence to the Joint Committee on the Online Safety Bill that that data is accessed by engineers in China who are working on it.
We all know that when data is taken from a store and used for product development, it can be returned in its original state but a huge amount of information is gathered and inferred from it that is then in the hands of engineers and product developers working in countries such as China and under very different jurisdictions. I am interested to know what approach we would take to companies that store data in a country where we feel we have a data equivalence regime but then process the data from a third location where we do not have such a data agreement.
I welcome the recognition of the importance of allowing genuine research and the benefits that can flow from it. Such research may well be dependent on using data and the clause is intended to provide clarity as to exactly how that can be done and in what circumstances.
I will address the amendments immediately. I am grateful to the hon. Member for Barnsley East for setting out her arguments and we understand her concerns. However, I think that the amendments go beyond what the clause proposes and, in addition, I do not think that there is a foundation for those concerns. As we have set out, clause 2 inserts in legislation a definition for processing for scientific research, historical research and statistical purposes. The definition of scientific research purposes is set out as
“any research that can be reasonably described as scientific”
and I am not sure that some of the examples that the hon. Lady gave would meet that definition.
The definitions inserted by the clause are based on the wording in the recitals to the UK GDPR. We are not changing the scope of these definitions, only their status in the legislation. They will already be very familiar to people using them, but setting them out in the Bill will provide more clarity and legal certainty. We have maintained a broad scope as to what is allowed to be included in scientific research, with the view that the regulator can add more nuance and context through guidance, as is currently the case. The power to require codes of practice provides a route for the Secretary of State to require the Information Commissioner to prepare any code of practice that gives guidance on good practice in processing personal data.
There will be situations where non-statutory guidance, which can be produced without being requested under regulations made by the Secretary of State, may be more appropriate than a statutory code of practice. Examples of the types of activity that are considered scientific research, and the indicative criteria that a researcher should demonstrate, are best placed in non-statutory guidance produced by the Information Commissioner’s Office. That will give flexibility to amend and change the examples when necessary, so I believe that this approach does not change the current position. However, putting the definition in the legislation, rather than in the recitals, will impose stronger safeguards and make things clearer. Once the Bill has come into effect, the Government will continue to work with the ICO to update its already detailed and helpful guidance on the definition of scientific research as necessary.
Amendment 66 would prohibit the use of children’s data for commercial purposes under the definition of scientific research. The definition inserted by clause 2 includes the clarification that processing for scientific research carried out as a commercial activity can be considered processing for scientific research purposes. Parts of the research community asked for that clarification in response to our consultation. It reflects the existing scope, as is already clear from the ICO’s guidance, and we have seen that research by commercial bodies can have immense societal value. For instance, research into vaccines and life-saving treatments is clearly in the public interest. I entirely understand the hon. Lady’s concern for children’s privacy, but we think that her amendment could obstruct important research by commercial organisations, such as research into children’s diseases. I think that the Information Commissioner would make it clear as to whether or not the kind of example that the hon. Lady gave would fall within the definition of research for scientific purposes.
I also entirely understand the concern expressed by my hon. Friend the Member for Folkestone and Hythe. I suspect that the question about the sharing of data internationally, particularly, perhaps, by TikTok, may recur during the course of our debates. As he knows, we would share data internationally only if we were confident that it would still be protected in the same way that it is here, which would include considering the possibility of whether or not it could then be passed on to a third country, such as China.
I hope that I can reassure the hon. Lady by emphasising that the safeguards in clause 22 that researchers must comply with to protect individuals relate to all data used for these purposes, including children’s data, alongside the protections afforded to children under the UK GDPR. For those reasons, I hope that she will be willing to withdraw her amendment.
I am disappointed that the Minister does not accept amendment 66. Let me make a couple of brief points about amendment 65. The Minister said that he was not sure whether some of the examples I gave fitted under the definition, and that is what the amendment speaks to. I asked what specific purposes would be ruled out under the letter of the current definition, and that is still not clear, so I will press the amendment to a vote.
Question put, That the amendment be made.
The clause clarifies how the conditions for consent will be met in certain circumstances when processing for scientific research purposes. It clarifies an existing concept of “broad consent” that is currently found in the recitals. The measure will enable consent to be obtained for an area of scientific research when the researcher cannot fully identify the purposes for which they are collecting the data.
Consent under UK GDPR must be for a specific purpose, but in scientific research the precise purpose may not be fully known when the data is collected. For example, the initial aim may be the study of cancer, and then later becomes the study of a particular cancer type. Currently, the UK GDPR recitals clarify that consent may be given for an area of scientific research, but as the recitals are only an interpretative aid that may not give scientists the certainty that they need. The clause will therefore add the ability to give broad consent for scientific research into the operative text of the UK GDPR, giving scientists greater certainty and confidence. The clause contains a number of safeguards to protect against misuse. That includes the requirement that seeking consent is consistent with ethical standards that are generally recognised and relevant to that area of research.
With regard to clause 3, I refer Members to my remarks on clause 2. It is sensible to clarify how controllers and processors conducting scientific research can gain consent where it is not possible to fully identify the full set of uses for that data when it is collected. However, what counts as scientific, and therefore what is covered by the clause, must be properly understood by both data subjects and controllers through proper guidance issued by the ICO.
Clause 4 is largely technical and inserts the recognised definition of consent into part 3 of the Data Protection Act 2018, for use when it is inappropriate to use one of the law enforcement purposes. I will talk about law enforcement processing in more detail when we consider clauses 16, 24 and 26, but I have no problem with the definition in clause 4 and am happy to accept it.
I am grateful to the hon. Lady for her support. I agree with her on the importance of ensuring that the definition of scientific research is clear. That is something on which I have no doubt the ICO will also issue guidance.
Question put and agreed to.
Clause 3 accordingly ordered to stand part of the Bill.
Clause 4 ordered to stand part of the Bill.
Clause 5
Lawfulness of processing
I beg to move amendment 68, in clause 5, page 6, line 37, at end insert—
“7A. The Secretary of State may not make regulations under paragraph 6 unless—
(a) following consultation with such persons as the Secretary of State considers appropriate, the Secretary of State has published an assessment of the impact of the change to be made by the regulations on the rights and freedoms of data and decision subjects (with particular reference to children),
(b) the Commissioner has reviewed the Secretary of State’s statement and published a statement of the Commissioner’s views on whether the change should be made, with reasons, and
(c) the Secretary of State has considered whether to proceed with the change in the light of the Commissioner’s statement.”
This amendment would make the Secretary of State’s ability to amend the conditions in Annex 1 which define “legitimate interests” subject to a requirement for consultation with interested parties and with the Information Commissioner, who would be required to publish their views on any proposed change.
With this it will be convenient to discuss the following:
Amendment 67, in clause 5, page 7, line 18, at end insert—
“11. Processing may not be carried out in reliance on paragraph 1(ea) unless the controller has published a statement of—
(a) which of the conditions in Annex 1 has been met which makes the processing necessary,
(b) what processing will be carried out in reliance on that condition, or those conditions, and
(c) why that processing is proportionate to and necessary for the purpose or purposes indicated in the condition or conditions.”
This amendment would require controllers to document and publish (e.g. in a privacy notice) a short statement on their reliance on a “recognised legitimate interest” for processing personal data.
Clause stand part.
At present, the lawful bases for processing are set out in article 6 of the UK GDPR. At least one of them must apply whenever someone processes personal data. They are consent, contract, legal obligation, vital interests, public task and legitimate interests—the last applying where data is being used in ways that we would reasonably expect, where there is minimal privacy impact, or where there is a compelling justification for the processing. Of the existing lawful bases, consent is by far the most relied upon, as it is the clearest. There have therefore been calls for the other lawful bases to be made clearer and easier to use. It is welcome to see some examples of how organisations might rely on the legitimate interests lawful ground brought on to the statute book.
At the moment, in order to qualify for using legitimate interests as grounds for lawful processing, a controller must also complete a balancing test. The balancing test is an important safeguard. As per the ICO, it requires controllers to consider the interests and fundamental rights and freedoms of the individual, and whether they override the legitimate interests that the controller has identified. That means at a minimum considering the nature of the personal data being processed, the reasonable expectations of the individual, the likely impact of processing on the individual, and whether any safeguards can be put in place to mitigate any negative impacts.
As techUK mentioned, the introduction of a list of legitimate interests that no longer require that test is something many have long called for. When conducting processing relating to an emergency, for example, the outcome of a balancing test often very obviously weighs in one direction, making the decision straightforward and the test itself an administrative task that may slow processing down. It makes sense in such instances that a considered exemption might apply.
However, given the reduction in protection and control for consumers when removing a balancing test, it is vital that a list of exemptions is limited and exhaustive, and that every item on such a list is well consulted on. It is also vital that the new lawful basis cannot be relied upon in bad faith or exploited by those who simply want to process without the burden, for reasons outside of those listed in annex 1. The Bill as it currently stands does not do enough to ensure either of those things, particularly given the Secretary of State’s ability to add to the list on a whim.
I turn to amendment 67. Although it is likely not the intention for the clause to be open to exploitation, Reset.tech, among many others, has shared concerns that controllers may be able to abuse the new lawful basis of “recognised legitimate interests”, stretching the listed items in annex 1 to cover some or all of their processing, and giving themselves flexibility over a wide range of processing without an explicit requirement to consider how that processing affects the rights of data and decision subjects. That is particularly concerning where controllers may be able to conflate different elements of their processing.
Reset.tech and AWO provide a theoretical case study to demonstrate that point. Let us say that there is a gig economy food delivery company that processes a range of data on workers, including minute-by-minute location data. That location data would be used primarily for performance management, but could occasionally be used in more extreme circumstances to detect crime—for example, detecting fraud by workers who are making false claims about how long they waited for an order to be ready for delivery. By exploiting the new recognised legitimate interests basis, the company could conflate its purposes of performance management and detecting crime, and justify the tracking of location data as a whole as being exempt from the balancing test, without having to record or specify exactly which processing is for the detection of crime.
Under the current regime, there remain two tests other than the balancing test that form a complete assessment of legitimate interests and help to prevent conflation of that kind. First, there is the purpose test, which requires the controller to identify which legitimate interest the company is relying upon. Secondly, there is the necessity test, which requires the controller to consider whether the processing that the company intends to conduct is necessary and proportionate to meet its purposes.
In having to conduct those tests, the food delivery company would find it much more difficult to conflate its performance management and crime prevention purposes, as it would have to identify and publicly state exactly which elements of its processing are covered by the legitimate interest purpose of crime prevention. That would make it explicit that any processing the company conducts for the purposes of performance management is not permitted under a recognised legitimate interest, meaning that a lawful basis for that processing would be required separately.
Amendment 67 therefore seeks to ensure that the benefits of the purpose and necessity tests are retained, safeguarding the recognised legitimate interests list from being used to cynically conflate purposes and being exploited more generally. In practice, that would mean that controllers relying on a purpose listed in annex 1 for processing would be required to document and publish a notice that explains exactly which processing the company is conducting under which purpose, and why it is necessary.
It is foundational to the GDPR regime that each act of processing has a purpose, so this requirement should just be formalising and publishing what controllers are already required to consider. The measure that the amendment seeks to introduce should therefore be no extra burden on those already complying in good faith, but should still act as a barrier to those attempting to abuse the new basis.
I turn to amendment 68. As the likes of Which? have argued, any instance of removing the balancing test will inevitably enable controllers to prioritise their interests in processing over the impact on data subjects, resulting in weaker protections for data subjects and weaker consumer control. Which? research, such as that outlined in its report “Control, Alt or Delete? The future of consumer data”, also shows that consumers value control over how their data is collected and used, and that they desire more transparency, rather than less, on how their data is used.
With those two things in mind—the value people place on control of their data and the degradation of that control as a result of removing the balancing test—it is vital that the power to remove the balancing test is used extremely sparingly, on carefully considered, limited purposes only. Even for those purposes already included in annex 1, it is unclear exactly what impact assessment took place to ensure that the dangers that removing the test poses to the rights of citizens did not outweigh the benefits of that removal.
It would therefore be helpful if the Minister could outline the assessment and analysis that took place before deciding the items on the list. Although it is sensible to future-proof the list and amend it as needs require, this does not necessarily mean vesting the power to do so in the Secretary of State’s hands, especially when such a power is open to potential abuse. Indeed, to say that the Secretary of State must have regard to the interests and fundamental rights and freedoms of data subjects and children when making amendments to the list is simply not a robust enough protection for citizens. Our laws should not rely on the good nature of the Secretary of State; they must be comprehensive enough to protect us if Ministers begin to act in bad faith.
Further, secondary legislation simply does not offer the scrutiny that the Government claim it does, because it is rarely voted on. Even when it is, if the Government of the day have a majority, defeating such a vote is incredibly rare. For the method of changing the list to be protected from the whims of a bad faith Secretary of State who simply claims to have had regard to people’s rights, proper consultation should be undertaken by the regulator on any amendments before they are considered for parliamentary approval.
This amendment would move the responsibility for judging the impact of changes away from the Secretary of State and place it with the regulator on a yearly basis, ensuring that amendments proceed only if they are deemed, after consultation, to be in the collective societal interest. That means there will be independent assurance that any amendments are not politically or maliciously motivated. This safeguard should not be of concern to anyone prepared to act in good faith, particularly the current Secretary of State, as it would not prevent the progression in Parliament of any amendments that serve the common good. The amendment represents genuine future-proofing that retains appropriate safeguards, as opposed to what currently looks like little more than an excuse for a sweeping power grab.
I welcome the hon. Lady’s recognition of the value of setting out a list of legitimate interests to provide clarity, but I think she twice referred to the possibility of the Secretary of State adding to it on a whim. I do not think we would recognise that as a possibility. There is an established procedure, which I would like to go through in responding to the hon. Lady’s concerns. As she knows, one of the key principles of our data protection legislation is that any processing of personal data must be lawful. Processing will be lawful where an individual has given his or her consent, or where another specified lawful ground in article 6 of the UK GDPR applies. This includes where the processing is necessary for legitimate interests pursued by the data controller, providing that those interests are not outweighed by an individual’s privacy rights.
Clause 5 addresses the concerns that have been raised by some organisations about the difficulties in relying on the “legitimate interests” lawful ground, which is used mainly by commercial organisations and other non-public bodies. In order to rely on it, the data controller must identify what their interest is, show that the processing is necessary for their purposes and balance their interests against the privacy right of the data subject. If the rights of the data subject outweigh the interests of the organisation, the processing would not be lawful and the controller would need to identify a different lawful ground. Regulatory guidance strongly recommends that controllers document the outcome of their legitimate interests assessments.
As we have heard, and as the hon. Lady recognises, some organisations have struggled with the part of the legitimate interests assessment that requires them to balance their interests against the rights of individuals, and concern about getting the balancing test wrong—and about regulatory action that might follow as a result—can cause risk aversion. In the worst-case scenario, that could lead to crucial information in the interests of an individual or the public—for example, about safeguarding concerns—not being shared by third-sector and private-sector organisations. That is why we are taking steps in clause 5 and schedule 1 to remove the need to do the balancing test in relation to a narrow range of recognised legitimate activities that are carried out by non-public bodies. Those activities include processing that is necessary for the purposes of safeguarding national security or defence; responding to emergencies; preventing crimes such as fraud or money laundering; safeguarding vulnerable individuals; and engaging with the public for the purposes of democratic engagement.
Will my right hon. Friend confirm whether the Information Commissioner’s advice will be published, either by the commissioner, the Minister or Parliament—perhaps through the relevant Select Committee?
I am not sure it would necessarily be published. I want to confirm that, but I am happy to give a clear response to the Committee in due course if my hon. Friend will allow me.
As well as the advice that the Information Commissioner supplies, the proposal is also subject to the affirmative procedure, as the hon. Member for Barnsley East recognised, so Parliament could refuse to approve any additions to the list that do not respect the rights of data subjects. She suggested that it is rare for an affirmative resolution to be rejected by Parliament; nevertheless, it is part of our democratic proceedings, and every member of the Committee considering it will have the opportunity to reach their own view and vote accordingly. I hope that reassures the hon. Lady that there are already adequate safeguards in place in relation to the exercise of powers to add new activities to the list of recognised legitimate interests.
Amendment 67, which the hon. Lady also tabled, would require data controllers to publish a statement if they are relying on the new recognised legitimate interests lawful ground. The statement would have to explain what processing would be carried out in reliance on the new lawful ground and why the processing is proportionate and necessary for the intended purpose. In our view, the amendment would significantly weaken the clause. It would reintroduce something similar to the legitimate interests assessment, which, as we have heard, can unnecessarily delay some very important processing activities. In scenarios involving national security or child protection, for example, the whole point of the clause is to make sure that relevant and necessary personal data can be shared without hesitation to protect vulnerable individuals or society more generally.
I hope the hon. Lady is reassured by my response and agrees to withdraw her amendments. I commend clause 5 to the Committee.
We do not believe that amendment 67 would place an extra burden on those who are already complying in good faith. The idea behind it is that it will be a barrier to those attempting to abuse the new basis.
On amendment 68, we should not have laws that rely on the Secretary of State’s good faith. As the Minister said, it is pretty rare for secondary legislation to be voted on and, even when it is, for the Government to lose, so I do not see that as a barrier. The hon. Member for Folkestone and Hythe highlighted that there are some protections, but we do not believe that those protections go as far as we would like. For that reason, I will press the amendment to a vote.
Question put, That the amendment be made.
I beg to move amendment 30, in schedule 1, page 137, line 28, leave out “fourth day after” and insert
“period of 30 days beginning with the day after”.
Annex 1 to the UK GDPR makes provision about processing for democratic engagement purposes, including certain processing by elected representatives. This amendment increases the period for which former members of the Westminster Parliament and the devolved legislatures continue to be treated as “elected representatives” following an election. See also NC6 and Amendment 31.
With this it will be convenient to discuss the following:
Government amendment 31.
Government new clause 6—Special categories of personal data: elected representatives responding to requests.
That schedule 1 be the First schedule to the Bill.
As the Committee will be aware, data protection legislation prohibits the use of “special category” data—namely, information about a person that is sensitive in nature—unless certain conditions or exemptions apply. One such exemption is where processing is necessary on grounds of substantial public interest.
Schedule 1 to the Data Protection Act 2018 sets out a number of situations where processing would be permitted on grounds of substantial public interest, subject to certain conditions and safeguards. That includes processing by elected representatives who are acting with the authority of their constituents for the purposes of progressing their casework. The current exemption applies to former Members of the Westminster and devolved Parliaments for four days after a general election—for example, if the MP has been defeated or decides to stand down. That permits them to continue to rely on the exemption for a short time after the election to conclude their parliamentary casework or hand it over to the incoming MP. In practice, however, it can take much longer than that to conclude these matters.
New clause 6 will therefore extend what is sometimes known as the four-day rule to 30 days, which will give outgoing MPs and their colleagues in the devolved Parliaments more time to conclude casework. That could include handing over live cases to the new representative, or considering what records should be retained, stored and deleted. When MPs leave office, there is an onus on them to conclude their casework in a timely manner. However, the sheer volume of their caseload, on top of the other work that needs to be done when leaving office, means that four days is just not enough to conclude all relevant business. The new clause will therefore avoid the unwelcome situation where an outgoing MP who is doing his or her best to conclude constituency casework could be acting unlawfully if they continue to process their constituents’ sensitive data after the four-day time limit has elapsed. Extending the time limit to 30 days will provide a pragmatic solution to help outgoing MPs while ensuring the exemptions cannot be relied on for an indefinite period.
Government amendments 30 and 31 will make identical changes to other parts of the Bill that rely on the same definition of “elected representative”. Government amendment 30 will change the definition of “elected representative” when the term appears in schedule 1. As I mentioned when we debated the previous group of amendments, clause 5 and schedule 1 to the Bill create a new lawful ground for processing non-sensitive personal data, where the processing is necessary for a “recognised legitimate interest”. The processing of personal data by elected representatives for the purposes of democratic engagement is listed as such an interest, along with other processing activities of high public importance, such as crime prevention, safeguarding children, protecting national security and responding to emergencies.
Government amendment 31 will make a similar change to the definition of “elected representative” when the term is used in clause 84. Clauses 83 and 84 give the Secretary of State the power to make regulations to exempt elected representatives from some or all of the direct marketing rules in the Privacy and Electronic Communications (EC Directive) Regulations 2003. I have no doubt that we will debate the merits of those clauses in more detail later in Committee, but for now it makes sense to ensure that there is a single definition of “elected representative” wherever it appears in the Bill. I hope the hon. Member for Barnsley East and other colleagues will agree that those are sensible suggestions and will support the amendments.
This set of Government provisions will increase the period for which former MPs and elected representatives in the devolved legislatures can use the democratic engagement purpose for processing. On the face of it, that seems like a sensible provision that allows for a transition period so that data can be deleted, processed or moved on legally and safely after an election, and the Opposition have a huge amount of sympathy for it.
I will briefly put on record a couple of questions and concerns. The likes of the Ada Lovelace Institute have raised concerns about the inclusion of democratic engagement purposes in schedule 1. They are worried, particularly with the Cambridge Analytica scandal still fresh in people’s minds, that allowing politicians and elected parties to process data for fundraising and marketing without a proper balancing test could result in personal data being abused for political gain. The decision to make processing for the purposes of democratic engagement less transparent and to remove the balancing test that measures the impact of that processing on individual rights may indicate that the Government do not share the concern about political processing. Did the Minister’s Department consider the Cambridge Analytica scandal when drawing up the provisions? Further, what safeguards will be in place to ensure that all data processing done under the new democratic engagement purpose is necessary and is not abused to spread misinformation?
I would only say to the hon. Lady that I have no doubt that we will consider those aspects in great detail when we get to the specific proposals in the Bill, and I shall listen with great interest to my hon. Friend the Member for Folkestone and Hythe, who played an extremely important role in uncovering what went on with Cambridge Analytica.
The principle that underpinned what happened in the Cambridge Analytica scandal was the connection of Facebook profiles to the electoral register. If I understand my right hon. Friend the Minister correctly, what he is talking about would not necessarily change that situation. This could be information that the political campaign has gained anyway from a voter profile or from information that already exists in accounts it has access to on platforms such as Facebook; it would simply be attaching that, for the purposes of targeting, to people who voted in an election. The sort of personal data that Members of Parliament hold for the purposes of completing casework would not have been processed in that way. These proposals would not change in any way the ability to safeguard people’s data, and companies such as Cambridge Analytica will still seek other sources of open public data to complete their work.
I think my hon. Friend is right. I have no doubt that we will go into these matters in more detail when we get to those provisions. As the hon. Member for Barnsley East knows, this measure makes a very narrow change to simply extend the existing time limit within which there is protection for elected representatives to conclude casework following a general election. As we will have opportunity in due course to look at the democratic engagement exemption, I hope she will be willing to support these narrow provisions.
I am grateful for the Minister’s reassurance, and we are happy to support them.
I beg to move amendment 69, in clause 6, page 9, leave out lines 7 to 20.
This amendment would remove the ability of the Secretary of State to amend Annex 2, so they could not make changes through secondary legislation to the way purpose limitation operates.
One of the key principles in article 5 of the EU GDPR is purpose limitation. The principle aims to ensure that personal data is collected by controllers only for specified, explicit and legitimate purposes. Generally speaking, it ensures that the data is not further processed in a manner that is incompatible with those purposes. If a controller’s purposes change over time, or they want to use data for a new purpose that they did not originally anticipate, they can go ahead only if the new purpose is compatible with the original purpose, if they get the individual’s specific consent for the new purpose, or if they can point to a clear legal provision requiring or allowing the new processing in the public interest.
Specifying the reasons for obtaining data from the outset helps controllers to be accountable for their processing and helps individuals understand how their data is being used and whether they are happy with that, particularly where they are deciding whether to provide consent. Purpose limitation exists so that it is clear why personal data is being collected and what the intention behind using it is.
In any circumstance where we water down this principle, we reduce transparency, we reduce individuals’ ability to understand how their data will be used and, in doing so, we weaken assurances that people’s data will be used in ways that are fair and lawful. We must therefore think clearly about what is included in clause 6 and the associated annex. Indeed, many stakeholders, from Which? to Defend Digital Me, have expressed concern that what is contained in annex 2 could seriously undermine the principle of purpose limitation.
As Reset.tech illustrates, under the current regime, if data collected for a relatively everyday purpose, such as running a small business, is requested by a second controller for the purpose of investigating crime, the small business would need to assess whether this further processing—that is, disclosing the data—was compatible with its original purpose. In many cases, there will be no link between the original and secondary purposes, and there are potential negative consequences for the data subjects. As such, the further processing would be unlawful, as it would breach the principle of purpose limitation.
However, under the new regime, all it would take for the disclosure to be deemed compatible with the original purpose is the second controller stating that it requires the data for processing in the public interest. In essence, this means that, for every item listed in annex 2, there are more circumstances in which data subjects’ personal information could be used for purposes outside their reasonable expectations. It seems logical, therefore, that whatever is contained in the list should be absolutely necessary for the public good and subject to the highest level of public scrutiny possible.
Instead, the clause gives the Secretary of State new Henry VIII powers to add to the new list of compatible purposes by secondary legislation whenever they wish, with no provisions made for consulting on, scrutinising or assessing the impact of such changes. It is important to remember here that secondary legislation is absolutely not a substitute for parliamentary scrutiny of primary legislation. Delegated legislation, as we have discussed, is rarely voted on, and even when it is, the Government of the day will win such a vote if they have a majority.
If there are other circumstances in which the Government think it should be lawful to carry out further processing beyond the original purpose, those should be in the Bill, rather than being left to Ministers to determine at a later date, avoiding the same level of scrutiny.
The Government’s impact assessment says that clarity on the reuse of data could help to fix the market failure caused by information gaps on how purpose limitation works. Providing such clarity is something we could all get behind. However, by giving the Secretary of State sweeping powers fundamentally to change how purpose limitation operates, the clause goes far beyond increasing clarity.
Improved and updated guidance on how the new rules surrounding reusing data work would be far more fruitful in providing clarity than further deregulation in this instance. If Ministers believe there are things missing from the clause and annex, they should discuss them here and now, rather than opening the back door to making further additions afterwards, and that is what the amendment seeks to ensure.
The clause sets out the conditions under which the reuse of personal data for a new purpose is permitted. As the hon. Lady has said, the clause expands on the purpose limitation principle. That key principle of data protection ensures that an individual’s personal data is reused only in ways they might reasonably expect.
The current provisions in the UK GDPR on personal data reuse are difficult for controllers and individuals to navigate. That has led to uncertainty about when controllers can reuse personal data. The clause addresses the existing uncertainty around reusing personal data by setting out clearly when it is permitted. That includes when personal data is being reused for a very different purpose from that for which it was originally collected—for example, when a company might wish to disclose personal data for crime prevention.
The clause permits reuse of personal data by a controller when the new purpose is “compatible”; when they obtain fresh consent; when there is a research purpose; when the UK GDPR is being complied with, such as for anonymisation or pseudonymisation purposes; when there is an objective in the public interest authorised by law; or when certain specified objectives in the public interest set out in a limited list in schedule 2 are met. I will speak more about that when we come to the amendment and the debate on schedule 2.
The clause contains a power to add or amend conditions or remove conditions added by regulations from that list to ensure it can be kept up to date with any future developments in how personal data should be reused in the public interest. It also sets out restrictions on reusing personal data that the controller originally collected on the basis of consent.
The Government want to ensure that consent is respected to uphold transparency and maintain high data protection standards. If a person gives consent for their data to be processed for a specific purpose, that purpose should be changed without their consent only in limited situations, such as for certain public interest purposes, if it would be unreasonable to seek fresh consent. That acts as a safeguard to ensure that organisations address the possibility of seeking fresh consent before relying on any exemptions.
The restrictions around consent relate to personal data collected under paragraph 1(a) of article 6 of the UK GDPR, which came into force in May 2018. Therefore, they do not apply to personal data processed on the basis of consent prior to May 2018, when different requirements applied. By simplifying the rules on further processing, the clause will give controllers legal certainty on when they can reuse personal data and give individuals greater transparency. I support the clause standing part of the Bill.
Let me turn to amendment 69, which proposes to remove the power set out in the clause to amend the annex in schedule 2. As I have already said, schedule 2 will insert a new annex in the UK GDPR, which sets out certain specific public interest circumstances where personal data reuse is permitted. The list is strictly limited and exhaustive, so a power is needed to ensure that it is kept up to date with any future developments in how personal data is reused for important public interest purposes. That builds on an existing power in schedule 2 to the Data Protection Act 2018, where there is already the ability to make exceptions to the purpose limitation principle via secondary legislation.
The power in the clause also provides the possibility of narrowing a listed objective if there is evidence of any of the routes not being used appropriately. That includes limiting it by reference to the lawful ground of the original processing—for example, to prohibit the reuse of data that was collected on the basis of an individual’s consent.
I would like to reassure the hon. Lady that this power will be used only when necessary and in the public interest. That is why the clause contains a restriction on its use; it may be used only to safeguard an objective listed in article 23 of the UK GDPR. Clause 44 of the Bill also requires that the Secretary of State must consult the commissioner, and any other persons as the Secretary of State considers appropriate, before making any regulations.
On that basis, I hope the hon. Lady will accept that the amendment is unnecessary.
The purpose behind our amendment—this speaks to a number of our amendments—is that we disagree with the amount of power being given to the Secretary of State. For that reason, I would like to press my amendment.
Question put, That the amendment be made.
I beg to move amendment 71, in schedule 2, page 138, line 16, leave out “states” and insert “confirms”.
This amendment would require a person who needs personal data for a purpose described in Article 6(1)(e) (a task carried out in the public interest or in the exercise of official authority vested in the controller) to confirm, and not merely to state, that they need the data for legitimate purposes.
With this it will be convenient to discuss the following:
Amendment 70, in schedule 2, page 139, line 30, at end insert
“levied by a public authority”.
This amendment would clarify that personal data could be processed as a “legitimate interest” under this paragraph only when the processing is carried out for the purposes of the assessment or collection of a tax or duty or an imposition of a similar nature levied by a public authority.
That schedule 2 be the Second schedule to the Bill.
I will begin by addressing amendment 70, which seeks only to make a wording change so that the annex cannot be misinterpreted. Paragraph 10 of annex 2 outlines that further processing is to be treated as compatible with original purposes
“where the processing is carried out for the purposes of the assessment or collection of a tax or duty or an imposition of a similar nature.”
Which? has expressed concerns that that is much too vaguely worded, especially without a definition of “tax” or “duty” for the purposes of that paragraph, leaving the data open to commercial uses beyond the intention. Amendment 70 would close any potential loopholes by linking the condition to meeting a specific statutory obligation to co-operate with a public authority such as His Majesty’s Revenue and Customs.
Moving on, amendment 71 would correct a similar oversight in paragraph 1 of annex 2, which was identified by AWO and Reset.tech. Paragraph 1 aims to ensure that processing is treated as compatible with the original purpose when it is necessary for making a disclosure of personal data to another controller that needs to process that data for a task in the public interest or in the exercise of official authority, and that has requested that data. However, the Bill says that processing is to be treated as compatible with the original purpose where such a request simply “states” that the other person needs the personal data for the purposes of carrying out processing that is a matter of public task. At the very least, those matters should surely have to be true, rather than just stated. Amendment 71 would close that loophole, so that the request must confirm a genuine need for the data to complete a task in the public interest or exercise official authority, rather than simply being a statement of need.
Beyond those amendments, I wish only to reiterate the thoughts that I expressed during the debate on clause 6. Everything contained in the annex provides for further processing that is hidden from data subjects and may not be within their reasonable expectations. The reliance on the new annex should therefore be closely monitored to ensure that it is not being exploited, or we risk compromising the purpose limitation principle altogether. Does the Department plan to monitor how the new exemptions on the reuse of data are being relied on?
As we have already discussed with clause 6, schedule 2 inserts a new annex into the UK GDPR. It sets out certain specific public interest circumstances in which personal data reuse is permitted regardless of the purpose for which the data was originally collected—for example, when the disclosure of personal data is necessary to safeguard vulnerable individuals. Taken together, clause 6 and schedule 2 will give controllers legal certainty on when they can reuse personal data and give individuals greater transparency.
Amendment 70 concerns taxation purposes, which are included in the list in schedule 2. I reassure the hon. Member for Barnsley East that the exemption for taxation is not new: it has been moved from schedule 2 to the Data Protection Act 2018. Indeed, the specific language in question goes back as far as 1998. We are not aware of any problems caused by that language.
The inclusion in the schedule of
“levied by a public authority”
would likely cause problems, since taxes and duties can be imposed only by law. Some must be assessed or charged by public authorities, but many become payable as a result of a person’s transactions or circumstances, without any intervention needed except to enforce collection if unpaid. They are not technically levied by a public authority. That would therefore lead to uncertainty and confusion about whether processing for certain important taxation purposes would be permitted under the provision.
I hope to reassure the hon. Lady by emphasising that taxation is not included in the annex 1 list of legitimate interests. That means that anyone seeking to use the legitimate interest lawful ground for that purpose would need to carry out a balancing-of-interests test, unless they were responding to a request for information from a public authority or other body with public tasks set out in law. For those reasons, I am afraid I am unable to accept the amendment, and I hope the hon. Lady will withdraw it.
Amendment 71 relates to the first paragraph in new annex 2 to the UK GDPR, as inserted by schedule 2. The purpose of that provision is to clarify that non-public bodies can disclose personal data to other bodies in certain situations to help those bodies to deliver public interest tasks in circumstances in which personal data might have been collected for a different purpose. For example, it might be necessary for a commercial organisation to disclose personal data to a regulator conducting an inquiry so that that body can carry out its public functions. The provision is tightly formulated and will permit disclosure from one body to another only if the requesting organisation states that it has a public interest task, that it has an appropriate legal basis for processing the data set out in law, and that the use of the data is necessary to safeguard important public policy or other objectives listed in article 23.
I recognise that the amendment is aimed at ensuring that the requesting organisation has a genuine basis for asking for the data, but suggest that changing one verb in the clause from “state” to “confirm” will not make a significant difference. The key point is that non-public bodies will not be expected to hand over personal data on entirely spurious grounds, because of the safeguards that I described. On that basis, I hope the hon. Lady will withdraw her amendment.
I am reassured by what the Minister said about amendment 70 and am happy not to move it, but I am afraid he has not addressed all my concerns in respect of amendment 71, so I will press it to a vote.
Question put, That the amendment be made.
I beg to move amendment 74, in clause 7, page 10, line 34, at end insert—
“6. Where a controller—
(a) charges a fee for dealing with a request, in accordance with paragraph 2(a), or
(b) refuses to act on a request, in accordance with paragraph 2(b)
the controller must issue a notice to the data subject explaining the reasons why they are refusing to act on the request, or charging a fee for dealing with the request, and informing the subject of their right to make a complaint to the Commissioner and of their ability to seek to enforce this right through a judicial remedy.”
This amendment would oblige controllers to issue a notice to the data subject explaining the reasons why they are not complying with a request, or charging for a request, their right to make a complaint to the ICO, and their ability to seek to enforce this right through a judicial remedy.
With this it will be convenient to discuss the following:
Amendment 73, in clause 7, page 12, line 20, at end insert—
“(1A) When considering the resources available to the recipient for the purposes of subsection (1)(c), no account may be taken of any lack of resources which is due to a failure by the recipient to appoint staff to relevant roles where the recipient has the resources to do so.”
This amendment would make it clear that, when taking into account “resources available to the controller” for deciding whether a subject access request is vexatious or excessive, this cannot include where the organisation has neglected to appoint staff, but has the finances or resources to do so.
Amendment 72, in clause 7, page 12, line 25, at end insert—
“(3) The Commissioner must prepare a code of practice under section 124A on the circumstances in which a request may be deemed vexatious or excessive.
(4) The code of practice prepared under subsection (3) must include examples of requests which may be deemed vexatious or excessive, and of requests which may be troublesome to deal with but which should not be deemed vexatious or excessive.”
This amendment would require the ICO to produce a code of practice on how the terms vexatious and excessive are to be applied, with examples of the kind of requests that may be troublesome to deal with, but are neither vexatious nor excessive.
Clause stand part.
I will speak first to clause 7 and amendment 72. Currently, everyone has the right to ask an organisation whether or not it is using or storing their personal data and to ask for copies of that data. That is called the right of access, and exercising that right is known as making a subject access request. Stakeholders from across the spectrum, including tech companies and civil society organisations, all recognise the value of SARs in helping individuals to understand how and why their data is being used and enabling them to hold controllers to account in processing their data lawfully.
The right of access is key to transparency and often underpins people’s ability to exercise their other rights as data subjects. After all, how is someone to know that their data is being used in an unlawful way, or in a way they would object to, if they are not able to ascertain whether their personal data is being held or processed by any particular organisation? For example, as the TUC highlighted in oral evidence to the Committee, the right of data subjects to make an information access request is a particularly important process for workers and their representatives, as it enables workers to gain access to personal data on them that is held by their employer and aids transparency over how algorithmic management systems operate.
It has pleased many across the board to see the Government roll back on their suggestion of introducing a nominal fee for subject access requests. However, the Bill introduces a new threshold for when controllers are able to charge a reasonable fee, or refuse a subject access request, moving from “manifestly unfounded or excessive” to “vexatious or excessive”. When deciding whether a request is vexatious or excessive, the Bill requires the controller to have regard to the circumstances of the subject access request. That includes, but is not limited to, the nature of the request; the relationship between subject and controller; the resources available to the controller; the extent to which the request repeats a previous request made by the subject; how long ago any previous request was made; and whether the request overlaps with other requests made by the data subject to the controller.
Stakeholders such as the TUC, the Public Law Project and Which? have expressed concerns that, as currently drafted, the terms that make up the new threshold are too subjective and could be open to abuse by controllers who may define any request they do not want to answer as vexatious or excessive. Currently, all there is in the Bill to guide controllers on how to apply the threshold is a non-exhaustive list of considerations; as I raised on Second Reading, if that list is non-exhaustive, what explicit protections will be in place to stop the application of terms such as “vexatious” and “excessive” being stretched and manipulated by controllers who simply do not want to fulfil the requests they do not like?
There are concerns that without further guidance even the considerations listed could be interpreted selfishly by controllers who lack a desire to complete a request. For example, given that many subject access requests come from applicants who are suspicious of how their data is being used, or have cause to believe their data is being misused, there is a high likelihood that the relationship any given applicant has with the controller has previously involved some level of friction and, perhaps, anger. The Bill prompts controllers to consider their relationship with a data subject when determining whether their request is vexatious; what is to stop a controller simply marking any data subject who has shared suspicions as “angry and vexatious”, thereby giving them grounds to refuse a genuine request?
Without clarity on how both the new threshold and the considerations apply, the ability of data subjects to raise a legal complaint about why their request was categorised as vexatious and excessive will be severely impeded. As AWO pointed out in oral evidence, that kind of legal dispute over a subject access request may be only the first stage of court proceedings for an individual, with a further legal case on the contents of the subject access request potentially coming afterwards. There simply should not be such a long timescale and set of legal proceedings in order for a person to exercise their fundamental data rights. Even the Information Commissioner himself, despite saying that he was clear on how the phrases “vexatious” and “excessive” should be applied, mentioned to the Committee that it was right to point out that such phrases were open to numerous interpretations.
The ICO is in a great position to provide clear statutory guidance on the application of the terms, with specific examples of when they do and do not apply, so that only truly bad-natured requests that are designed to exploit the system can be rejected or charged for. Such guidance would provide clarity on the ways in which a request might be considered troublesome but neither vexatious nor excessive. That way, controllers can be sure that they have dismissed, or charged for, only requests that genuinely pass the threshold, and data subjects can be assured that they will still be able to freely access information on how their data is being used, should they genuinely need or want it.
On amendment 73, one consideration that the Bill suggests controllers rely on when deciding whether a request is vexatious or excessive is the “resources available” to them. I assume that consideration is designed to operate in relation to the “excessive” threshold and the ability to charge. For example, when a subject access request would require work far beyond the means of the controller in question, the controller would be able to charge for providing the information needed, to ensure that they do not experience a genuine crisis of resources as a result of the request. However, the Bill does not explicitly express that, meaning the consideration in its vague form could be applied in circumstances beyond that design.
Indeed, if a controller neglected to appoint an appropriate number of staff to the responsibility of responding to subject access requests, despite having the finances and resources to do so, they could manipulate the consideration to say that any request they did not like was excessive, as a result of the limited resources available to respond. As is the case across many parts of the Bill, we cannot have legislation that simply assumes that people will act in good faith; we must instead have legislation that explicitly protects against bad-faith interpretations. The amendment would ensure just that by clarifying that a controller cannot claim that a request is excessive simply because they have neglected to arrange their resources in such a way that makes responding to the request possible.
On amendment 74, as is the case with the definition of personal data in clause 1, where the onus is placed on controllers to decide whether a living individual could reasonably be identified in any dataset, clause 7 again places the power—this time to decide whether a request is vexatious or excessive—in the hands of the controller.
As the ICO notes, transparency around the use of data is fundamentally linked to fairness, and is about being
“clear, open and honest with people from the start about who you are, and how and why you use their personal data”.
If a controller decides, then, that due to a request being vexatious or excessive they cannot provide transparency on how they are processing an individual’s data at that time, the very least they could do, in the interests of upholding fairness, is to provide transparency on their justification for classifying a request in that way. The amendment would allow for just that, by requiring controllers to issue a notice to the data subject explaining the grounds on which their request has been deemed vexatious or excessive and informing them of their rights to make a complaint or seek legal redress.
In oral evidence, the Public Law Project described the Bill’s lack of a requirement for controllers to notify subjects as to why their request has been rejected as a decision that creates an “information asymmetry”. That is particularly concerning given that it is often exactly that kind of information that is needed to access the other rights and safeguards outlined in the Bill and across GDPR. A commitment to transparency, as the amendment would ensure, would not only give data subjects clarity on why their request had been rejected or required payment, but provide accountability for controllers who rely on the clause, thereby deterring them from misusing it to reject any requests that they dislike. For controllers, the workload of issuing such notices should surely be less than that of processing a request that is genuinely vexatious or excessive, ensuring that the requirement does not cancel out the benefits brought to controllers through the clause.
Let me start by recognising the importance of subject access requests. I am aware that some have interpreted the change in the wording for grounds of refusal as a weakening. We do not believe that is the case.
On amendment 72, in our view the new “vexatious or excessive” language in the Bill gives greater clarity than there has previously been. The Government have set out parameters and examples in the Bill that outline how the term “vexatious” should be interpreted within a personal data protection context, to ensure that controllers understand.
Does my right hon. Friend agree that the provisions will be helpful and important for organisations that gather data about public persons, and particularly oligarchs, who are very adept at using subject access requests to bombard and overwhelm a journalist or a small investigatory team that is doing important work looking into their business activities?
I completely agree with my hon. Friend. That is an issue that both he and I regard as very serious, and is perhaps another example of the kind of legal tactic that SLAPPs—strategic lawsuits against public participation—represent, whereby oligarchs can frustrate genuine journalism or investigation. He is absolutely right to emphasise that.
It is important to highlight that controllers can already consider resource when refusing or charging a reasonable fee for a request. The Government do not wish to change that situation. Current ICO guidance sets out that controllers can consider resources as a factor when determining if a request is excessive.
The new parameters are not intended to be reasons for refusal. The Government expect that the new parameters will be considered individually as well as in relation to one another, and a controller should consider which parameters may be relevant when deciding how to respond to a request. For example, when the resource impact of responding would be minimal even if a large amount of information was requested—such as for a large organisation—that should be taken into account. Additionally, the current rights of appeal allow a data subject to contest a refusal and ultimately raise a complaint with the ICO. Those rights will not change with regard to individual rights requests.
Amendment 74 proposes adding more detail on the obligations of a controller who refuses or charges for a request from a data subject. The current legislation sets out that any request from a data subject, including subject access requests, is to be responded to. The Government are retaining that approach and controllers will be expected to demonstrate why the provision applies each time it is relied on. The current ICO guidance sets out those obligations on controllers and the Government do not plan to suggest a move away from that approach.
The clause also states that it is for the controller to show that a request is vexatious or excessive in circumstances where that might be in doubt. Thus, the Government believe that the existing legislation provides the necessary protections. Following the passage of the Bill, the Government will work with the ICO to update guidance on subject access requests, which we believe plays an important role and is the best way to achieve the intended effect of the amendments. For those reasons, I will not accept this group of amendments; I hope that the hon. Member for Barnsley East will be willing to withdraw them.
I turn to clause 7 itself. As I said, the UK’s data protection framework sets out key data subject rights, including the right of access—the right for a person to obtain a copy of their personal data. A subject access request is used when an individual requests their personal data from an organisation. The Government absolutely recognise the importance of the right of access and do not want to restrict that right for reasonable requests.
The existing legislation enables organisations to refuse or charge a reasonable fee for a request when they deem it to be “manifestly unfounded or excessive”. Some organisations, however, struggle to rely on that in cases where it may be appropriate to do so, which as a consequence impacts their ability to respond to reasonable requests.
The clause changes the legislation to allow controllers to refuse or charge a reasonable fee for a request that is “vexatious or excessive”. The clause adds parameters for controllers to consider when relying on the “vexatious or excessive” exemption, such as the nature of the request and the relationship between the data subject and the controller. The clause also includes examples of the types of request that may be vexatious, such as those intended to cause distress, those not made in good faith or those that are an abuse of process.
We believe that the changes will give organisations much-needed clarity over when they can refuse or charge a reasonable fee for a request. That will ensure that controllers can focus on responding to reasonable requests, as well as other important data and organisational needs. I commend the clause to the Committee.
I appreciate that, as the Minister said, the Government do not intend the new terms to be grounds for refusal, but his remarks do not reassure me that that will not be the case. Furthermore, as I said on moving the amendment, stakeholders such as the TUC, the Public Law Project and Which? have all expressed concern that, as drafted, those terms are too subjective. I will press the amendment to a vote.
Question put, That the amendment be made.
Clause 8 makes changes to the time requirements to which an organisation must adhere when responding to a subject access request. Currently, organisations must respond to a subject access request within a set period; in the majority of cases, that is one month from receipt of the request. This clause enables organisations to “stop the clock” on the response time when an organisation is unable to respond without further information or clarification from an individual. For example, when the controller has information on multiple data subjects with the same name, they may require further information to help to differentiate the data subject’s information from others’. Organisations must have a legitimate reason to pause the response time; once confirmation is received from the data subject, the original time obligations resume.
The clause will also enable organisations to extend the period permitted for law enforcement and the intelligence services to respond to complex requests by two further months in certain circumstances. This replicates the existing provisions applicable to processing requests under the UK GDPR. Currently, all subject access requests received under the law enforcement and intelligence services regimes must be actioned within one month, irrespective of the complexity or number of requests received from an individual. Consequently, complex or confusing requests can disproportionately burden public bodies operating under those regimes, creating resource pressures.
Clause 8 will rectify the disparity that currently exists between processing regimes and put law enforcement and intelligence services organisations on an equal footing with UK GDPR organisations. That will also provide a consistent framework for organisations operating under more than one regime at the same time. The clause also brings clarity on how best to respond to a confusing or complex request, ensuring that organisations do not lose time while seeking clarification and can instead focus on responding to a request. On that basis, I urge that clause 8 stand part of the Bill.
I expressed my thoughts on the value and importance of subject access requests when we debated clause 7, and most of the same views remain pertinent here. Clause 8 allows for subject access requests to be extended where the nature of the request is complex, or due to volume. Some civil society groups, including Reset.tech, have expressed concern that that could mean that requests are unduly delayed for months, reflecting concern that they could be disregarded altogether, which was discussed when we debated clause 7. With that in mind, can the Minister tell us what protections will be in place to ensure that data controllers do not abuse the new ability to extend subject access requests, particularly by using the excuse that it is a large amount of data, in order to delay requests that they simply do not wish to respond to?
The clause provides some clarity on clause 7 by demonstrating that just because a request is lengthy or comes in combination with many others, it is not necessarily excessive, as the clause gives controllers the option to extend the timeframe for dealing with requests that are high in volume. Of course, we do not want to unnecessarily delay requests, but allowing controllers to manage their load within a reasonable extended timeframe can act as a safeguard against their automatically relying on the “excessive” threshold. With that in mind, I am happy for the clause to stand part, although I refer hon. Members back to my comments on clause 7.
May I briefly respond to the hon. Lady’s comments? I assure her that controllers will not be able to stop the clock for all subject access requests—only for those where they reasonably require further information to be able to proceed with responding. Once that information has been received from a data subject, the clock resumes and the controller must proceed with responding to the request within the applicable time period, which is usually one month from when the controller receives the request information. A data subject who has provided the requested information would also be able to complain to a controller, and ultimately to the Information Commissioner’s Office, if they feel that their request has not been processed within the appropriate time. I hope the hon. Lady will be assured that there are safeguards to ensure that this power is not abused.
Question put and agreed to.
Clause 8 accordingly ordered to stand part of the Bill.
Clause 9
Information to be provided to data subjects
Question proposed, That the clause stand part of the Bill.
Clause 9 provides researchers, archivists and those processing personal data for statistical purposes with a new exemption from providing certain information to individuals when they are reusing datasets for a different purpose, which will help to ensure that important research can continue unimpeded. The new exemption will apply when the data was collected directly from the individual, and can be used only when providing the additional information would involve a disproportionate effort. There is already an exemption from this requirement where the personal data was collected from a different source.
The clause also adds a non-exhaustive list of examples of factors that may constitute a disproportionate effort. This list is added to both the new exemption in article 13 and the existing exemption found in article 14. Articles 13 and 14 of the UK GDPR set out the information that must be provided to data subjects at the point of data collection: article 13 covers circumstances where data is directly collected from data subjects, and article 14 covers circumstances where personal data is collected indirectly—for example, via another organisation. The information that controllers must provide to individuals includes details such as the identity and contact details of the controller, the purposes of the processing and the lawful basis for processing the data.
Given the long-term nature of research, it is not always possible to meaningfully recontact individuals. Therefore, applying a disproportionate effort exemption addresses the specific problem of researchers wishing to reuse data collected directly from an individual. The exemption will help ensure that important research can continue unimpeded. The clause also makes some minor changes to article 14. Those do not amend the scope of the exemption or affect its operation, but make it easier to understand.
I now turn to clause 10, which introduces an exemption for legally privileged data into the law enforcement regime, mirroring the existing exemptions under the UK GDPR and the intelligence services regime. As a fundamental principle of our legal system, legal professional privilege protects confidential communications between professional legal advisers and their clients. The existing exemption in the UK GDPR restricts an individual’s right to access personal data that is being processed or held by an organisation, and to receive certain information about that processing.
However, in the absence of an explicit exemption, organisations processing data under the law enforcement regime, for a law enforcement purpose rather than under the UK GDPR, must rely on ad hoc restrictions in the Data Protection Act. Those require them to evaluate and justify their use on a case-by-case basis, even where legal professional privilege is clearly applicable. The new exemption will make it simpler for organisations that process data for a law enforcement purpose to exempt legally privileged information, avoiding the need to justify the use of alternative exemptions. It will also clarify when such information can be withheld from the individual.
Hon. Members might wonder why an exemption for legal professional privilege was not included under the law enforcement regime of the Data Protection Act in the first place. The reason is that we faithfully transposed the EU law enforcement directive, which did not contain such an exemption. Following our exit from the EU, we are taking this opportunity to align better the UK GDPR and the law enforcement regime, thereby simplifying the obligations for organisations and clarifying the rules for individuals.
The impact of clause 9 and the concerns around it should primarily be understood in relation to the definition contained in clause 2, so I refer hon. Members to my remarks in the debate on clause 2. I also refer them to my remarks on purpose limitation in clause 6. To reiterate both in combination, I should say that purpose limitation exists so that it is clear why personal data is being collected, and what the intention is behind its use. That means that people’s data should largely not be reused in ways other than those for which it was initially collected, unless a new legal basis is obtained.
It is understandable that, where genuine scientific, historical or statistical research is taking place, and providing the required information to data subjects would involve a disproportionate effort, there may be a need for an exemption allowing data to be reused without informing the subject. However, that must be done only where strictly necessary. We must be clear that, unless there are proper boundaries to the definition of scientific research, this could be interpreted far too loosely.
I am concerned that, without amendment to clause 2, clause 9 could extend the problem of scientific research being used as a guise for using people’s personal data in malicious or pseudoscientific ways. Will the Minister tell us what protections will be in place to ensure that people’s data is not reused on scientific grounds for something that they would otherwise have objected to?
On clause 10, I will speak more broadly on law enforcement processing later in the Bill, but it is good to have clarity on the legal professional privilege exemptions. I have no further comments at this stage.
What we are basically doing is changing the rights of individuals, who would previously have known when their data was used for a purpose other than that for which it was collected. The terms
“scientific or historical research, the purposes of archiving in the public interest or statistical purposes”
are very vague, and, according to the Public Law Project, open to wide interpretation. Scientific research is defined as
“any research that can reasonably be described as scientific, whether publicly or privately funded”.
I ask the Minister: what protections are in place to ensure that private companies are not given, through this clause, a carte blanche to use personal data for the purpose of developing new products, without the need to inform the data subject?
These clauses relate to one of the fundamental purposes of the Bill, which is to facilitate genuine scientific research—obviously, that carries with it huge potential benefits in the areas of tackling disease or other scientific advances. We debated the definition of scientific research earlier in relation to clause 2. We believe that the definition is clear. In this particular case, the use of historical data can be very valuable. It is simply impractical for some organisations to reobtain consent when they may not even know where original data subjects are now located.
Order. I apologise to the Minister. He can resume his remarks at 2 o’clock, when we meet again in this room but, it being 11.25 am, the Committee is now adjourned.
(1 year, 6 months ago)
Public Bill Committees
When the Committee adjourned this morning, I was nearly at my conclusion; I was responding to points made by the hon. Member for Barnsley East and by the hon. Member for Glasgow North West, who has not yet rejoined us. I was saying that the exemption applies where the data originally collected is historic, where re-contacting individuals to obtain consent would require a disproportionate effort, and where that data could be of real value in scientific research. We think that there is a benefit to research and we are satisfied that the protection is there. There was some debate about the definition of scientific research, which we covered earlier; that is a point on which an appeal can be made to the Information Commissioner’s Office. On the basis of what I said earlier, and that assurance, I hope that the Committee will agree to the clause.
Question put and agreed to.
Clause 9 accordingly ordered to stand part of the Bill.
Clause 10 ordered to stand part of the Bill.
Clause 11
Automated decision-making
I beg to move amendment 78, in clause 11, page 18, line 13, after “subject” insert “or decision subject”.
This amendment, together with Amendments 79 to 101, would apply the rights given to data subjects by this clause to decision subjects (see NC12).
With this it will be convenient to discuss the following:
Amendment 79, in clause 11, page 18, line 15, leave out “data”.
See explanatory statement to Amendment 78.
Amendment 80, in clause 11, page 18, line 16, leave out “data”.
See explanatory statement to Amendment 78.
Amendment 81, in clause 11, page 18, line 27, after “subject” insert “or decision subject”.
See explanatory statement to Amendment 78.
Amendment 82, in clause 11, page 18, line 31, after “subject” insert “or decision subject”.
See explanatory statement to Amendment 78.
Amendment 83, in clause 11, page 19, line 4, after “subject” insert “or decision subject”.
See explanatory statement to Amendment 78.
Amendment 84, in clause 11, page 19, line 7, leave out “data”.
See explanatory statement to Amendment 78.
Amendment 85, in clause 11, page 19, line 11, leave out “data”.
See explanatory statement to Amendment 78.
Amendment 86, in clause 11, page 19, line 12, leave out “data”.
See explanatory statement to Amendment 78.
Amendment 87, in clause 11, page 19, line 13, leave out “data”.
See explanatory statement to Amendment 78.
Amendment 88, in clause 11, page 19, line 15, leave out “data”.
See explanatory statement to Amendment 78.
Amendment 89, in clause 11, page 19, line 17, leave out “data”.
See explanatory statement to Amendment 78.
Amendment 90, in clause 11, page 19, line 26, after “subject” insert “or decision subject”.
See explanatory statement to Amendment 78.
Amendment 91, in clause 11, page 20, line 8, after “subject” insert “or decision subject”.
See explanatory statement to Amendment 78.
Amendment 92, in clause 11, page 20, line 10, leave out “data”.
See explanatory statement to Amendment 78.
Amendment 93, in clause 11, page 20, line 12, leave out “data”.
See explanatory statement to Amendment 78.
Amendment 94, in clause 11, page 20, line 23, after “subject” insert “or decision subject”.
See explanatory statement to Amendment 78.
Amendment 95, in clause 11, page 20, line 28, after “subject” insert “or decision subject”.
See explanatory statement to Amendment 78.
Amendment 96, in clause 11, page 20, line 31, leave out “data”.
See explanatory statement to Amendment 78.
Amendment 97, in clause 11, page 20, line 35, leave out “data”.
See explanatory statement to Amendment 78.
Amendment 98, in clause 11, page 20, line 37, leave out “data”.
See explanatory statement to Amendment 78.
Amendment 99, in clause 11, page 20, line 39, leave out “data”.
See explanatory statement to Amendment 78.
Amendment 100, in clause 11, page 21, line 1, leave out “data”.
See explanatory statement to Amendment 78.
Amendment 101, in clause 11, page 21, line 31, after “subject” insert “or decision subject”.
See explanatory statement to Amendment 78.
Amendment 106, in clause 27, page 47, line 27, after “subjects”, insert “decision subjects,”.
This amendment would require the ICO to have regard to decision subjects (see NC12) as well as data subjects as part of its obligations.
Amendment 108, in clause 29, page 53, line 11, at end insert—
“(ba) decision subjects;”.
This amendment, together with Amendments 109 and 110, would require codes of conduct produced by the ICO to have regard to decision subjects (see NC12) as well as data subjects.
Amendment 109, in clause 29, page 53, line 13, at end insert—
“(d) persons who appear to the Commissioner to represent the interests of decision subjects.”.
See explanatory statement to Amendment 108.
Amendment 110, in clause 29, page 53, line 21, after “subjects”, insert “, decision subjects”.
See explanatory statement to Amendment 108.
New clause 12—Decision subjects—
“(1) The UK GDPR is amended as follows.
(2) In Article 4, after paragraph (A1), insert—
‘(A1A) “decision subject” means an identifiable individual who is subject to data-based and automated decision making;’”.
This new clause would provide a definition of “decision subjects”, enabling them to be given rights similar to those given to data subjects (see, for example, Amendment 78).
I am pleased to speak to new clause 12, which would insert a definition of decision subjects, and to amendments 79 to 101, 106 and 108 to 110, which seek to insert rights and considerations for decision subjects that mirror those of data subjects at various points throughout the Bill.
Most of our data protection legislation operates under the assumption that the only people affected by data-based and automated decision making are data subjects. The vast majority of protections available for citizens are therefore tied to being a data subject: an identifiable living person whose data has been used or processed. However, as Dr Jeni Tennison described repeatedly in evidence to the Committee, that assumption is unfortunately flawed. Although data subjects form the majority of those affected by data-based decision making, they are not the only group of people impacted. It is becoming increasingly common across healthcare, employment, education and digital platforms for algorithms created and trained on one set of people to be used to reach conclusions about another, wider set of people. That means that an algorithm can make an automated decision that affects an individual to a legal or similarly significant degree without having used their personal data specifically.
For example, as Connected by Data points out, an automated decision could be made about a neighbourhood area, such as a decision on gritting or a police patrol route, based on personal data about some of the people who live in that neighbourhood, with the outcome impacting even those residents and visitors whose data was not directly used. For those who are affected by the automated decision but are not data subjects, there is currently no protection, recognition or method of redress.
The new clause would therefore define the decision subjects who are impacted by the likes of AI without their data having been used, in the hope that we can give them protections throughout the Bill that are equal to those for data subjects, where appropriate. That is especially important because special category data is subject to stricter safeguards for data subjects but not for decision subjects.
Connected by Data illustrates that point using the following example. Imagine a profiling company that uses special category data about the mental health of some volunteers to construct a model that predicts mental health conditions based on social media feeds, which would not be special category data. From that information, the company could give an estimate of how much time people are likely to take off work. A recruitment agency could then use that model to assess candidates and reject those who are likely to have extended absences. The model would never use any special category data about the candidates directly, but those candidates would have been subject to an automated decision that made assumptions about their own special category data, based on their social media feeds. In that scenario, by virtue of being a decision subject, the individual would not have the right to the same safeguards as those who were data subjects.
Furthermore, there might be scenarios in which someone was subject to an automated decision despite having consciously prevented their personal data from being shared. Connected by Data illustrates that point by suggesting that we consider a person who has set their preferences on their web browser so that it does not retain tracking cookies or share information such as their location when they visit an online service. If the online service has collected data about the purchasing patterns of similarly anonymous users and knows that such a customer is willing to pay more for the service, it may automatically provide a personalised price on that basis. Again, no personal data about the purchaser will have been used in determining the price that they are offered, but they will still be subject to an automated decision based on the data of other people like them.
What those scenarios illustrate is that it is whether an automated decision affects an individual in a legal or similarly significant way that should be central to their rights, rather than whether any personal data is held about them. If the Bill wants to unlock innovation around AI, automated decisions and the creative use of data, it is only fair that that be balanced by ensuring that all those affected by such uses are properly protected should they need to seek redress.
This group of amendments would help our legislative framework to address the impact of AI, rather than just its inputs. The various amendments to clause 11 would extend to decision subjects rights that mirror those given to data subjects regarding automated decision making, such as the right to be informed, the right to safeguards such as contesting a decision and the right to seek human intervention. Likewise, the amendments to clauses 27 and 29 would ensure that the ICO is obliged to have regard to decision subjects both generally and when producing codes of conduct.
Finally, to enact the safeguards to which decision subjects would hopefully be entitled via the amendments to clause 11, the amendment to clause 39 would allow decision subjects to make complaints to data controllers, mirroring the rights available to data subjects. Without defining decision subjects in law, that would not be possible, and members of the general public could be left without the rights that they deserve.
I am very much aware of the concern about automated decision making. The Government share the wish of the hon. Member for Barnsley East for all those who may be affected to be given protection. Where I think we differ is that we do not recognise the distinction that she tries to make between data subjects and decision subjects, which forms the basis of her amendments.
The hon. Lady’s amendments would introduce to the UK GDPR a definition of the term “decision subject”, which would refer to an identifiable individual subject to data-based and automated decision making, to be distinguished from the existing term “data subject”. The intended effect is to extend the requirements associated with provisions related to decisions taken about an individual using personal data to those about whom decisions are taken, even though personal information about them is not held or used to take a decision. It would hence apply to the safeguards available to individuals where significant decisions are taken about them solely through automated means, as amendments 78 to 101 call for, and to the duties of the Information Commissioner to have due regard to decision subjects in addition to data subjects, as part of the obligations imposed under amendment 106.
I suggest to the hon. Lady, however, that the existing reference to data subjects already covers decision subjects, which are, if you like, a sub-group of data subjects. That is because even if an individual’s personal data is not used to inform the decision taken about them, the fact that they are identifiable through the personal data that is held makes them data subjects. The term “data subject” is broad and already captures the decision subjects described in the hon. Lady’s amendment, as the identification of a decision subject would make them a data subject.
I will not, at this point, go on to set out the Government’s wider approach to the use of artificial intelligence, because that is somewhat outside the scope of the Bill and has already been set out in the White Paper, which is currently under consultation. Nevertheless, it is within that framework that we need to address all these issues.
I have been closely following the speeches of the Minister and the hon. Member for Barnsley East. The closest example that I can think of for this scenario is the use of advertising tools such as lookalike audiences on Facebook and customer match on YouTube, where a company holding data about users looks to identify other customers who are the closest possible match. It does not hold any personal data about those people, but the platform forms the intermediary to connect them. Is the Minister saying that in that situation, as far as the Bill is concerned, someone contacted through a lookalike audience has the same rights as someone who is contacted directly by an advertiser that holds their data?
Essentially, if anybody is affected by automated decision making on the basis of the characteristics of another person whose data is held—in other words, if the same data is used to take a decision that affects them, even if it does not personally apply to them—they are indeed within the broader definition of a data subject. With that reassurance, I hope that the hon. Member for Barnsley East will consider withdrawing her amendment.
I appreciate the Minister’s comments, but the point is that the data could be used—I gave the example that it might affect a group of residents who were not identifiable but were still affected by decisions based on that data—so I am not quite sure that I agree with the Minister’s comparison. As the use of automated decision making evolves and expands, it is crucial that even if a person’s data is not being used directly, they are afforded protections and rights if they are subject to the outcome. I would like to press my amendment to a vote.
Question put, That the amendment be made.
I beg to move amendment 77, in clause 11, page 19, line 12, at end insert
“and about the safeguards available to the subject in accordance with this paragraph and any regulations under Article 22D(4);”.
This amendment would require controllers proactively to provide data subjects with information about their rights in relation to automated decision-making.
With this it will be convenient to discuss amendment 120, in clause 11, page 19, line 12, at end insert—
“(aa) require the controller to inform the data subject when a decision described in paragraph 1 has been taken in relation to the data subject;”.
This amendment would require a data controller to inform a data subject whenever a significant decision about that subject based entirely or partly on personal data was taken based solely on automated processing.
New article 22C of the UK GDPR, inserted by clause 11, sets out the safeguards available to those who are subject to automated decision making. One such safeguard is that controllers must provide information to subjects relating to significant decisions taken through solely automated processing. That includes notifying subjects when a decision has been taken or informing them of the logic involved in producing that decision.
That provision is important. After all, how can the subject of an automated decision possibly exercise their other rights surrounding that decision if they do not even know that it has been taken on a solely automated basis? By the same logic, however, the average member of the general public is not likely to be aware of those other rights in the first place, including the rights to express their point of view with respect to automated decisions, to contest them and to seek human intervention.
Amendment 77 therefore recommends that as well as controllers being required to inform subjects about the decision, the same notice should be used as a vehicle to ensure that the subject is aware of the rights and safeguards in place to protect them and offer them redress. It would require no extra administrative effort on the part of controllers, because they will already be informing subjects. A proactive offer of redress may also encourage controllers to have extra regard to the way in which their automated systems are operating, in order to avoid unlawful activity that may cause them to receive a complaint or a request for human intervention.
An imbalance of power between those who conduct automated decisions and those who are subject to them already largely exists. Those who conduct decisions hold the collective power of the data, whereas each individual subject to a decision has only their own personal information; I will address that issue in greater detail in relation to other amendments, but there is no reason why that power imbalance should be exacerbated by hiding an individual’s own rights from them. If the intention of new article 22C is, as stated, to ensure that controllers are required to review and correct decisions that have produced a systematically wrongful outcome, there should be no issue with ensuring that the mechanism is properly communicated to the people it purports to serve. I am pleased to see that the hon. Member for Glasgow North West has tabled a similar amendment.
I rise to speak to my amendment 120. The explanatory notes to the Bill clarify that newly permitted automated decisions will not require the existing legal safeguard of notification, stating only:
“Where appropriate, this may include notifying data subjects after such a decision has been taken”.
Clause 11 would replace article 22 of the GDPR, which regulates AI decision making, with new articles 22A to 22D. According to Connected by Data, it is built on the faulty assumption that the people who are affected by automated decision making are data subjects—identifiable individuals within the data used to make the automated decision. However, now that AI decisions can be based on information about other people, it is becoming increasingly common for algorithms created through training on one set of people to be used to reach conclusions about another set.
A decision can be based on seemingly innocuous information such as someone’s postcode or whether they liked a particular tweet. Where such a decision has an impact on viewing recommendations for an online player, we would probably not be that concerned, but personal data is being used more and more to make decisions that affect whole groups of people rather than identified individuals. We need no reminding of the controversy that ensued when Ofqual used past exam results to grade students during the pandemic.
Another example might be an electricity company getting data from its customers about home energy consumption. Based on that data, it could automatically adjust the time of day at which it offered cheaper tariffs. Everyone who used the electricity company would be affected, whether data about their energy consumption patterns were used to make the decision or not. It is whether an automated decision has a legal or similarly significant effect on an individual that should be relevant to their rights around automated decision making.
Many of the rights and interests of decision subjects are protected through the Equality Act 2010, as the Committee heard in oral evidence last week. What is not covered by other legislation, however, is how data can be used in automated decisions, and the rights of decision subjects to be informed about, to control and to seek redress around automated decisions with a significant effect on them. According to Big Brother Watch:
“This is an unacceptable dilution of a critical safeguard that will not only create uncertainty for organisations seeking to comply, but could lead to vastly expanded ADM operating with unprecedented opacity.”
Amendment 120 would require a data controller to inform a data subject whenever a significant decision about that subject was based solely on automated processing. I am pleased that the hon. Member for Barnsley East has tabled a similar amendment, which I support.
The Government absolutely share hon. Members’ view of the importance of transparency. We agree that individuals who are subject to automated decision making should be made aware of it and should have information about the available safeguards. However, we feel that those requirements are already built into the Bill via article 22C, which will ensure that individuals are provided with information as soon as is practicable after such decisions have been taken. This will need to include relevant information that an individual would require to contest such decisions and seek human review of them.
The reforms that we propose take an outcome-focused approach to ensure that data subjects receive the right information at the right time. The Information Commissioner’s Office will play an important role in elaborating guidance on what that will entail in different circumstances.
If I understood the Minister correctly, he said that decision subjects are a subset of data subjects. Can he envisage any circumstances in which a decision subject is not included within the group “data subjects”?
It is certainly our view that anybody who is affected by an automated decision made on the basis of data held about individuals themselves becomes a data subject, so I think the answer to the hon. Lady’s question is no. As I said, the Information Commissioner’s Office will provide guidance in this area. If such a situation does arise, obviously it will need to be considered. The hon. Members for Barnsley East and for Glasgow North West asked about making information available to all those affected, and about safeguards, which we think are contained within the requirements under article 22C.
Further to the point that was made earlier, let us say that a Facebook user was targeted with an advert that was based on their protected characteristics data—data relevant to their sexual orientation, for example—but that user said that they had never shared that information with the platform. Would they have the right to make a complaint, either to the advertiser or to the platform, for inferring that data about them and making it available to a commercial organisation without their informed consent?
They would obviously have that right, and indeed they would ultimately have the right to appeal to the Information Commissioner if they felt that they had been subjected unfairly to a decision where they had not been properly informed of the fact. On the basis of what I have said, I hope the hon. Member for Barnsley East might withdraw her amendment.
I appreciate the Minister’s comment, but the protection the Government offer does not go as far as we would like. Our amendment speaks to the potential imbalance of power in the use of data, and it would not require any extra administrative effort on the part of controllers. For that reason, I will press it to a vote.
Question put, That the amendment be made.
I will not move it formally, Mr Hollobone, but I may bring it back on Report.
I beg to move amendment 76, in clause 11, page 19, line 34, at end insert—
“5A. The Secretary of State may not make regulations under paragraph 5 unless—
(a) following consultation with such persons as the Secretary of State considers appropriate, the Secretary of State has published an assessment of the impact of the change to be made by the regulations on the rights and freedoms of data and decision subjects (with particular reference to children),
(b) the Commissioner has reviewed the Secretary of State’s statement and published a statement of the Commissioner’s views on whether the change should be made, with reasons, and
(c) the Secretary of State has considered whether to proceed with the change in the light of the Commissioner’s statement.”
This amendment would make the Secretary of State’s ability to amend the safeguards for automated decision-making set out in new Articles 22A to D subject to a requirement for consultation with interested parties and with the Information Commissioner, who would be required to publish their views on any proposed change.
With this it will be convenient to discuss amendment 75, in clause 11, page 19, line 36, at end insert—
“7. The Commissioner must prepare a code of practice under section 124A of the Data Protection Act 2018 on the interpretation of references in this Regulation to “meaningful human involvement” and “similarly significant”.
8. The code of practice prepared under paragraph 7 must include examples of the kinds of processing which do, and which do not, fall within the definitions which use the terms referred to in that paragraph.”
This amendment would require the ICO to produce a code of practice on the interpretation of references to “meaningful human involvement” and “similarly significant” in connection with automated decision-making, with examples of the kinds of processing that would not count as falling within these definitions.
I will begin by discussing amendment 76 in the context of the general principles of this clause. The rise of AI and algorithmic decision making has happened at an unprecedented speed—so much so, in fact, that when the first version of this Bill was published, the likes of ChatGPT had not even been launched. Now we live in a world where the majority of people across the country have been affected by or have used some form of AI-based or automated decision-making system.
When algorithms and automation work well, not only do they reduce administrative burdens, increase efficiency and free up capacity for further innovation and growth; they can also have remarkable outcomes. Indeed, PwC UK suggests that UK GDP could be up to 10.3% higher in 2030 as a result of artificial intelligence. AI is already being used to develop vaccines and medicines, for example, which are saving lives across the country and the entire world. Labour’s belief, outlined in our industrial strategy, is that the UK should be leading the world on efforts to ensure that transformative AI is aligned with the public interest in that way, and that regulations ensure we are well positioned to do that.
Despite the potential of AI to be harnessed for the public good, however, where things go wrong, the harms can be serious. The first way in which automation is prone to go wrong is by producing discriminatory outcomes. An algorithm, although intelligent in itself, is only ever as fair as the information and the people used to train it. That means that where biases exist in our world, they can become entrenched in our automated systems too. In 2020, thousands of students in England and Wales received A-level exam results where, due to the pandemic, their grades were determined by an algorithm rather than by sitting an exam. At the hands of the automated system, almost 40% of students received grades lower than they had anticipated, with pupils from certain backgrounds and areas such as those that I represent disproportionately impacted by the lower marks. Within days of the results being published, there was widespread public outcry about the distress caused, as well as threats of mass protests and legal action. Similarly, Amazon was reported to have used an AI tool that systematically penalised women in job application processes. The tool had been trained on a decade’s worth of CVs, predominantly submitted by men. As such examples show, AI on its own can produce discriminatory outcomes. Our regulation must therefore recognise that and seek to protect against it.
The second major way in which automated decision making tends to go wrong, or can be abused, is when it makes legal or critical decisions about our lives based on mismanaged, abused or faulty systems. In the most extreme cases, automated systems can even contribute to deciding whether someone’s employment will be terminated, with grave consequences when that goes wrong. As mentioned in the oral evidence sessions, for example, last month the courts upheld the finding that three UK-based Uber drivers were robotically fired without redress, having been accused of fraudulent activity on the basis of an automated detection system. The court found that human involvement in the firing process was
“not much more than a purely symbolic act”,
and that implementing such a decision without a mechanism for appeal was unjust. Where livelihoods are at risk, data regulation must ensure that proper safeguards are in place to protect against mismanaged and faulty automated systems.
Serious harms sometimes occur under the existing system, but there are laws under the GDPR that try to protect us against discriminatory outcomes and mismanagement. Indeed, article 21 of GDPR gives a data subject the right to object at any time to the processing of their personal data, unless the controller can demonstrate “compelling legitimate grounds” for the processing to override the data subject’s rights. In conjunction, article 22 prevents data subjects from being subject to a decision based solely on automated processing that has significant effects, except in a few circumstances, including when it is based on explicit consent and does not rely on special categories of data. In all cases where automated decision making is allowed, suitable measures to safeguard the data subjects’ rights and freedoms must also be implemented.
Albeit from different perspectives, stakeholders from techUK to the TUC have emphasised the importance of those articles and of the core principles that they promote. For example, the articles place an element of control in the hands of those that an automated decision affects. They emphasise the need for appropriate safeguards, and they consider the need for a different approach where sensitive data is concerned.
Where the clause adjusts the threshold on automated decision making to unlock innovation, therefore—as the likes of the A-level algorithm scandal and the robo-firings show—it is vital that any changes to regulation maintain and in some cases strengthen the principles set out in articles 21 and 22 of the GDPR. However, as the likes of the Ada Lovelace Institute, Which? and the TUC warn, in reality the Bill does the opposite, watering down existing protections. The amendments I have tabled are designed to rectify that.
The hon. Lady began her remarks on the broader question of the ambition to ensure that the UK benefits to the maximum extent from the use of artificial intelligence. We absolutely share that ambition, but also agree that it needs to be regulated. That is why we have published the AI regulation White Paper, which suggests that it is most appropriate that each individual regulator should develop its own rules on how that should apply. In the case that she quoted, of those who had lost their jobs, maybe through an automated process, the appropriate regulator—in that case, presumably, the employment tribunal—would need to develop its own mechanism for adjudicating decisions.
I will concentrate on the amendment. On amendment 76, we feel that clause 44 already provides for an overarching requirement on the Secretary of State to consult the Information Commissioner and other persons that she or he considers appropriate before making regulations under the UK GDPR, including the measures in article 22. When the new clause 44 powers are used in reference to article 22 provisions, they will be subject to the affirmative procedure in Parliament. I know that the hon. Lady is not wholly persuaded of the merits of using the affirmative procedure, but it does mean that parliamentary approval will be required. Given the level of that scrutiny, we do not think it is necessary for the Secretary of State to have to publish an assessment, as the hon. Lady would require through her amendment.
On amendment 75, as we have already debated in relation to previous amendments, there are situations where non-statutory guidance, which can be produced without being requested under regulations made by the Secretary of State, may be more appropriate than a statutory code of practice. We believe that examples of the kinds of processing that do and do not fall within the definitions of the terms “meaningful human involvement” and “similarly significant” are best placed in non-statutory guidance produced by the ICO, as this will give the flexibility to amend and change the examples where necessary. What constitutes a significant decision or meaningful human involvement is often highly context-specific, and the current wording allows for some interpretability to enable the appropriate application of this provision in different contexts, rather than introducing an absolute definition that risks excluding decisions that ought to fall within this provision and vice versa. For that reason, we are not minded to accept the amendments.
I appreciate the Minister’s remarks about consultation and consulting relevant experts. He is right to observe that I am not a big fan of the affirmative procedure as a method of parliamentary scrutiny, but I appreciate that it is included in this Bill as part of that scrutiny.
I think the problem is that we fundamentally disagree on the power to change these definitions being concentrated in the hands of the Secretary of State. It is one thing to future-proof the Bill but another to allow the Secretary of State alone to amend things as fundamental as the safeguards offered here. I would therefore like to proceed to a vote.
Question put, That the amendment be made.
I beg to move amendment 121, in clause 11, page 19, line 36, at end insert—
“7. When exercising the power to make regulations under this Article, the Secretary of State must have regard to the following statement of principles:
Digital information principles at work
1. People should have access to a fair, inclusive and trustworthy digital environment at work.
2. Algorithmic systems should be designed and used to achieve better outcomes: to make work better, not worse, and not for surveillance. Workers and their representatives should be involved in this process.
3. People should be protected from unsafe, unaccountable and ineffective algorithmic systems at work. Impacts on individuals and groups must be assessed in advance and monitored, with reasonable and proportionate steps taken.
4. Algorithmic systems should not harm workers’ mental or physical health, or integrity.
5. Workers and their representatives should always know when an algorithmic system is being used, how and why it is being used, and what impacts it may have on them or their work.
6. Workers and their representatives should be involved in meaningful consultation before and during use of an algorithmic system that may significantly impact work or people.
7. Workers should have control over their own data and digital information collected about them at work.
8. Workers and their representatives should always have an opportunity for human contact, review and redress when an algorithmic system is used at work where it may significantly impact work or people. This includes a right to a written explanation when a decision is made.
9. Workers and their representatives should be able to use their data and digital technologies for contact and association to improve work quality and conditions.
10. Workers should be supported to build the information, literacy and skills needed to fulfil their capabilities through work transitions.”
This amendment would insert into new Article 22D of the UK GDPR a requirement for the Secretary of State to have regard to the statement of digital information principles at work when making regulations about automated decision-making.
With this it will be convenient to discuss amendment 122, in clause 11, page 22, line 2, at end insert—
“(7) When exercising the power to make regulations under this section, the Secretary of State must have regard to the following statement of principles:
Digital information principles at work
1. People should have access to a fair, inclusive and trustworthy digital environment at work.
2. Algorithmic systems should be designed and used to achieve better outcomes: to make work better, not worse, and not for surveillance. Workers and their representatives should be involved in this process.
3. People should be protected from unsafe, unaccountable and ineffective algorithmic systems at work. Impacts on individuals and groups must be assessed in advance and monitored, with reasonable and proportionate steps taken.
4. Algorithmic systems should not harm workers’ mental or physical health, or integrity.
5. Workers and their representatives should always know when an algorithmic system is being used, how and why it is being used, and what impacts it may have on them or their work.
6. Workers and their representatives should be involved in meaningful consultation before and during use of an algorithmic system that may significantly impact work or people.
7. Workers should have control over their own data and digital information collected about them at work.
8. Workers and their representatives should always have an opportunity for human contact, review and redress when an algorithmic system is used at work where it may significantly impact work or people. This includes a right to a written explanation when a decision is made.
9. Workers and their representatives should be able to use their data and digital technologies for contact and association to improve work quality and conditions.
10. Workers should be supported to build the information, literacy and skills needed to fulfil their capabilities through work transitions.”
This amendment would insert into new section 50D of the DPA2018 a requirement for the Secretary of State to have regard to the statement of digital information principles at work when making regulations about automated decision-making.
Amendments 121 and 122 would ensure that close attention is paid to the specific and unique circumstances of workers and the workplace when regulations are made under the clause. Indeed, as has already been referenced, the workplace has dramatically evolved in the last decade with the introduction and growth of technology. Whether it be Royal Mail using the postal digital assistant service to calculate the length of time posties spend walking, on doorsteps and standing still, or Amazon collecting data from handheld scanners to calculate how much time workers are spending “off task”, the digital monitoring of workers, and the subsequent use of that data by managers to assess performance, allocate work hours and decide on levels of pay, is on the rise.
Of course it is absolutely right that workplaces embrace technology. As Andrew Pakes of Prospect said to this Committee, our economy and the jobs that people do each day can be made better and more productive through the good deployment of technology—but the key is in the phrase “good deployment”, and in order to have deployment that works for the greater good, the rights and protections in place at work must keep pace with the changing nature of the workplace and these technological advancements. As Labour outlined in our industrial strategy, we want to do just that: harness data for the public good and ensure that data and the innovation it brings with it benefit our wider society, not just large corporations. Further, as is written in our “New Deal for Working People”, Labour wants to introduce new rights to protect workers in the modern age—for example by legislating to make proposals to introduce surveillance technologies subject to consultation and agreement of trade unions, or elected staff representatives where there is no trade union. After all, we can only truly unlock the benefits of data and become a world leader in this space if there is genuine public trust in these technologies. Good regulation breeds that trust.
Currently, however, and particularly in the Bill, the kinds of measures that would allow for good deployment of technology in the workplace—technology that operates in the greater interest, including that of workers—are missing from the Government’s plans. Instead, as the TUC notes, we are overseeing a growing power imbalance between worker and employer. That imbalance exists by the very nature of the relationship, but it is now being exacerbated by the increasing level of knowledge and control that employers have over personal data as the workplace becomes digitised, compared with workers, who have very little power over, expertise in or access to such data.
Some impressive projects have sought to address that imbalance. For example, in 2020 Prospect worked with a coalition of unions, tech specialists and researchers to launch a beta version of WeClock, a free mobile app that helps workers to track and manage their own data such as that related to their location, their commute and when they are doing work on their phone. Those data profiles could then potentially be used by trade union campaigners to improve rights for workers. However, it should not just be down to individual projects to ensure that there is an equal balance between worker and employer. The Bill is a huge missed opportunity to write into law this balance and the principles that we should consider with regard to worker’s rights in the modern age.
The amendment, which has been prepared in partnership with the Institute for the Future of Work, is designed to right that wrong and ensure that where regulations are made about automated decision making, the full impact on workers is considered and strong principles about worker involvement are upheld. It will mean that the Secretary of State has to consider whether people have access to a fair and inclusive digital environment at work, whether they are protected from harms caused by algorithmic systems, and whether they are meaningfully consulted before and during the use of such tools. Further, under this amendment, consideration will be given to supporting workers in building the information, literacy and skills needed to understand these transitions in the workplace, thereby addressing some of the imbalances in knowledge and understanding.
I will end with an example of the real-life consequences of employment and data laws lagging behind technology. As was revealed by a report by the Worker Info Exchange just last month, 11 Just Eat couriers in the UK were recently robotically fired after receiving allegations of fraudulent activity identified by an automated system. According to the report, these workers were falsely accused of receiving “undeserved financial gain” relating to nominal waiting time payments at restaurants. Just Eat argued that the workers left the restaurant while continuing to claim waiting fees. However, GPS evidence showed that workers had stayed in the vicinity of the restaurant, usually in the car park. In each case, the worker collected the food and completed the delivery, and the average value of the alleged undeserved payments justifying the robo-firings was just £1.44. Cases such as those, in which real livelihoods are impacted and rights infringed for the sake of profit margins, can and must be avoided.
The amendment would take the first steps in ensuring that regulations around automated decision making centre the unique experience of workers. It also highlights the Bill’s failure to move towards a legislative framework in which a distinct focus is placed on harnessing data for the public good, which is something that Labour would have placed at the heart of a data Bill such as this one.
I rise to speak briefly in support of the amendment tabled by my hon. Friend the Member for Barnsley East and to emphasise the points that she made regarding the importance of putting forward a vision for the protection of workers as the nature of working environments changes. That is part of what the amendment’s “digital information principles at work” seek to do. I declare an interest: I worked for Ofcom as head of technology before coming to this House. That work highlighted to me the importance of forward-looking regulation. As my hon. Friend set out, artificial intelligence is not some future prospect; it is here with us and in the workplace.
Many technological changes have made work more accessible to more people: covid showed us that we could work from many different locations—indeed, Parliament successfully worked from many locations across the country. Technological changes have also made work more productive, and companies and public sector organisations are taking advantage of that increase in productivity. But some technologies have accelerated bad employment practices, driven down standards and damaged the wellbeing of workers—for example, workplace surveillance technologies such as GPS tracking, webcam monitoring and click monitoring, which encroach on workers’ privacy and autonomy. My constituents often say that they feel that technology is something that is done to them, rather than something that has their consent and empowers them.
It is important, as I am sure the Minister will agree, that working people welcome and embrace the opportunities that technology can bring, both for them and for the companies and organisations they work for, but that cannot happen without trust in those technologies. For that, there must be appropriate regulation and safeguards. Surely the Minister must therefore agree that it is time to bring forward a suite of appropriate principles that follows the amendment’s principle of
“a fair, inclusive and trustworthy digital environment at work.”
I hope that he cannot disagree with any of that.
If we are to get ourselves out of the economic stagnation and lack of growth of the last 10 or 13 years, we need to build on new technologies and productivity, but we cannot do that without the support and trust of people in the workforce. People must feel that their rights—new rights that reflect the new environment in the workplace—are safeguarded. I hope that the Minister will agree that the principles set out in the amendment are essential to building that trust, and to ensuring a working environment in which workers feel protected and able to benefit from advances in technology.
I am grateful to the hon. Members for Barnsley East and for Newcastle upon Tyne Central for setting out the thinking behind the amendment. We share the view, as the hon. Member for Newcastle upon Tyne Central has just said, that those who are subject to artificial intelligence and automated decision making need to have trust in the process, and that there need to be principles underlying the way in which those decisions are taken. In each case, however, the principles proposed go above and beyond the provisions in the Bill. On data protection, the changes proposed in clause 11 will, as I have said, reinforce and further clarify the important safeguards for automated decision making, which may be used in some workplace technologies. Those safeguards ensure that individuals are made aware of, and can seek human intervention on, significant decisions that are taken about them through solely automated means. The reforms to article 22 would make clear employer obligations and employee rights in such scenarios, as we debated on the earlier amendments.
On the wider question, we absolutely recognise that the kind of deployment of technology in the workplace shown in the examples that have already been given needs to be considered across a wide range of different regulatory frameworks in terms of not just data protection law, but human rights law, legal frameworks regarding health and safety and, of course, employment law.
I thank the Minister for his comments. I note that he castigates us, albeit gently, for tabling an amendment to this data protection Bill, while he argues that there is a need for wider legislation to enshrine the rights he apparently agrees with. When and where will that legislation come forward? Does he recognise that we waited a long time and listened to similar arguments about addressing online harms, but have ended up in a situation where—in 2023—we still do not have legislation on online harms? My question is: if not now, when?
As I was Chair of the Culture, Media and Sport Committee in 2008 when we published a report calling for legislation on online safety, I recognise the hon. Lady’s point that these things take a long time—indeed, far too long—to come about. She calls for action now on governance and regulation of the use of artificial intelligence. She will know that last month the Government published the AI regulation White Paper, which set out the proposals for a proportionate outcomes-focused approach with a set of principles that she would recognise and welcome. They include fairness, transparency and explainability, and we feel that this has the potential to address the risks of possible bias and discrimination that concern us all. As she knows, the White Paper is currently out to consultation, and I hope that she and others will take advantage of that to respond. They will have until 21 June to do so.
I assure the hon. Lady and the hon. Member for Barnsley East that the Government are keenly aware of the need to move swiftly, but we want to do so in consultation with all those affected. The Bill looks at one relatively narrow aspect of the use of AI, but certainly the Government’s general approach is one that we are developing at pace, and we will obviously respond once the consultation has been completed.
The power imbalance between employer and worker has no doubt grown wider as technology has developed. Our amendment speaks to the real-life consequences of that, and to what happens when employment and data law lags behind technology. For the reasons that have been outlined by my hon. Friend the Member for Newcastle upon Tyne Central and myself, I would like to continue with my amendment.
Question put, That the amendment be made.
We have, I think, covered a lot of ground already in the debates on the amendments. To recap, clause 11 reforms the rules relating to automated decision making in article 22 of the UK GDPR and relevant sections of the Data Protection Act 2018. It expands the lawful grounds on which solely automated decision making that produces a legal or similarly significant effect on an individual may be carried out.
Currently, article 22 of the UK GDPR restricts such activity to a narrow set of circumstances. By expanding the available lawful grounds and ensuring we are clear about the required safeguards, these reforms will boost confidence that the responsible use of this technology is lawful, and will reduce barriers to responsible data use.
The clause makes it clear that solely automated decisions are those that do not involve any meaningful human involvement. It ensures that there are appropriate constraints on the use of sensitive personal data for solely automated decisions, and that such activities are carried out in a fair and transparent manner, providing individuals with key safeguards.
The clause provides three powers to the Secretary of State. The first enables the Secretary of State to describe cases where there is or is not meaningful human involvement in the taking of a decision. The second enables the Secretary of State to further describe what is and is not to be taken as having a significant effect on an individual. The third enables the introduction of further safeguards, and allows those already set out in the reforms to be amended but not removed.
The reformed section 50 of the Data Protection Act mirrors the changes in subsection (1) for solely automated decision making by law enforcement agencies for a law enforcement purpose, with a few differences. First, in contrast to article 22, the rules on automated decision making apply only where such decisions have an adverse legal or similarly significant effect on the individual. Secondly, the processing of sensitive personal data cannot be carried out for the purposes of entering into a contract with the data subject for law enforcement purposes.
The final difference relates to the safeguards for processing. This clause replicates the UK GDPR safeguards for law enforcement processing but also allows a controller to apply an exemption to them where it is necessary for a particular reason, such as to avoid obstructing an inquiry. This exemption is available only where the decision taken by automated means is reconsidered by a human as soon as reasonably practicable.
The subsections amending relevant sections of the Data Protection Act 2018, which apply to processing by or on behalf of the intelligence services, clarify that requirements apply to decisions that are entirely automated, rather than solely automated. They also define what constitutes a decision based on this processing. I have explained the provisions of the clause, and hope the Committee will feel able to accept it.
I spoke at length about my views on the changes to automated decision making when we debated amendments 77, 120, 76, 75, 121 and 122. I have nothing further to add at this stage, but those concerns still stand. As such, I cannot support this clause.
Question put, That the clause stand part of the Bill.
I beg to move amendment 17, in schedule 3, page 140, line 9, leave out sub-paragraph (3) and insert—
“(3) In paragraph 2—
(a) for “under Articles 15 to 22”, in the first place, substitute “arising under or by virtue of Articles 15 to 22D”, and
(b) for “his or her rights under Articles 15 to 22” substitute “those rights”.”.
This amendment adjusts consequential amendments of Article 12(2) of the UK GDPR for consistency with other amendments of the UK GDPR consequential on the insertion of new Articles 22A to 22D.
With this it will be convenient to discuss the following:
Government amendments 18 to 23.
That schedule 3 be the Third schedule to the Bill.
I can be reasonably brief on these amendments. Schedule 3 sets out the consequential changes needed to update references to the rules on automated decision making, now found in reformed article 22 and section 50, across other provisions of the UK GDPR and the Data Protection Act 2018. Schedule 3 also provides that section 14 of the Data Protection Act is repealed; instead, reformed article 22 sets out the safeguards that must apply, regardless of the lawful ground on which such activity is carried out.
Government amendments 17 to 23 are minor technical amendments ensuring that references elsewhere in the UK GDPR and the Data Protection Act to the provisions on automated decision making are comprehensively updated to reflect the reforms made by this Bill. That means that references to article 22 of the UK GDPR are updated to refer to the reformed article 22A to 22D provisions, and references to sections 49 and 50 of the Data Protection Act are updated to refer to the appropriate new sections 50A to 50D.
I thank the Minister for outlining these technical changes. I have nothing further to add on these consequential amendments beyond what has already been discussed on clause 11 and the rules around automated decision making. Consistency across the statute book is important, but all the concerns I raised when discussing the substance of those changes remain.
Amendment 17 agreed to.
Amendments made: 18, in schedule 3, page 140, line 30, before second “in” insert “provided for”.
This amendment and Amendment 19 adjust consequential amendments of Article 23(1) of the UK GDPR for consistency with other amendments of the UK GDPR consequential on the insertion of new Articles 22A to 22D.
Amendment 19, in schedule 3, page 140, line 31, leave out “in or under” and insert
“arising under or by virtue of”.
See the explanatory statement for Amendment 18.
Amendment 20, in schedule 3, page 140, line 33, leave out from “protection” to end of line 35 and insert
“in accordance with, and with regulations made under, Articles 22A to 22D in connection with decisions based solely on automated processing (including decisions reached by means of profiling)”.
This amendment adjusts the consequential amendment of Article 47(2)(e) of the UK GDPR to reflect the way in which profiling is required to be taken into account for the purposes of provisions about automated decision-making (see Article 22A(2) inserted by clause 11).
Amendment 21, in schedule 3, page 140, line 36, leave out paragraph 10 and insert—
“10 In Article 83(5) (general conditions for imposing administrative fines)—
(a) in point (b), for “22” substitute “21”, and
(b) after that point insert—
“(ba) Article 22B or 22C (restrictions on, and safeguards for, automated decision-making);””.
This amendment adjusts the consequential amendment of Art 83(5) of the UK GDPR (maximum amount of penalty) for consistency with the consequential amendment of equivalent provision in section 157(2) of the Data Protection Act 2018.
Amendment 22, in schedule 3, page 141, line 8, leave out sub-paragraph (2) and insert—
“(2) In subsection (3), for “by the data subject under section 45, 46, 47 or 50” substitute “made by the data subject under or by virtue of any of sections 45, 46, 47, 50C or 50D”.”.
This amendment adjusts the consequential amendment of section 52(3) of the Data Protection Act 2018 for consistency with other amendments of that Act consequential on the insertion of new sections 50A to 50D.
Amendment 23, in schedule 3, page 141, line 9, leave out sub-paragraph (3) and insert—
“(3) In subsection (6), for “under sections 45 to 50” substitute “arising under or by virtue of sections 45 to 50D””.—(Sir John Whittingdale.)
This amendment adjusts the consequential amendment of section 52(6) of the Data Protection Act 2018 for consistency with other amendments of that Act consequential on the insertion of new sections 50A to 50D.
Schedule 3, as amended, agreed to.
Clause 12
General obligations
Question proposed, That the clause stand part of the Bill.
One of the main criticisms that the Government have received of the current legislative framework is that it sets out a number of prescriptive requirements that organisations must satisfy to demonstrate compliance. They include appointing independent data protection officers, keeping records of processing, appointing UK representatives, carrying out impact assessments and consulting the ICO about intended processing activities in specified circumstances.
Those rules can sometimes generate a significant and disproportionate administrative burden, particularly for small and medium-sized enterprises and for some third sector organisations. The current framework provides some limited exemptions for small businesses and organisations that are carrying out low-risk processing activities, but they are not always as clear or as useful as they should be.
We are therefore taking the opportunity to improve chapter 4 of the UK GDPR and the equivalent provisions in part 3 of the Data Protection Act, which concern law enforcement processing. Those provisions deal with the policies and procedures that organisations and law enforcement agencies must put in place to monitor and ensure compliance. Clauses 12 to 20 will give organisations greater flexibility to implement data protection management programmes that work for their organisations, while maintaining high standards of data protection for individuals.
Clause 12 is technical in nature. It will improve the terminology in the relevant articles of the UK GDPR by replacing the requirement to implement
“appropriate technical and organisational measures”.
In its place, data protection risks must be managed with
“appropriate measures, including technical and organisational measures,”.
That will give organisations greater flexibility to implement any measures that they consider appropriate to help them manage risks. A similar clarification is made to equivalent parts of the Data Protection Act.
Clause 13 will remove article 27 of the UK GDPR, ending the requirement for overseas controllers or processors to appoint a representative in the UK where they offer goods or services to, or monitor the behaviour of, UK citizens—
Order. I am sorry, Minister, but we are talking about clause 12 at the moment; we will come on to clause 13 later. Have you concluded your remarks on clause 12?
I think I have covered the points that I would like to make on clause 12.
Clause 12 is a set of largely technical amendments to terminology that I hope will provide clarity to data controllers and processors. I have no further comments to make at this stage.
Question put and agreed to.
Clause 12 accordingly ordered to stand part of the Bill.
Clause 13
Removal of requirement for representatives for controllers etc outside the UK
Question proposed, That the clause stand part of the Bill.
As I was saying, clause 13 will remove article 27 of the UK GDPR, ending the requirement for overseas controllers or processors to appoint a representative in the UK where they offer goods or services to, or monitor the behaviour of, UK citizens. By no longer mandating the appointment of a representative, we will allow organisations to decide for themselves the best way to comply with the requirements for effective communication. That may still include appointing a UK-based representative. The removal of the requirement is therefore in line with the Bill’s wider strategic aim of removing unnecessarily prescriptive regulation.
The rules set out in the UK GDPR apply to all those who are active in the UK market, regardless of whether their organisation is based or located in the UK. Article 27 of the UK GDPR currently requires controllers and processors based outside the UK to designate a UK-based representative, unless their processing is only occasional and does not involve special categories of data, which provides an element of proportionality, or unless they are a public authority or body. The idea is that the representative will act on behalf of the controller or processor in respect of their UK GDPR compliance, dealing with the ICO and data subjects and acting as a primary contact for all things data within the country.
The removal of the requirement for a UK representative was not included in the Government’s consultation, “Data: a new direction”, nor was it even mentioned in their response. As a result, stakeholders have not been given an opportunity to put forward their opinions on this change. I wish to represent some of those opinions so that they are on the record for the Minister and his Department to consider.
Concern among the likes of Lexology, DataRep and Which? relates primarily to the fact that the current requirements for UK-based representatives ensure that UK data subjects can conveniently reach the companies that process their personal data, so that they can exercise their rights under the GDPR. Overseas data handlers may have a different first language, operate in a different time zone or have local methods of contact that are not easily accessible from the UK. Having a UK-based point of contact therefore ensures that data subjects do not struggle to apply the rights to which they are entitled because of the inevitable differences that occur across international borders.
As Lexology has pointed out, the Government’s own impact assessment says:
“There is limited information and data on the benefits of having an Article 27 representative as it is a relatively new and untested requirement and also one that applies exclusively to businesses and organisations outside of the UK which makes gathering evidence very challenging.”
By their own admission, then, the Government seem to recognise the challenges in gathering information from organisations outside the UK. If the Government find it difficult to get the information that they require, surely average citizens and data subjects may also face difficulties.
Not only is having a point of contact a direct benefit for data subjects, but a good UK representative indirectly helps data subjects by facilitating a culture of good data protection practice in the organisation that they represent. For example, they may be able to translate complex legal concepts into practical business terms or train fellow employees in a general understanding of the UK GDPR. Such functions may make it less likely that a data subject will need to exercise their rights in the first place.
As well as things being harder for data subjects in the ways I have outlined, stakeholders are not clear about the benefits of removing representatives for UK businesses. For example, the Government impact assessment estimates that the change could save a large organisation £50,000 per year, but stakeholders have said that that figure is an overestimation. Even if the figure is accurate, the saving will apply only to organisations outside the UK and will be made through a loss of employment for those who are actually based in the UK and performing the job.
The question therefore remains: if the clause is not in the interests of data subjects, of UK businesses or of UK-based employees who act as representatives, how will this country actually benefit from the change? I am keen to hear from the Minister on that point.
If there are concerns that were not fed in during the consultation period, obviously we will consider them. However, it remains the case that even without the article 27 representative requirement, controllers will have to maintain contact with UK citizens and co-operate with the ICO under other provisions of the UK GDPR. For example, overseas controllers and processors must still co-operate with the ICO as a result of the specific requirements to do so under article 31 of the UK GDPR. To answer the hon. Lady’s question about where the benefit lies, the clause is part of a streamlining process to remove what we see as unnecessary administrative requirements and bureaucracy.
Question put and agreed to.
Clause 13 accordingly ordered to stand part of the Bill.
Clause 14
Senior responsible individual
Question proposed, That the clause stand part of the Bill.
As I mentioned in our debate on clause 12, clauses 12 to 18 will give organisations greater flexibility about the policies, procedures or programmes that they put in place to ensure compliance with the legislation. As we have discussed, a criticism of the current legal framework is that many of the existing requirements are so prescriptive that they impose unnecessary burdens on businesses. Many organisations could manage data protection risks effectively without appointing an independent data protection officer, but they are forced to do so by the prescriptive rules that we inherited from the European Union.
Clause 14 will therefore abolish existing requirements on data protection officers and replace them with new requirements for organisations to designate a senior responsible individual where appropriate. That individual would be part of the organisation’s senior management and would be responsible for overseeing data protection matters within the organisation. In particular, the individual would be responsible for monitoring compliance with the legislation, ensuring the implementation of appropriate risk management procedures, responding to data protection breaches and co-operating with the information commissioner, or for ensuring that those tasks are performed by another suitably skilled person where appropriate. Senior responsible individuals may perform the tasks specified in clause 14 themselves, delegate them to suitably skilled members of staff or, if it is right for the company and its clients, seek advice from independent data protection experts.
We recognise that some people have raised concerns that giving organisations more flexibility in how they monitor and ensure compliance with the legislation could reduce standards of protection for individuals. We are confident that that will not be the effect of the clause. On the contrary, the clause provides an opportunity to elevate discussions about data protection risks to senior levels within organisations by requiring a senior responsible individual to take ownership of data protection risks and embed a culture of data protection. On that basis, I commend the clause to the Committee.
In a number of places in the Bill, the Government have focused on trying to ensure a more proportionate approach to data protection. That often takes the form of reducing regulatory requirements on controllers and processors where low-risk processing, which presents less of a threat of harm to data subjects, is taking place. Clause 14 is one place in which Ministers have applied that principle, replacing data protection officers with a requirement to appoint a senior responsible individual, but only where high-risk processing is being carried out.
Such a proportionate approach makes sense in theory. Where the stakes are lower, less formalised oversight of GDPR compliance will be required, which will be particularly helpful in small business settings where margins and resources are tight. Where the stakes are higher, however, a senior responsible individual will have a similar duty to that of a data protection officer, but with the added benefit of being part of the senior leadership team, ensuring that data protection is considered at the highest level of organisations conducting high-risk processing.
However, the Government have admitted that the majority of respondents to their consultation disagreed with the proposal to remove the requirement to designate a data protection officer. In particular, respondents were concerned that removing DPOs would result in
“a loss of data protection expertise”
and
“a potential fall in trust and reassurance to data subjects.”
Indeed, data protection officers perform a vital role in upholding the GDPR, taking on responsibility for informing people of their obligations; monitoring compliance, including by raising awareness and training staff; providing advice, where requested, on data protection impact assessments; co-operating with the regulator; and acting as a contact point. That provides not only guaranteed expertise to organisations, but reassurance to data subjects that they will have someone to approach should they feel the need to exercise any of their rights under the GDPR.
The contradiction between the theory of the benefits of proportionality and the reality of the concerns expressed by respondents to the consultation emphasises a point that the Government have repeatedly forgotten throughout the Bill: although removing truly unnecessary burdens can sometimes be positive, organisations often want clear regulation more than they want less regulation. They believe in the principles of the GDPR, understand the value of rights to data subjects and often over-comply with regulation out of fear of breaking the rules.
In this context, it makes sense that organisations recognise the value of having a data protection officer. They actually want in-house expertise on data—someone they can ask questions and someone they can rely on to ensure their compliance. Indeed, according to the DPO Centre, in September 2022, the UK data protection index panel of 523 DPOs unequivocally disagreed with the idea that the changes made by the clause would be in the best interests of data subjects. Furthermore, when asked whether the proposal to remove the requirement for a DPO and replace it with a requirement for a senior responsible individual would simplify the management of privacy in their organisation, 42% of DPOs surveyed gave the lowest score of 1.
Did the Department consider offering clarification, support and guidance to DPOs, rather than simply removing them? Has it attempted to assess the impact of their removal on data subjects? In practice, it is likely that many data protection officers will be rebranded as senior responsible individuals. However, many will be relieved of their duties, particularly since the requirement to be part of the organisation’s senior management team could be problematic for external DPO appointments and those in more junior positions. Has the Department assessed how many data protection officers may lose their jobs as a result of these changes? Is the number expected to be substantial? Will there be any protections to support those people in transitioning to other skilled employment in data protection, and to prevent an overall reduction in data protection expertise within organisations?
The clause does not in any way represent a lessening of the requirement on organisations to comply with data protection law. It simply introduces a degree of flexibility. An organisation could not get rid of data protection officers without ensuring that processing activities likely to pose high risks to individuals are still managed properly. The senior responsible individual will be required to ensure that that is the case.
At the moment, even small firms whose core activities do not involve the processing of sensitive data must have a data protection officer. We feel that that is an unnecessary burden on those small firms, and that allowing them to designate an individual will give them more flexibility without reducing the overall level of data protection that is required.
Question put and agreed to.
Clause 14 accordingly ordered to stand part of the Bill.
Clause 15
Duty to keep records
Question proposed, That the clause stand part of the Bill.
Clauses 15 and 16 will improve the record-keeping requirements under article 30 of the UK GDPR and the logging requirements under part 3 of the Data Protection Act, which is concerned with records kept for law enforcement purposes. Article 30 of the UK GDPR requires most organisations to keep records of their processing activities and includes a list of requirements that should be included in the record. Those requirements can add to the paperwork that organisations have to keep to demonstrate compliance. Although there is an exemption from those requirements in the UK GDPR for some small organisations, it has a limited impact because it applies only where their processing of personal data is “occasional”.
Clause 15 will replace the record-keeping requirements under article 30. It will make it easier for data controllers to understand exactly what needs to be included in the record. Most importantly, organisations of any size will no longer have to keep records of processing, unless their activities are
“likely to result in a high risk”
to individuals. That should help small businesses in particular, which have found the current small business exemption difficult to understand and apply in practice.
Clause 16 will make an important change to the logging requirements for law enforcement purposes in part 3 of the Data Protection Act. It will remove the ineffective requirement to record a justification when an officer consults or discloses personal data for the purposes of an investigation. The logging requirements are unique to the law enforcement regime and aim to assist in monitoring and auditing data use. Recording a justification for accessing data was intended to help protect against unlawful access, but the reality is that someone is unlikely to record an honest reason if their access is unlawful. That undermines the purpose of this requirement, because appropriate and inappropriate uses would both produce essentially indistinguishable data.
As officers often need to access large amounts of data quickly, especially in time-critical scenarios, the clause will help the police to investigate and prevent crime more swiftly. We estimate that the change could save approximately 1.5 million policing hours. Other elements of the logs, such as the date and time of the consultation or disclosure and the identity of the person accessing them, are likely to be far more effective in protecting personal data against misuse; those elements remain in place. On that basis, I commend the clauses to the Committee.
Record keeping is a valuable part of data processing. It requires controllers, and to a lesser extent processors, to stay on top of all the processing that they are conducting by ensuring that they record the purposes for processing, the time limits within which they envisage holding data and the categories of recipients to whom the data has been or will be disclosed.
Many respondents to the Government’s consultation “Data: a new direction” said that they did not think the current requirements were burdensome. In fact, they said that the records allow them easily to understand the personal data that they are processing and how sensitive it is. It is likely that that was helped by the fact that the requirements were proportionate, meaning that organisations that employed fewer than 250 people and were not conducting high-risk processing were exempt from the obligations.
It is therefore pleasing to see the Government rolling back on the idea of removing record-keeping requirements entirely, as was suggested in their consultation. As was noted, the majority of respondents disagreed with that proposal, and it is right that it has been changed. However, some respondents indicated a preference for more flexibility in the record-keeping regime, which is what I understand the clause is trying to achieve. Replacing the current requirements with a requirement to keep an appropriate record of processing, tied to high-risk activities, will give controllers the flexibility that they require.
As with many areas of the Bill, it is important that we be clear on the definition of “appropriate” so that it cannot be used by those who simply do not want to keep records. I therefore ask the Minister whether further guidance will be available to assist controllers in deciding what counts as appropriate.
I also wish to highlight the point that although in isolation the clause does not seem to change the requirements much, other than by adding an element of proportionality, it cannot be viewed in isolation. In combination with other provisions, such as the reduced requirements on DPIAs and the higher threshold for subject access requests, it seems that there will be fewer records overall on which a data subject might be able to rely to understand how their personal information is being used, or to prove how it has been used when they seek redress. With that in mind, I ask the Minister whether the Government have assessed the potential impact of the combination of the Bill’s clauses on the ability of data subjects to exercise their rights. Do the Government have any plans to work with the commissioner to monitor any such impacts on data subjects after the Bill is passed?
I turn to clause 16. Section 62 of the Data Protection Act 2018 requires competent authorities to keep logs that show who has accessed certain datasets, and at what time. It also requires that that access be justified: the reason for consulting the data must be given. Justification logs exist to assist in disciplinary proceedings, for example if there is reason to believe that a dataset has been improperly accessed or that personal data has been disclosed in an unauthorised way. However, as Aimee Reed, director of data at the Met police and chair of the national police data board, told the Committee:
“It is a big requirement across all 43 forces, largely because…we are operating on various aged systems. Many of the technology systems…do not have the capacity to log section 62 requirements, so police officers are having to record extra justification in spreadsheets alongside the searches”.––[Official Report, Data Protection and Digital Information (No. 2) Public Bill Committee, 10 May 2023; c. 56, Q118.]
That creates what she described as a “considerable burden”.
Understandably, therefore, the Bill removes the justification requirement. There are some—the Public Law Project, for example—who have expressed concern that this change poses a threat to individual rights by allowing the police to provide a retrospective justification for accessing records. However, as the explanatory notes indicate, it is highly unlikely that, in an investigation concerning inappropriate use, a justification recorded by the individual under investigation for improper or unauthorised access could be relied on in any event. Clause 16 would therefore not stop anyone from being investigated for improper access; it would simply remove the burden of recording a self-identified justification that could hardly be relied on. I welcome the intent of the clause and the positive impact that it could have on law enforcement processing.
The intention behind clause 15 is to reduce the burden on organisations by tying the record-keeping requirements to high-risk processing activities. If there is uncertainty about the nature of the risk, organisations will be able to refer to ICO guidance. The ICO has already published examples on its website of processing that is likely to be high-risk for the purposes of completing impact assessments; clause 17 will require it to apply the guidance to the new record-keeping requirements as well. It will continue to provide guidance on the matter, and we are happy to work with it on that.
With respect to clause 16, I am most grateful for the Opposition’s welcome recognition of the benefits for crime prevention and law enforcement.
Question put and agreed to.
Clause 15 accordingly ordered to stand part of the Bill.
Clause 16 ordered to stand part of the Bill.
Clause 17
Assessment of high risk processing
I beg to move amendment 102, in clause 17, page 32, line 12, leave out from “with” to the end of line 28 on page 33 and insert
“subsection (2)
(2) In Article 57(1) (Information Commissioner’s tasks), for paragraph (k) substitute—
‘(k) produce and publish a document containing examples of types of processing which the Commissioner considers are likely to result in a high risk to the rights and freedoms of individuals (for the purposes of Articles 27A, 30A and 35);’.”
This amendment would remove the provisions of clause 17 which replace the existing data protection impact assessment requirements with new requirements about “high risk processing”, leaving only the requirement for the ICO to produce a document containing examples of types of processing likely to result in a high risk to the rights and freedoms of individuals.
With this it will be convenient to discuss the following:
Amendment 103, in clause 17, page 33, line 9, at end insert—
“(4A) After Article 35(11) insert—
‘(11A) Any public authority, government department, or contractor of a government department which routinely uses public data in the discharge of its functions must publish any assessments of high risk processing conducted pursuant to this Article. Any assessments published under this Article must be redacted where necessary for the purposes of—
(a) removing sensitive details,
(b) protecting public interests, or
(c) ensuring the security of data processing operations.’”
This amendment inserts a new requirement into Article 35 of UKGDPR, for any public authority which uses public data to publish any assessment of high risk processing they conduct under Article 35.
Clause stand part.
Clause 18 stand part.
As was the intention, the Bill loosens restrictions on processing personal data in many areas: it adds a new lawful basis, creates new exceptions to purpose limitation, removes blocks to automated decision-making and allows for much thinner record keeping. Each change in isolation may make only a relatively small adjustment to the regime. Collectively, however, the changes amount to a large-scale shift towards controllers being able to conduct more processing, with less transparency and communication and fewer records to keep, all of which reduces the opportunities for accountability.
As mentioned, loosening restrictions is an entirely deliberate consequence of a Bill that seeks to unlock innovation through data—an aim that Members across the House, including me, are strongly behind, given the power of data to influence growth for the public good. However, given the cumulative impact of this deregulation, in which increasingly opaque processing is likely to pose a large risk to people’s rights, it is only right that a processor should at the very least record how they will ensure that any high-risk activities they undertake do not lead to unlawful or discriminatory outcomes for the general public. That is exactly what the current system of DPIAs, as outlined in article 35 of the GDPR, provides for. These assessments, which require processors to measure their activities against the risk to the rights and freedoms of data subjects, are not just a tick-box exercise, unnecessary paperwork or an administrative burden; they are an essential tool for ensuring that organisations do not deploy, and individuals are not subjected to, systems that may lead to a fundamental breach of their rights.
Assessments of that kind are not a concept unique to data processing. The Government routinely publish impact assessments on the legislation that they want to introduce; any researcher or scientist is likely to conduct an assessment of the safety and morality of their methodology; and a teacher will routinely and formally measure the risks involved when taking pupils on a school trip. Where activities pose a high risk to others, it is simply common practice to keep a record of where the risks lie, and to make plans to ensure that they are mitigated where possible.
In the case of data, not only are DPIAs an important mechanism for ensuring that risks are managed, but they act as a key tool for data subjects. That is first because the process of conducting a DPIA encourages processors to consult data subjects, either directly or through a representative, on how the type of processing might affect them. Secondly, where things go wrong for data subjects, DPIAs act as a legal record of the processing, its purpose and the risks involved. Indeed, the Public Law Project, a registered charity that employs a specialist lawyer to conduct research, provide training and take on legal casework, identified DPIAs as a key tool in litigating against the unlawful use of data processing: they provide a record, for public law purposes, of the type of processing that has been conducted and its impact.
The TUC and the Institute for the Future of Work echo that, citing DPIAs as a crucial process and consultation tool for workers and trade unions in relation to the use of technology at work. The clause, however, seeks to water down DPIAs, which will become “assessments of high-risk processing”. That guts both the fundamental benefit of risk management that they offer in a data protection system that is about to become increasingly opaque, and the extra benefits that they give to data subjects.
Instead of requiring a systematic description of the processing operations and purposes, under the new assessments the controller would be required only to summarise the purpose of the processing. Furthermore, instead of conducting a proportionality assessment, controllers will be required only to consider whether the processing is necessary for the stated purpose. The Public Law Project describes the proportionality assessment as a crucial legal test that weighs up whether an infringement of human rights, including the right not to be discriminated against, is justified in relation to the processing being conducted.
When it comes to consultation, controllers were previously encouraged to seek the views of those likely to be affected by the processing; that expectation will now be omitted entirely, despite the important benefit to data subjects, workers and communities. The new assessments therefore simply do not carry the same weight or benefit as DPIAs, which in truth could themselves be strengthened. It is simply not appropriate to remove the need to properly assess the risk of processing while simultaneously removing the restrictions that help to mitigate those risks. For that reason, the clause must be opposed; we would keep only the requirement for the ICO to produce much-needed guidance on what constitutes high-risk processing.
Moving on to amendment 103, given the inherent importance of conducting risk assessments for high-risk processing, and their potential for use by data subjects when things go wrong, it seems only right that transparency be built into the system where it comes to Government use of public data. The amendment would do just that, and only that. It would not adjust any of the requirements on Government Departments or public authorities to complete high-risk assessments; it would simply require an assessment to be published in any case where one is completed. Indeed, the ICO guidance on DPIAs says:
“Although publishing a DPIA is not a requirement of UK GDPR, you should actively consider the benefits of publication. As well as demonstrating compliance, publication can help engender trust and confidence. We would therefore recommend that you publish your DPIAs, where possible, removing sensitive details if necessary.”
However, very few organisations choose to publish their assessments. This is a chance for the Government to lead by example, and to foster an environment of trust and confidence in data protection.
Alongside the amendment I tabled on compulsory reporting on the use of algorithms, this amendment is designed to afford the general public honesty and openness on how their data is used, especially where the process has been identified as having a high risk of causing harm. Again, a published impact assessment would provide citizens with an official record of high-risk uses of their data, should they need that when seeking redress. However, a published impact assessment would also encourage responsible use of data, so that redress does not need to be sought in the first place.
The Government need not worry about the consequences of the amendment if they already meet the requirement to conduct the correct impact assessments and process them in such a way that the benefits are not heavily outweighed by a risk to data rights. If rules are being followed, the amendment will only provide proof of that. However, if anyone using public data in a public authority’s name did so without completing the appropriate assessments, or processed that data in a reckless or malicious way, there would be proof of that. Where there is transparency, there is accountability, and where the Government are involved, accountability is always crucial in a democracy. The amendment would ensure that accountability shined through in data protection law.
Finally, I turn to clause 18. The majority of respondents to the “Data: a new direction” consultation agreed that organisations are likely to approach the ICO voluntarily before commencing high-risk processing activities if that is taken into account as a mitigating factor in any future investigation or enforcement action. The loosening of requirements in the clause is therefore not a major concern. However, when that is combined with the watering down of the impact assessments, there remains an overarching concern about the oversight of high-risk processing. I refer to my remarks on clause 17, in which I set out the broader problems that the Bill poses to protection against harms from high-risk processing.
As we have discussed, one of the principal objectives of this part of the Bill is to remove some of the prescriptive, unnecessary requirements on organisations to demonstrate compliance. Clauses 17 and 18 reduce the burdens placed on organisations by articles 35 and 36 of the UK GDPR in respect of data protection impact assessments and prior consultation with the ICO respectively.
Clause 17 will replace the EU-derived notion of a data protection impact assessment with more streamlined requirements for organisations to document how they intend to assess and mitigate risks associated with high-risk processing operations. The changes will apply both to the impact assessment provisions in the UK GDPR and to the section of the Data Protection Act 2018 that deals with impact assessments for processing relating to law enforcement. Amendment 102 would reverse those changes and maintain the current data protection impact assessment requirements, but we feel that this would miss an important opportunity for reform.
There are significant differences between the new provisions in the Bill and current provisions on data protection impact assessments. First, the new provisions are less prescriptive about the precise processing activities for which a risk assessment will be required. We think organisations are best placed to judge whether a particular activity poses a high risk to individuals in the context of the situation, taking account of any relevant guidance from the regulator.
Secondly, we have also removed the mandatory requirement to consult individuals about the intended processing activity as part of a risk-assessment process, as that imposes unnecessary burdens. There are already requirements in the legislation to ensure that any new processing is fair, transparent and designed with the data protection principles in mind. It should be open to businesses to consult their clients about intended new processing operations if they wish, but that should not be dictated to them by the data protection legislation.
Clause 18 will make optional the previous requirement for data controllers to consult the commissioner when a risk assessment indicates a potential high risk to individuals. The Information Commissioner will be able to consider any voluntary steps that an organisation has taken to consult the ICO as a factor when imposing administrative fines on a data controller. Currently, compliance with the prior consultation requirement is low, likely owing to a lack of clarity in the legislation and a reluctance among organisations to engage directly with the regulator on potentially high-risk processing. The clause will encourage a more proactive, open and collaborative dialogue between the ICO and organisations, so that they can work together to better mitigate the risks.
The Opposition’s amendment 103 would mandate the publication of risk assessments by all public sector bodies. That requirement would, in our view, place a disproportionate burden on public authorities of all sizes. It would apply not just to Departments but to smaller public authorities such as schools, hospitals, independent pharmacies and so on. The amendment acknowledges that each public authority would have to spend time redacting sensitive details from risk assessments prior to publication. As those assessments can already be requested by the ICO as part of its investigations, or by members of the public via freedom of information requests, we do not think it is necessary to impose that significant new burden on all public bodies. I therefore invite the hon. Member for Barnsley East to withdraw her two amendments, and I commend clauses 17 and 18 to the Committee.
I am happy not to press amendment 103 to a vote, but on amendment 102, I simply do not think it is appropriate to remove the need to properly assess the risk of processing while removing the restrictions that help to mitigate it. For those reasons, I will press it to a vote.
Question put, That the amendment be made.
I beg to move amendment 1, in clause 19, page 35, leave out lines 23 to 25 and insert—
“(5) The Commissioner must encourage expert public bodies to submit codes of conduct described in subsection (1) to the Commissioner in draft.”.
This amendment replaces a duty on expert public bodies to submit draft codes of conduct relating to compliance with Part 3 of the Data Protection Act 2018 to the Information Commissioner with a duty on the Information Commissioner to encourage such bodies to do so.
With this it will be convenient to discuss the following:
Government amendments 2 to 4.
Clause stand part.
Clause 19 introduces an ability for public bodies with the appropriate knowledge and expertise to produce codes of conduct applicable to the law enforcement regime. The clause mirrors the equivalent provision in the UK GDPR.
As with regular guidance, these codes of conduct will be drafted by law enforcement data protection experts and tailored to the specific data protection issues that affect law enforcement agencies, to help improve compliance with the legislation and encourage best practice. However, they are intended to carry more weight, because they will additionally have the formal approval of the Information Commissioner.
When a code of conduct is produced, there is a requirement to submit a draft of it to the Information Commissioner. While that is good practice, we think it unnecessary to mandate it. Government amendment 1 therefore replaces that requirement with a duty on the commissioner to encourage public bodies to submit their drafts. Government amendments 2 and 3 are consequential on amendment 1.
Where a public body has submitted a code of conduct to the commissioner for review, Government amendment 4 removes the requirement for the commissioner to review any subsequent amendments made by the public body until the initial draft has been considered. This change will promote transparency, greater clarity and confidence in how police process personal data under the law enforcement regime. Codes of conduct are not a new concept. The clause mirrors what is already available under the UK GDPR.
The Bill fails to recognise fully that the burdens organisations face in complying with data protection legislation are not always best dealt with by simply removing the protections in place. In many cases, clarification and proper guidance can be just as fruitful in allowing data protection to work more seamlessly. Clauses such as clause 19, which seeks to create an environment in which best practice is shared on how to comply with data protection laws and deal with key data protection challenges, are therefore very welcome. It is absolutely right that we should capitalise on pockets of experience and expertise, especially in the public sector, where resources have often been stretched, particularly over the last 13 years. We should ensure that lessons are shared with those who are less familiar with how to resolve challenges around data.
It is also pleasing to see that codes that give sector-specific guidance will be approved by the commissioner before being published. That will ensure absolute coherence between guidance and the enforcement of data protection law more widely. I look forward to seeing what positive impact the codes of conduct will have on how personal data is handled by public bodies, to the benefit of the general public as well as the public bodies themselves; the burden on them will likely be lifted as a result of the clarity provided by the guidance.
I welcome the Opposition’s support.
Amendment 1 agreed to.
Amendments made: 2, in clause 19, page 35, line 26, leave out from ‘body’ to ‘, the’ in line 27 and insert ‘does so’.
This amendment is consequential on Amendment 1.
Amendment 3, in clause 19, page 35, line 28, leave out ‘draft’.
This amendment is consequential on Amendment 2.
Amendment 4, in clause 19, page 35, line 33, leave out from ‘conduct’ to the end of line 34 and insert—
‘that is for the time being approved under this section as they apply in relation to a code’.—(Sir John Whittingdale.)
This amendment makes clear that the Commissioner’s duty under new section 68A of the Data Protection Act 2018 to consider whether to approve amendments of codes of conduct relates only to amendments of codes that are for the time being approved under that section.
Clause 19, as amended, ordered to stand part of the Bill.
Clause 20
Obligations of controllers and processors: consequential amendments
Question proposed, That the clause stand part of the Bill.
With this it will be convenient to consider the following:
Government amendments 42 and 43.
That schedule 4 be the Fourth schedule to the Bill.
Government amendments 40 and 41.
As clauses 12 to 18 remove terms such as “data protection officer” and “data protection impact assessment” from the legislation, some consequential changes are required to other parts of the legislation where those terms are used. Clause 20 therefore introduces schedule 4, which sets out the details of the consequential changes required. An example is article 13 of the UK GDPR, which currently requires controllers to provide individuals with the contact details of the data protection officer, where appropriate. In future, that provision will refer to the organisation’s senior responsible individual instead. Removal of the term “data protection officer” from the UK GDPR will have knock-on effects in other areas, including in relation to the types of people from whom the ICO receives requests and queries.
Government amendment 40 will provide that the commissioner may refuse to deal with vexatious or excessive requests made by any person, not just those made by data protection officers or data subjects. Government amendments 41 to 43 make further minor and technical changes to the provisions in schedule 4 to reflect the changes we have made to the terminology.
I have no comments to add on the consequential amendments in clause 20 beyond what has been discussed regarding the obligations on controllers and processors. With regard to Government amendments 40 to 43 and schedule 4, I will address the changes to the ICO’s powers to refuse requests when we come to them later in the Bill.
Question put and agreed to.
Clause 20 accordingly ordered to stand part of the Bill.
Schedule 4
Obligations of controllers and processors: consequential amendments
Amendments made: 42, in schedule 4, page 143, line 20, leave out ‘and section 135’.—(Sir John Whittingdale.)
This amendment is consequential on Amendment 40.
Amendment 43, in schedule 4, page 143, line 24, leave out paragraph 18.
This amendment is consequential on Amendment 40.
Schedule 4, as amended, agreed to.
Clause 21
Transfers of personal data to third countries and international organisations
Question proposed, That the clause stand part of the Bill.
With this it will be convenient to discuss the following:
Amendment 104, in schedule 5, page 144, line 28, at end insert—
‘4 All provisions in this Chapter must be applied in such a way as to ensure that the level of protection of natural persons guaranteed by this Regulation is not undermined.’
This amendment would reinsert into the new Article on general principles for international data transfers the principle that all provisions of this Chapter of the UK GDPR should be applied in such a way as to ensure that the level of protection of natural persons guaranteed by the Regulation is not undermined.
Government amendments 24 to 26.
That schedule 5 be the Fifth schedule to the Bill.
Government amendments 27 to 29.
That schedule 6 be the Sixth schedule to the Bill.
That schedule 7 be the Seventh schedule to the Bill.
Clause 21 refers to schedules 5 to 7, which introduce reforms to the provisions of the UK GDPR and the Data Protection Act 2018 that regulate international transfers of personal data. Schedule 5 introduces changes to the UK’s general processing regime for transferring personal data internationally. In order to provide a clearer structure than the current UK regime, schedule 5 consolidates the existing provisions on international transfers. It replaces article 44 with article 44A, setting out in clearer terms the general principles for international transfers and listing the same bases under which personal data can be lawfully transferred overseas.
Schedule 5 also introduces article 45A, which sets out the Secretary of State’s power to make regulations approving transfers of personal data to a third country or international organisation. The Government now use the term “data bridges” to refer to those regulations, which allow the free flow of personal data. Article 45A outlines that the Secretary of State may make such regulations only if they are satisfied that the data protection test is met. In addition to the requirement that the Secretary of State be satisfied that the data protection test is met, article 45A specifies that the Secretary of State may have regard to other matters that he or she considers relevant when making those regulations, including the desirability of facilitating transfers of personal data to and from the UK.
Article 45B sets out the data protection test that the Secretary of State must be satisfied is met in order to establish new data bridges. For a country or international organisation to meet the test, the standard of protection for personal data in that country or organisation must be “not materially lower” than the standard of protection under the UK’s data protection framework. The reformed law recognises that the Secretary of State must exercise their judgment when making a determination. Their assessment will be made with respect to the outcomes of data protection in a third country, instead of being prescriptive about the form and means of protection, recognising that no two data protection regimes are identical.
The article also sets out a more concise and streamlined list of key factors that the Secretary of State must consider as part of their assessment. However, article 45B(2) is a non-exhaustive list, and the Secretary of State may also need to consider other matters in order to determine whether the required standard of protection exists.
Article 45C amends the system for formally reviewing data bridge regulations, removing the requirement for them to be reviewed periodically. The Secretary of State will still be subject to the requirement to monitor developments in other countries on an ongoing basis. Schedule 5 also amends article 46, which sets out the rules for controllers and processors to make international transfers of personal data using alternative transfer mechanisms.
The new article 46 requirements are tailored for data exporters transferring defined types of data in specific circumstances. They stipulate that the data exporter, acting reasonably and proportionately, must consider that the standard of protection provided for the data subject would be “not materially lower” than the standard of protection in the UK in the specific circumstances of the transfer. The new requirements accommodate disparities between data exporters: what is right for a multinational organisation transferring large volumes of sensitive data may not be right for a small charity making ad hoc transfers.
Schedule 5 also introduces article 47A, which provides a power for the Secretary of State to create or recognise new UK and non-UK alternative transfer mechanisms. The new power will help to future-proof the UK’s international transfers regime by allowing the Government to shape international developments and react quickly to global trends, helping UK businesses connect and trade with their partners around the world.
Schedule 6 amends relevant parts of the Data Protection Act 2018 governing international transfers of personal data, which are governed by the law enforcement processing regime. Paragraph 4 omits the section governing transfers based on adequacy assessments and inserts a new provision to mirror the approach being adopted in schedule 5. As with the changes described in schedule 5, schedule 6 amends the power in new section 74AA for the Secretary of State to make regulations approving transfers of personal data to another jurisdiction. It replaces the current list of considerations with a broader, non-exhaustive one. The schedule also clarifies the test found in new section 74AB that must be applied when regulations are made, giving greater clarity to the UK regulations decision-making process.
The Minister is being very courteous and generous, and he makes a very sensible suggestion. Will he respond to amendment 104 after the Opposition have spoken to it?
It would make sense to explain the reasons why we are not convinced after we have heard the arguments in favour.
I am grateful to the Minister, and I will focus my remarks particularly on the contents of schedule 5 before explaining the thought process behind amendment 104.
In the globalised world in which we live, we have an obligation to be outward looking and to consider not just the activities that take place in the UK, but those that occur worldwide. When it comes to data protection, that means accepting that data will likely need to travel across borders, and inserting appropriate safeguards so that UK citizens do not lose the protection of data protection laws if their personal data is transferred away from this country. The standard of those safeguards is absolutely crucial to the integrity of our entire data protection regime. After all, if a controller can simply send the personal data of UK citizens for processing that would be unlawful here to a country that has limited data protection laws, and can transfer that data back afterwards, then in reality our laws are only as strong as those of the country with the weakest protections in the world.
As things stand, there is only a limited set of circumstances in which personal data can be transferred to a third party outside the UK. One such circumstance is where there is an adequacy agreement, similar to the one that we have with the EU. For such an agreement to be reached, the Secretary of State must have considered many things, including the receiving country’s respect for human rights and data rules; the presence, or lack thereof, of a regulator, and its independence; and any international commitments it has made in relation to data protection. These agreements ensure that data can flow freely between the UK and another country as long as the level of protection received by citizens is not undermined by the regulatory structure in that country.
The Bill amends the adequacy-based framework and replaces it with a new outcomes-based approach through the data protection test. The test is met if the standard of the protection provided for data subjects, with regard to the general processing of personal data in the country or by the organisation, is not materially lower than the standard of protection under the UK GDPR and relevant parts of the DPA 2018.
When deciding whether the test is met, the Secretary of State must still consider many of the same things: the country’s respect for human rights, the existence of a regulator, and international obligations. However, stakeholders such as Reset.tech and the TUC have expressed concern that the new test could mean that UK data is transferred to countries with lower standards of protection than previously. That is significant not just for data subjects in the UK, who may be faced with weaker rights, but for business, which fears that this may signify a divergence from the EU GDPR that could threaten the UK’s own adequacy status. Losing this agreement would have real-world consequences for UK consumers and businesses to the tune of hundreds of millions of pounds. What conversations has the Minister had with representatives of the European Commission to ensure that the new data protection test does not threaten adequacy? Does he expect the new data protection test to result in the data of UK citizens being passed to countries with weaker standards than are allowed under the current regime?
Moving on to amendment 104, one reason why some stakeholders are expressing concern about the new rules is that they appear to omit article 44. As it stands, for those who are concerned about the level of data protection available to them as a result of international transfers, article 44 of the UK GDPR provides a guarantee that the integrity of the UK’s data protection laws will be protected. Indeed, it sets out that all provisions relating to the international transfer of UK personal data
“shall be applied in order to ensure that the level of protection of natural persons guaranteed by this Regulation is not undermined.”
If UK data will not be transferred to countries with weaker protections, it is not clear why this simple guarantee would be removed. The amendment would clear up any confusion around that and reinsert the article so that data subjects can be reassured of the strength of this new data protection test and of their rights.
Again, it is important to emphasise that getting the clause right is absolutely essential, as it underpins the entire data protection regime in the country. Getting it wrong could cost a huge amount, rendering the Bill, the UK GDPR and the Data Protection Act 2018 essentially useless. It is likely that the Government do not intend to undermine their own regulatory framework. Reinserting the article would confirm that in the Bill, offering complete clarity that the new data protection test will not result in lower levels of protection for UK data subjects.
We completely agree with the hon. Lady that we would not wish to see data transferred to countries that have an inferior data protection regime. However, we do not think amendment 104 is required to achieve that, because the reforms in chapter 5 already provide for a clear and high standard of protection when transferring personal data overseas. The new test states that the standard of protection in that country must not be “materially lower” than the standard under the UK GDPR. That ensures that high standards of data protection are maintained. In addition, we feel that the amendment would return us to the confusion of the existing regime. At present, the legislative framework makes it difficult for organisations and others to understand what standard needs to be applied when transferring personal data internationally, with several different terms used in the chapter and in case law. Our reforms ensure that a clear standard applies, which maintains protection for personal data.
The hon. Lady raised the EU’s data adequacy assessment. That is something that featured earlier in our debates on the Bill, and, as we heard from a number of our witnesses, including the Information Commissioner, there is no reason to believe that this in any way jeopardises the EU’s assessment of the UK’s data adequacy.
Government amendment 24 revises new article 45B(3)(c) of the UK GDPR, which is inserted by schedule 5 and which makes provision about the data protection test that must be satisfied for data bridge regulations to be made. An amendment to the Bill is required for the Secretary of State to retain the flexibility to make data bridge regulations covering transfers from the UK or elsewhere. The amendment will preserve the status quo under the current regime, in which the Secretary of State’s power is not limited to covering only transfers from the UK. In addition to these amendments, four other minor and technical Government amendments—25, 26, 28 and 29—were tabled on 10 May.
Question put and agreed to.
Clause 21 accordingly ordered to stand part of the Bill.
Schedule 5
Transfers of personal data to third countries etc: general processing
Amendments made: 24, in schedule 5, page 147, line 3, leave out “from the United Kingdom” and insert
“to the country or organisation by means of processing to which this Regulation applies as described in Article 3”.
New Article 45B(3)(c) of the UK GDPR explains how references to processing of personal data in a third country should be read (in the data protection test for regulations approving international transfers of personal data). This amendment changes a reference to data transferred from the United Kingdom to include certain data transferred from outside the United Kingdom.
Amendment 25, in schedule 5, page 147, line 12, leave out
“the transfer of personal data”
and insert “transfer”.
This amendment and Amendment 26 simplify the wording in new Article 45B(4)(b) of the UK GDPR.
Amendment 26, in schedule 5, page 147, line 14, leave out
“the transfer of personal data”
and insert “transfer”.—(Sir John Whittingdale.)
See the explanatory statement for Amendment 25.
Schedule 5, as amended, agreed to.
Schedule 6
Transfers of personal data to third countries etc: law enforcement processing
Amendments made: 27, in schedule 6, page 155, line 39, leave out “from the United Kingdom” and insert—
“to the country or organisation by means of processing to which this Act applies as described in section 207(2)”.
New section 74AB(3)(c) of the Data Protection Act 2018 explains how references to processing of personal data in a third country should be read (in the data protection test for regulations approving international transfers of personal data). This amendment changes a reference to data transferred from the United Kingdom to include certain data transferred from outside the United Kingdom.
Amendment 28, in schedule 6, page 156, line 6, leave out
“the transfer of personal data”
and insert “transfer”.
This amendment and Amendment 29 simplify the wording in new section 74AB(4)(b) of the Data Protection Act 2018.
Amendment 29, in schedule 6, page 156, line 8, leave out
“the transfer of personal data”
and insert “transfer”.—(Sir John Whittingdale.)
See the explanatory statement for Amendment 28.
Schedule 6, as amended, agreed to.
Schedule 7 agreed to.
Clause 22
Safeguards for processing for research etc purposes
I beg to move amendment 34, in clause 22, page 36, leave out lines 20 to 22.
This amendment and Amendment 37 transpose the requirement for processing of personal data for research, archiving and statistical purposes to be carried out subject to appropriate safeguards from the beginning to the end of new Article 84B of the UK GDPR.
With this it will be convenient to discuss the following:
Government amendments 35 to 39.
Clause stand part.
Clause 23 stand part.
Clause 22 creates a new chapter in the UK GDPR that provides safeguards for the processing of personal data for the purposes of scientific research or historical research, archiving in the public interest, and for statistical purposes. Currently, the provisions that provide safeguards for those purposes are spread across the UK GDPR and the Data Protection Act 2018.
Clause 22 consolidates those safeguards in a new chapter 8A of the UK GDPR. Those safeguards ensure that the processing of personal data for research, archiving and statistical purposes does not cause substantial damage or substantial distress and that appropriate technical and organisational measures are in place to respect data minimisation. Clause 23 sets out consequential changes to the UK GDPR and Data Protection Act 2018 required as a result of the changes being made in clause 22 to consolidate safeguards for research.
Government amendments 34 to 39 are minor, technical amendments clarifying that, as part of the pre-existing additional requirement when processing for research, archiving and statistical purposes, a controller is to use anonymous—rather than personal—data, unless that means that those purposes cannot be fulfilled. They make clear that processing to anonymise the personal data is permitted. On that basis, I commend the clauses, and indeed the Government amendments, to the Committee.
With regard to clause 22, it is pleasing to see a clause confirming the safeguards that apply when processing for the new research and scientific purposes. For example, it is welcome that it is set out that such processing must not cause substantial damage or distress to a data subject, must respect the principle of data minimisation and must not make decisions related to a particular data subject unless it is for approved medical research.
Those safeguards are especially important given the concerns that I laid out over the definition of scientific research in clause 2, which could lead to the abuse of data under the guise of legitimate research. I have no further comments on the clause or the Government’s amendments to it at this stage, other than to reiterate that the definition of scientific research must have clear boundaries if any of the clauses that concern research are to be used as intended.
Clause 23 makes changes consequential on those in clause 22, so I refer to the substance of my remarks during the discussion of the previous clause.
Amendment 34 agreed to.
With this it will be convenient to discuss the following:
Amendment 105, in clause 25, page 44, line 6, leave out “must consult the Commissioner” and insert
“must apply to the Commissioner for authorisation of the designation notice on the grounds that it satisfies subsection (1)(b).”
This amendment seeks to increase independent oversight of designation notices by replacing the requirement to consult the Commissioner with a requirement to seek the approval of the Commissioner.
Clauses 25 and 26 stand part.
Clause 24 introduces an exemption that can be applied to the processing of personal data for law enforcement purposes under the law enforcement regime for the purposes of safeguarding national security. It will replace the current, more limited national security exemptions that exist in the law enforcement regime and mirror the existing exemptions in the UK GDPR and intelligence services regime.
The clause will allow organisations to exempt themselves from specified provisions in the law enforcement regime of the Data Protection Act 2018, such as some of the data protection principles and the rights of the individual, but only where it is necessary to do so for the purposes of safeguarding national security. Like the other exemptions in the Act, it must be applied on a case-by-case basis. There are limits to what the exemption applies to. The processing of data by law enforcement authorities must always be lawful, and the protections surrounding sensitive processing remain.
Subsection (2) amends the general processing regime of the Data Protection Act, regarding processing under the UK GDPR, to remove the ability of organisations to exempt themselves, on the grounds of safeguarding national security, from article 77 of the UK GDPR, which provides the right for individuals to lodge a complaint with the Information Commissioner. That is because we do not consider exemption from that provision necessary. The change will align the national security exemption applicable to UK GDPR processing with the other national security exemptions in the Data Protection Act 2018, which do not permit the exemption to be applied in relation to an individual’s right to complain to the Commissioner.
The ability of a Minister of the Crown to issue a certificate certifying the application of the exemption for the purposes of safeguarding national security, which previously existed, is retained; clause 24(8) simply updates that provision to reflect the new exemption. That change will assist closer working between organisations operating under the three distinct data protection regimes by providing greater confidence that data that, for example, may be of importance to a police investigation but also pertinent to a separate national security operation can be properly safeguarded by both organisations. I will allow the hon. Member for Barnsley East to speak to amendment 105, because I wish to respond to her.
I am grateful to the Minister. I want to speak today about a concern that has been raised about clauses 24, 25 and 26, so I will address them before speaking to amendment 105.
In essence, the clauses increase the opportunities for competent authorities to operate in darkness when it comes to personal data, through both national security certificates and designation notices. Though it may of course be important in some cases to adjust data protection regulation in a minimal way to protect national security or facilitate working with the intelligence services, it is important too that we retain the right to understand how any competent authority is processing our personal data—particularly given the growing mistrust around police culture.
To cite one stark example of why data transparency in law enforcement is important, after Sarah Everard was murdered, more than 30 police officers were reportedly investigated for unnecessarily looking up her personal data. First, that demonstrates that there is a temptation for officers to access personal data without due reason, perhaps particularly when it is related to a high-profile case. Secondly, however, it shows that transparency does hold people accountable. Indeed, thankfully, the individuals who were accused of accessing the data were swiftly investigated. That would not have been possible if that transparency had been restricted—for example, had there been a national security certificate or a designation notice in place.
The powers to apply for the certificates and notices that allow the police and law enforcement authorities exemptions from data protection, although sometimes needed, must be used extremely sparingly and must be proportionate to the need to protect national security. However, that proportionate approach does not appear to be guaranteed in the Bill, despite it being a requirement in human rights law.
In their oral and written evidence, representatives from Rights and Security International warned that clauses 24 to 26 could actually violate the UK’s obligations under the Human Rights Act 1998 and the European convention on human rights. Everything that the UK does, including in the name of national security or of the intelligence services, must comply with human rights law and the ECHR. That means that any time there is interference with the privacy of people in the UK—which is considered a fundamental right—for it to be lawful, the law in question must do only what is truly necessary for national security. That necessity standard is a high one, and it does not take into account whether a change might be more convenient for a competent authority.
Will the Minister clearly explain in what way the potential powers given to law enforcement under clauses 24 to 26, in both national security certificates and designation notices, would be strictly proportionate and necessary for national security, rather than simply making the operations of law enforcement easier and more convenient?
Primarily, the concern is for those whose data could be used in a way that fundamentally infringes on their privacy, but there are practical concerns too. Any clauses suspected of violating human rights could expose the Government to lengthy legal battles, both in the UK and at the European Court of Human Rights, about their data protection and surveillance regimes. Furthermore, any harm to the UK’s important relationships with the EU around data could threaten the adequacy agreement which, as we have all repeatedly heard, is vital to our economy.
It is vital, then, that the Minister confirms that both national security certificates and designation notices will be used, and exemptions allowed, only where necessary. If that cannot be satisfied, we must oppose the clauses.
I will now focus on amendment 105. Where powers are available to provide exemptions to privacy protections on grounds of national security, it is important that they are protected from exploitation, and not unduly concentrated in any individual’s hands without appropriate checks and balances. However, Rights and Security International warned that that was not taken into appropriate consideration in clause 25. Instead, the power to issue designation notices has been concentrated almost entirely in the hands of the Secretary of State, with no accountability measures built in.
Designation notices allow for joint processing between a qualifying competent authority and the intelligence services, which could have greatly beneficial consequences for tackling crime and threats to our national security, but they will also allow for both those parties to be exempt from what are usually crucial data protections. They must therefore be used sparingly, and only when necessary and proportionate.
As we have seen—and as I will argue countless times—we cannot rely on the Secretary of State’s acting in good faith. Our legislation must instead protect against a Secretary of State who acts in bad faith. Neither can we rely on the Secretary of State having the level of expertise needed to make complex and technical decisions, especially those that impact on national security and data rights at the same time.
Despite that, under clause 25(2), the Secretary of State alone can specify which competent authorities qualify as able to apply for a designation notice. Under subsection (3), it is the Secretary of State alone to whom qualifying competent authorities will jointly apply. It is the Secretary of State who reviews a notice and has the power to withdraw it, and it is the Secretary of State who makes transitional arrangements.
Although there is a requirement in the Bill to consult the commissioner, the amendment seeks to formalise some independent oversight of the designation process by ensuring that the commissioner has an actual say in approving the notices, adjusting the concentration of power so that it does not lie solely in the Secretary of State’s hands. That would mean that should the Secretary of State act in bad faith, or lack the expertise needed to make such a decision—whether aware or unaware of that fact—the commissioner would be able to help to ensure that an informed and proportionate decision was made with regard to each notice applied for. That would not prevent any designation notice from being issued where it was genuinely necessary; it would simply safeguard the process by which notices are approved.
I assure the hon. Lady that clauses 25 and 26 are necessary to improve the safeguarding of national security. The reports on events such as the Manchester and Fishmongers’ Hall terrorist incidents have demonstrated that better joined-up working between the intelligence services and law enforcement is in the public interest to safeguard national security. A current barrier to such effective joint working is that only the intelligence services can operate under part 4 of the Data Protection Act, which is drafted to reflect the unique operational nature of their processing.
Of course, the reports on incidents such as those at Fishmongers’ Hall and the Manchester Arena pointed to a general lack of effective collaboration between security forces and the police. It was not data that was the issue; it was collaboration.
I certainly accept that greater collaboration would have been beneficial as well, but there was a problem with data sharing and that is what the clause is designed to address.
As the hon. Member for Barnsley East will know, law enforcement currently operates under part 3 of the Data Protection Act when processing data for law enforcement purposes. That means that even when they work together, law enforcement and the intelligence services must each undertake separate assessments regarding the same joint-working processing.
Order. I am making a habit of interrupting the Minister—I do apologise—but we have some news from the Whip.
Ordered, That the debate be now adjourned.—(Steve Double.)
(1 year, 6 months ago)
Public Bill Committees
I remind the Committee that with this we are discussing the following:
Amendment 105, in clause 25, page 44, line 6, leave out “must consult the Commissioner” and insert
“must apply to the Commissioner for authorisation of the designation notice on the grounds that it satisfies subsection (1)(b).”
This amendment seeks to increase independent oversight of designation notices by replacing the requirement to consult the Commissioner with a requirement to seek the approval of the Commissioner.
Clauses 25 and 26 stand part.
When the Committee last adjourned, I had already spoken to clauses 24 to 26 and was responding to amendment 105, which was tabled by the hon. Member for Barnsley East. However, let me give a quick recap.
Clauses 24 to 26 are essentially designed to enable better joined-up working between the intelligence services and law enforcement. To that end, they will allow qualifying authorities to use part 4 of the data protection regime, but the Secretary of State will be required to issue a designation notice. We believe that enabling qualifying competent authorities to jointly process data under one regime in authorised, specific circumstances will allow better control over data in a way that is not possible under two different data protection regimes.
Amendment 105 seeks to increase the role of the Information Commissioner’s Office by requiring it to judge whether the designation notice is required for the purposes of safeguarding national security. The Bill requires the Secretary of State to consult the ICO as part of the Secretary of State’s decision whether to grant a notice, but it is not the function of the ICO in its capacity as a regulator to assess national security requirements. The ICO’s expertise is in data protection, not in national security, and it would be inappropriate for it to decide on the latter; that decision should be reserved to the Secretary of State. We believe that clause 25 provides significant safeguards through proposed new sections 82B and 82E, which provide respectively for legal challenge and annual review of a notice. In addition, should the notice no longer be required, the Secretary of State can withdraw it. For that reason, we cannot accept the amendment.
I spoke to amendment 105 in our last sitting. In summary, the Bill contains a requirement to consult the commissioner. The amendment seeks to formalise some of the independent oversight of the designation notice process so that the power does not lie solely in the Secretary of State’s hands. The matter of the Secretary of State’s power is obviously something with which we take issue throughout the Bill. The amendment would not stop any designation notice being issued where it is genuinely necessary; it would simply add a safeguard against approval where it is not. For that reason, I will press the amendment to a vote.
Question put and agreed to.
Clause 24 accordingly ordered to stand part of the Bill.
Clause 25
Joint processing by intelligence services and competent authorities
Amendment proposed: 105, in clause 25, page 44, line 6, leave out “must consult the Commissioner” and insert
“must apply to the Commissioner for authorisation of the designation notice on the grounds that it satisfies subsection (1)(b).”—(Stephanie Peacock.)
This amendment seeks to increase independent oversight of designation notices by replacing the requirement to consult the Commissioner with a requirement to seek the approval of the Commissioner.
Question put, That the amendment be made.
We now come to the provisions in the Bill relating to the powers of the Information Commissioner. Clause 27 will introduce a new strategic framework for the Information Commissioner when carrying out his functions under data protection legislation. The framework contains a principal data protection objective and a number of general duties.
The legislation does not currently provide the commissioner with a framework of strategic objectives to help to prioritise activities and resources, evaluate performance and be held accountable by stakeholders. Instead, the commissioner is obliged to fulfil a long list of tasks and functions without a clear strategic framework to guide his work.
The clause introduces a principal objective for the commissioner, first to secure an appropriate level of protection for personal data, taking into account the interests of data subjects, controllers and others along with matters of general public interest, and secondly to promote public trust and confidence in the processing of personal data. This principal objective will replace section 2(2) of the Data Protection Act 2018.
How does the Minister think the words
“an appropriate level of protection for personal data”
should be understood by the Information Commissioner? Is it in the light of the duties that follow, or what?
Obviously that is a matter for the Information Commissioner, but that is the overriding principal objective. I am about to set out some of the other objectives that the clause will introduce, but it is made very clear that the principal objective is to ensure the appropriate level of protection. Precisely how the Information Commissioner interprets “appropriate level of protection” is a matter for him, but I think it is fairly clear what that should entail, as he himself set out in his evidence.
As I have said, clause 27 introduces new duties that the commissioner must consider where they are relevant to his work in carrying out data protection functions: the desirability of promoting innovation and competition; the importance of the prevention, investigation, detection and prosecution of criminal offences; the need to safeguard public security and national security; and, where necessary, the need to consult other regulators when considering how the ICO’s work may affect economic growth, innovation and competition. There is also the statement of strategic priorities, which is introduced by clause 28. However, as I have indicated to the hon. Member for Newcastle upon Tyne Central, the commissioner will be clear that his primary focus should be to achieve the principal objective.
Clause 27 also introduces new reporting requirements for the commissioner in relation to the strategic framework. The commissioner will be required to publish a forward-looking strategy outlining how he intends to meet the new principal objective and duties, as well as pre-existing duties in the Deregulation Act 2015 and the Legislative and Regulatory Reform Act 2006.
Finally, the commissioner will be required to publish a review of what he has done to comply with the principal objective, and with the new and existing duties, in his annual report.
I wonder whether part of the strategy might include a list of fees that could potentially be charged for accessing data. This idea of fees seems to be quite vague in terms of amounts and levels, so it would be useful to have some more information on that.
I think we will come on to some of the questions around the fees that are potentially payable, particularly by those organisations that may be required to provide more evidence, and the costs that that could entail. I will return to that subject shortly.
The new strategic framework acknowledges the breadth of the ICO’s remit and its impact on other areas. We believe that it will provide clarity for the commissioner, businesses and the general public on the commissioner’s objectives and duties. I therefore commend clause 27 to the Committee.
The importance to any data protection regime of an independent, well-functioning regulator cannot be overstated. The ICO, which is soon to be the Information Commission as a result of this Bill, is no exception to that rule. It is a crucial piece of the puzzle in our regime to uphold the information rights set out in regulation. Importantly, it works in the interests of the general public. The significance of an independent regulator is also recognised by the European Commission, which deems it essential to any adequacy agreement. The general duties of our regulator, such as those set out in this clause, are therefore vital because they form the foundations on which it operates and the principles to which it must be accountable.
Although the duties are more an indicator of overarching direction than a prescriptive list of duties, they should still aim to reflect the wide range of tasks that the regulator carries out and the values with which they do so. On the whole, the clause does this well. Indeed, the principal objective for the commissioner set out in this clause, which is
“to secure an appropriate level of protection for personal data, having regard to the interests of data subjects, controllers and others and matters of general public interest, and…to promote public trust and confidence in the processing of personal data”
is a good overarching starting point. It simply outlines the basic functions of the regulator that we should all be able to get behind, even if the Bill itself does disappointingly little to encourage the promotion of public trust in data processing.
It is particularly welcome that the principal objective includes specific regard to
“matters of general public interest.”
This should cover things like the need to consider sustainability and societal impact. However, it is a shame that that is not made explicit among the sub-objectives, which require the commissioner to have regard to the likes of promoting innovation and safeguarding national security. That would have ingrained in our culture a desire to unlock data for the wider good, not just for the benefit of big tech. Overall, however, the responsibilities set out in the clause, and the need to report on fulfilling them, seem to reflect the task and value of the regulator fairly and accurately.
I think that was slightly qualified support for the clause. Nevertheless, we welcome the support of the Opposition.
Question put and agreed to.
Clause 27 accordingly ordered to stand part of the Bill.
Clause 28
Strategic priorities
Clause 28 provides a power for the Secretary of State to prepare a statement of strategic priorities relating to data protection as part of the new strategic framework for the Information Commissioner. The statement will contain only the Government’s data protection priorities, and the Secretary of State may choose to include both domestic and international priorities. That will enable the Government to provide a transparent statement of how their data protection priorities fit in with their wider agenda, giving the commissioner, we hope, helpful context.
Although the commissioner must take the statement into account when carrying out his functions, he is not required to act in accordance with it. That means that the statement will not be used in a way to direct what the commissioner may and may not do. Once the statement is drafted, the Secretary of State will be required to lay it before Parliament, where it will be subject to the negative resolution procedure before it can be designated. The commissioner will need to consider the statement when carrying out functions under the data protection legislation, except functions relating to a particular person, case or investigation.
Once designated, the commissioner will be required to respond to the statement, outlining how he intends to consider it in future data protection work. The commissioner will also be required to report on how he has considered the statement in his annual report. I commend the clause to the Committee.
Clause 28 requires that every three years the Secretary of State publish a statement of strategic priorities for the commissioner to consider, respond to and have regard to. The statement would be subject to the negative resolution procedure in Parliament, and the commissioner would be obliged to report annually on what they have done to comply with it. Taking the clause in good faith, I can see what it was intended to achieve. It is, of course, important that the Government’s data priorities are understood by the commissioner. It is also vital that we ensure that the regulator functions in line with the most relevant issues of the day, given the rapidly evolving landscape of technology.
A statement of strategic priorities could, in theory, allow the Government to set out their priorities on data policy in a transparent way, allowing both Ministers and the ICO to be held accountable for their relationship. However, there is and must be a line drawn between the ICO understanding the modern regulatory regime that it will be expected to uphold and political interference in the activities and priorities of the ICO. The Open Rights Group, among others, has expressed concern that the introduction of a statement of strategic priorities could cross that line, exposing the ICO to political direction, making it subject to culture wars and leaving it vulnerable to corporate capture or even corruption.
Although the degree to which those consequences would become a reality, given the current strength of our regulator, might be up for debate, the very concept of the Government setting out a statement of strategic priorities that must be adhered to by the commissioner at the very least creates a need for the ICO to follow some sort of politically led direction, which seems hard to reconcile with independence. As I have already argued, an independent ICO is vital not only directly, for data subjects to be sure that their rights will be implemented and for controllers to be sure of their obligations, but indirectly, as a crucial component of our EU adequacy agreement.
Even though the clause may not be intended to threaten independence, we must be extremely careful not to unintentionally embark on a slippery slope, particularly as there are other mechanisms for ensuring that the ICO keeps up with the times and has a transparent relationship with Government. In 2022, the ICO published its new strategic plan, ICO25, which sets out why its work is important, what it wants to be known for and by whom, and how it intends to achieve that by 2025. It describes the ICO’s purpose, objectives and values and the shift in approach that it aims to achieve through the life of the plan, acknowledging that its work is
“complex, fast moving and ever changing.”
The plan was informed by extensive stakeholder consultation and by the responsibilities that the ICO has been given by Parliament. There are therefore ways for the ICO to communicate openly with Government, Parliament and other relevant stakeholders to ensure that its direction is in keeping with the most relevant challenges and with updates to legislation and Government activity. Ministers might have been better off encouraging transparent reviews, consultations and strategies of that kind, rather than prompting any sort of interference from politicians with the ICO’s priorities.
We agree about the importance of the independence of the Information Commissioner, but I do not think that the statement, as we have set out, is an attempt to interfere with that. I remind the hon. Lady that in relation to the statement of strategic priorities, she asked the Information Commissioner himself:
“Do you perceive that having any impact on your organisation’s ability to act independently of political direction?”,
and he replied:
“No, I do not believe it will undermine our independence at all.”––[Official Report, Data Protection and Digital Information (No. 2) Public Bill Committee, 10 May 2023; c. 6, Q3.]
The Minister is right to quote the evidence session, but he will perhaps also remember that in a later session Ms Irvine from the Law Society of Scotland said that she was surprised by the answer given by the Information Commissioner.
Ms Irvine may have been surprised. I have to say that we were not. What the Information Commissioner said absolutely chimed with our view of the statement, so I am afraid on this occasion I will disagree with the Law Society of Scotland.
Question put, That the clause stand part of the Bill.
With this it will be convenient to discuss:
Clause 30 stand part.
Amendment 111, in clause 31, page 56, line 30, leave out lines 30 and 31 and insert—
“(6) If the Commissioner submits a revised code under subsection (5)(b), the Secretary of State must approve the code.”
This amendment seeks to limit the ability of the Secretary of State to require the Commissioner to provide a revised code to only one occasion, after which the Secretary of State must approve the revised code.
Clause 31 stand part.
Given the significant number of ways in which personal data can be used, we believe that it is important that the regulator provides guidance for data controllers, particularly on complex and technical areas of the law, and that the guidance should be accessible and enable compliance with the legislation efficiently and easily. We are therefore making a number of reforms to the process by which the Information Commissioner produces statutory codes of practice.
Clause 29 is a technical measure that ensures that all statutory codes of practice issued under the Data Protection Act 2018 follow the same parliamentary procedures, have the same legal effect, and are published and kept under review by the Information Commissioner. Under sections 121 to 124 of the Data Protection Act, the commissioner is obliged to publish four statutory codes of practice: the data sharing code, the direct marketing code, the age-appropriate design code, and the data protection and journalism code. The DPA includes provisions concerning the parliamentary approval process, requirements for publication and review by the commissioner, and details of the legal effect of each of the codes. So far, the commissioner has completed the data sharing code and the age-appropriate design code.
Section 128 of the Act permits the Secretary of State to make regulations requiring the Information Commissioner to prepare other codes that give guidance as to good practice in the processing of personal data. Those powers have not yet been used, but may be useful in the future. However, due to the current drafting of the provisions, any codes required by regulations made by the Secretary of State and issued by the commissioner would not be subject to the same formal parliamentary approval process or review requirements as the codes issued under sections 121 to 124. In addition, they would not have the same legal effect, and courts and tribunals would not be required to take a relevant provision of the code into account when determining a relevant question. Clearly, it is not appropriate to have two different standards of statutory codes of practice. To address that, clause 29 replaces the original section 128 with new section 124A, so that codes required in regulations made by the Secretary of State follow a similar procedure to codes issued under sections 121 to 124.
New section 124A provides the Secretary of State with the power to make regulations requiring the commissioner to produce codes of practice giving guidance as to good practice in the processing of personal data. Before preparing any code, the commissioner must consult the Secretary of State and other interested parties such as trade associations, data subjects and groups representing data subjects. That is similar to the consultation requirements for the existing codes. The parliamentary approval processes and requirements for the ICO to keep existing codes under review are also extended to any new codes required by the Secretary of State. The amendment also ensures that those codes requested by the Secretary of State have the same legal effect as those set out on the face of the DPA.
Clauses 30 and 31 introduce reforms to the process by which the commissioner develops statutory codes of practice for data protection. They require the commissioner to undertake and publish impact assessments, consult with a panel of experts during the development of a code, and submit the final version of a code to the Secretary of State for approval. Those processes will apply to the four statutory codes that the commissioner is already required to produce and to any new statutory codes on the processing of personal data that the commissioner is required to prepare under regulation made by the Secretary of State.
The commissioner will be required to set up and consult a panel of experts when drafting a statutory code. That panel will be made up of relevant stakeholders and, although the commissioner will have discretion over its membership, he or she will be required to explain how the panel was chosen. The panel will consider a draft of a statutory code and submit a report of its recommendations to the commissioner. The commissioner will be required to publish the panel’s response to the code and, if he chooses not to follow a recommendation, the reasons must also be published.
Clause 30 also requires the commissioner to publish impact assessments setting out who will be affected by the new or amended code and the impact it will have on them. While the commissioner currently carries out impact assessments when developing codes of practice, we believe that there are advantages to formalising an approach on the face of the legislation to ensure consistency.
Given the importance of the statutory codes, we believe it is important that there is a further degree of democratic accountability within the process. Therefore, clause 31 requires the commissioner to submit the final version of a statutory code to the Secretary of State for approval.
On that basis, I commend the relevant clauses to the Committee, but I am aware that the hon. Member for Barnsley East wishes to propose an amendment.
I turn first to clauses 29 and 30. Codes of practice will become increasingly important as the remit of the ICO expands and modernises. As such, it is important that the codes are developed in a way that is conducive to the product being as effective and useful as possible.
Although the ICO already carries out impact assessments for new codes of practice, it does so only as a matter of best practice; there is currently no statutory underpinning. It is therefore pleasing to see clauses that will require consistency and high standards when developing new codes, ensuring that the resulting products are as comprehensive and helpful as possible. It is welcome, for example, to see that experts will be consulted in the process of developing these codes, including Government officials, trade associations and data subjects. It is also good to see that the commissioner will be required to publish a statement relating to the establishment of the expert panel, including how and why members were selected.
I welcome the support of the Opposition for many of the principles contained in the clauses. I turn to amendment 111, tabled by the hon. Lady. As the clause sets out, once the commissioner has submitted the final version of the code, the Secretary of State decides whether to approve it. If they do approve the code, it will be laid before Parliament for final approval. If they do not, they are required to publish their reasons.
The amendment would place a limit on that, so that the Secretary of State would be able to reject the final version of the code only once. If the code is revised by the commissioner in the light of the comments of the Secretary of State and resubmitted, under the amendment the Secretary of State would have to lay the code before Parliament for final approval. Although I understand the concern behind the amendment, we do not believe it to be justified. I understand that the hon. Lady does not want a code to be rejected multiple times, but we regard this as a final safeguard, and it will be fully transparent. We are absolutely committed to maintaining the commissioner’s independence, but we think it also important that the Government have the opportunity to give a view before the code is laid before Parliament and for Parliament to give final approval. The amendment would unduly limit the Government’s ability to provide, where necessary, that further degree of democratic accountability.
The hon. Lady referred to the importance of maintaining adequacy, which we have already touched on. I fully share her view on its importance to the wider functioning of the economy, but when she raised the matter with the Information Commissioner he did not believe that it posed any risk. Indeed, he went on to point out:
“A failure of the Secretary of State to table and issue a proposed code would not affect the way in which the commissioner discharges his or her enforcement functions. We would still be able to investigate matters and find them in breach, regardless of whether that finding was consistent with the Secretary of State’s view of the law.”––[Official Report, Data Protection and Digital Information (No. 2) Public Bill Committee, 10 May 2023; c. 6-7, Q4.]
On that basis, we think that there should be the ongoing ability for the Secretary of State—and, through the Secretary of State, Parliament—to approve the final version of the code, but we do not feel that this interferes with the Information Commissioner’s ability to carry out his functions, nor does it represent any view as to our adequacy agreement.
The problem is that the Government are operating on the basis that everyone is acting in good faith, and although I am sure that the Minister and the current Secretary of State are doing so, we do not know what the future holds. It was incredibly encouraging that throughout the evidence sessions a number of witnesses said they did not feel that adequacy was at threat. That is welcome and reassuring, but only the EU Commission can give us adequacy. I am afraid the Minister simply has not done enough to alleviate my concerns about the independence of the ICO. I understand that the Minister disagrees with the Law Society of Scotland, but the full quote was:
“The ICO is tasked with producing statutory codes of conduct, which are incredibly useful for my clients and for anyone working in this sector. The fact that the Secretary of State can, in effect, overrule these is concerning, and it must be seen as a limit on the Information Commissioner’s independence.”––[Official Report, Data Protection and Digital Information (No. 2) Public Bill Committee, 10 May 2023; c. 74, Q156.]
As such, I will push my amendment to a vote.
Question put and agreed to.
Clause 29 accordingly ordered to stand part of the Bill.
Clause 30 ordered to stand part of the Bill.
Clause 31
Codes of practice: approval by the Secretary of State
Amendment proposed: 111, in clause 31, page 56, line 30, leave out lines 30 and 31 and insert—
“(6) If the Commissioner submits a revised code under subsection (5)(b), the Secretary of State must approve the code.”—(Stephanie Peacock.)
This amendment seeks to limit the ability of the Secretary of State to require the Commissioner to provide a revised code to only one occasion, after which the Secretary of State must approve the revised code.
Question put, That the amendment be made.
Taking advantage of your invitation, Mr Hollobone, I shall speak only briefly. The UK’s data protection framework allows a data subject or data protection officer to make a request to the Information Commissioner for information concerning the exercise of their data protection rights. The commissioner is expected to respond to a data subject or data protection officer and make no charge in the majority of cases, but the commissioner can refuse to respond or charge a reasonable fee for a response to a request when it is “manifestly unfounded or excessive”. Clause 7 changes the “manifestly unfounded or excessive” threshold for all requests from data subjects across the UK data protection framework to “vexatious or excessive”. Clause 32 replicates that language, inserting the same new threshold into section 135 of the Data Protection Act 2018, to ensure that the Information Commissioner’s exemption is consistent across the legislation. I urge the Committee to agree to the clause.
The new threshold contained in the clause has been discussed in debates under clause 7, and I refer hon. Members to my remarks in those debates, as many of the same concerns apply. The guidance that will be needed to interpret the terms “vexatious” and “excessive” should be no less applicable to the Information Commissioner, whose co-operation with data subjects and transparency should be exemplary, not least because the functioning of the regulator inherently sets an example for other organisations on how the rules should be followed.
Question put and agreed to.
Clause 32, as amended, accordingly ordered to stand part of the Bill.
Clause 33
Analysis of performance
Question proposed, That the clause stand part of the Bill.
Clause 33 introduces the requirement for the Information Commissioner to prepare and publish an analysis of their performance, using key performance indicators. The regulator will be required to publish that analysis at least annually. The commissioner will have the discretion to decide which factors effectively measure their performance.
Improving the commissioner’s monitoring and reporting mechanisms will strengthen their accountability to Parliament, organisations and the public, who have an interest in the commissioner’s effectiveness. Performance measurement will also have benefits for the commissioner, including by supporting their work of measuring progress towards their objectives and ensuring that resources are prioritised in the right areas. I urge that clause 33 stand part of the Bill.
I welcome the clause, as did the majority of respondents who supported the proposal in the “Data: a new direction” consultation. As recognised by the Government’s response to their consultation, respondents felt the proposal would allow for the performance of the ICO to be assessed publicly and provide evidence of how the ICO is meeting its statutory obligations. We should do all we can to promote accountability, transparency and public awareness of the obligations and performance of the ICO. The clause allows for just that.
Question put and agreed to.
Clause 33 accordingly ordered to stand part of the Bill.
Clause 34
Power of the Commissioner to require documents
Question proposed, That the clause stand part of the Bill.
With this it will be convenient to discuss the following:
Clauses 35 to 38 stand part.
Government amendment 47.
Clause 42 stand part.
This is a slightly chunkier set of clauses and amendments, so I will not be as brief as in the last two debates.
Clause 34 makes a clarificatory amendment to the Information Commissioner’s power under section 142 of the Data Protection Act to require information. It puts beyond doubt that the commissioner can require specific documents as well as information when using the information notice power. Subsections (3) to (7) of the clause make consequential amendments to references to information notices elsewhere in the Data Protection Act.
Clause 35 makes provision for the Information Commissioner to require a data controller or processor to commission a report from an approved person on a specified matter when exercising the power under section 146 of the Data Protection Act to issue an assessment notice. The aim of the power is to ensure that the regulator can access information necessary to its investigations.
In the event of a data breach, the commissioner is heavily dependent on the information that the organisation provides. If it fails to share information—for example, because it lacks the capability to provide it—that can limit the commissioner’s ability to conduct a thorough investigation. Of course, if the organisation is able to provide the necessary information, it is not expected that the power would be used. The commissioner is required to act proportionately, so we expect that the power would be used only in a small minority of investigations, likely to be those that are particularly complex and technical in nature.
Clause 36 grants the Information Commissioner the power to require a person to attend an interview and answer questions when investigating a suspected failure to comply with data protection legislation. At the moment, the Information Commissioner can only interview people who attend voluntarily, which means there is a heavy reliance on documentary evidence. Sometimes that is ambiguous or incomplete and can lead to uncertainty. The ability to require a person to attend an interview will help to explain an organisation’s practices or evidence submitted, and circumvent a protracted and potentially fruitless series of back-and-forth communication via information notices. The power is based on existing comparable powers for the Financial Conduct Authority and the Competition and Markets Authority.
Clause 37 amends the provisions for the Information Commissioner to impose penalties set out in the Data Protection Act. It will allow the commissioner more time, where needed, to issue a final penalty notice after issuing a notice of intent. At the moment the Act requires the commissioner to issue a notice of intent to issue a penalty notice; the commissioner then has up to six months to issue the penalty notice unless an extension is agreed. That can prove difficult in some cases—for instance, if the organisation under investigation submits new evidence that affects the case at a late stage, or when the legal representations are particularly complex. The clause allows the regulator more time to issue a final penalty notice after issuing a notice of intent, where that is needed. That will benefit business, as it means the commissioner can give organisations more time to prepare their representations, and will result in better outcomes by ensuring that the commissioner has sufficient time to assess representations and draw his conclusions.
Clause 38 introduces the requirement for the Information Commissioner to produce and publish an annual report on regulatory activity. The report will include the commissioner’s investigatory activity and how the regulator has exercised its enforcement powers. That will lead to greater transparency of the commissioner’s regulatory activity.
Clauses 34 to 37, as I said, make changes to the Data Protection Act 2018 in respect of the Information Commissioner’s enforcement powers. Consequential on clauses 35 and 36, clause 42 makes changes to the Electronic Identification and Trust Services for Electronic Transactions Regulations 2016, known as the EITSET regulations. The EITSET regulations extend and modify the Information Commissioner’s enforcement powers to apply to its role as the supervisory body for trust service providers under the UK regulations on electronic identification and trust services for electronic transactions, known as the UK eIDAS. Clause 42 amends the EITSET regulations to ensure that the new enforcement powers introduced by clauses 34 to 37 are available to the Information Commissioner for the purposes of regulating trust service providers.
The new powers will help to ensure that the Information Commissioner is able to access the evidence needed to inform investigations. The powers will result in more informed investigations and, we believe, better outcomes. Clause 42 ensures that the Information Commissioner will continue to be able to act as an effective supervisory body for trust service providers established in the UK.
Government amendment 47 amends schedule 2 to the EITSET regulations. The amendment is consequential on the amendment of section 155(3)(c) of the Data Protection Act made by schedule 4 to the Bill. It will remove the reference to consultation under section 65 of the Data Protection Act when section 155 is applied with modification under schedule 2, as the consultation requirements under that section are not relevant to the regulation of trust service providers under the UK eIDAS.
I hope that that is helpful to Members in explaining the merits of our approach to ensuring that the Information Commissioner has the right enforcement tools at its disposal and continues to be an effective and transparent regulator. I commend the clauses and Government amendment 47 to the Committee.
I will speak to each of the relevant clauses in turn. On clause 34, I am satisfied that the clarification that the Information Commissioner can require documents as well as information is necessary and will be of use to the regulator. I am therefore pleased to accept the clause as drafted and to move on to the other clauses in this part.
Clause 35 provides for the commissioner to require an approved person to prepare a report on a specified matter, as well as to provide statutory guidance on, first, the factors it considers when deciding to require such a report and, secondly, the factors it considers when determining who the approved person might be. That power to commission technical reports is one that the vast majority of respondents to the “Data: a new direction” consultation supported, as they felt it would lead to better informed ICO investigations. Any measures that help the ICO to carry out its duties rigorously and to better effect, while ensuring that relevant safeguards apply, are measures that I believe Members across the Committee will want to support.
In the consultation, however, the power was originally framed as a power to commission a “technical report”, implying that it would be limited to particularly complex and technical investigations where there is significant risk of harm or detriment to data subjects. Although the commissioner is required to produce guidance on the circumstances in which a report might be required, I would still like clarification from the Minister of why such a limit was not included in the Bill as drafted. Does he expect it to be covered by the guidance produced by the ICO? Such clarification is necessary not because we are against clause 35 in principle, but in acknowledgement that the ICO’s powers—indeed, enforcement powers generally—must always be proportionate to the task at hand.
Furthermore, some stakeholders have said that it is unclear whether privilege will attach to reports required by the ICO and whether they may be disclosable to third parties who request copies of them. Greater clarity about how the power will operate in practice would therefore be appreciated.
Turning to clause 36, it is a core function of the ICO to monitor and enforce the UK’s data protection legislation and rules, providing accountability for the activities of all controllers, processors and individuals. To fulfil that function, the ICO may have to conduct an investigation to establish a body of evidence and determine whether someone has failed to comply with the legislation. The Government’s consultation document said that the ICO sometimes faces problems engaging organisations in those investigations, despite their having a duty to co-operate fully, especially in relation to interviews, as many people are nervous of negative consequences in their life or career if they participate in one. However, interviews are a crucial tool for investigations, as not all the relevant evidence will be available in written form. Indeed, that may become even more the case after the passing of this Bill, owing to the reduced requirements to keep records, conduct data protection impact assessments and assign data protection officers—all of which currently contribute to a larger pool of documentation tracking data processing.
Clause 36, which will explicitly allow the ICO to compel witnesses to comply with interviews as part of an investigation, will, where necessary, ensure that as much relevant evidence as possible is obtained to inform the ICO’s judgment. That is something that we absolutely welcome. It is also welcome to see the safeguards that will be put in place under this clause, including the right not to self-incriminate and exemptions from giving answers that would infringe legal professional privilege or parliamentary privilege. That will ensure that the investigatory powers of the ICO stay proportionate to the issues at hand. In short, clause 36 is one that I am happy to support. After all, what is the purpose of us ensuring that data protection legislation is fit for purpose here today if the ICO is unable to actually determine whether anyone is complying?
On clause 37, it seems entirely reasonable that the ICO may require more than the standard six months to issue a penalty notice in particularly complex investigations. Of course, it remains important that the operations of the ICO are not allowed to slow unduly in cases where a penalty can be issued in the usual timeframe, but where the subject matter is particularly complicated, it makes sense to allow the ICO an extension to enable the investigation to be concluded in a proper and comprehensive manner. Indeed, complex investigations may become more common as we adjust to the new data legislation and a rapidly evolving technological landscape. If investigations are conducted properly, with due attention paid to particularly technical issues, new precedents can be set that will speed up the regulator’s processes on the whole. We therefore welcome clause 37, as did the majority of respondents to the Government’s consultation.
Turning to clause 38, as we have said multiple times throughout the progress of this Bill and in Committee, transparency and data protection should go hand in hand. Requiring the ICO to publish information each year on the investigations it has undertaken and the powers it has used will embed a further level of transparency into the regulatory system. Transparency breeds accountability, and requiring the regulator to publish information on the powers it is using will encourage such powers to be used proportionately and appropriately. Publishing an annual report with that information should also give us a better idea of how effectively the new regulatory regime is working. For example, a high volume of cases on a recurring issue could indicate a problem within the framework that needs addressing. Overall, it is welcome that Parliament and the public should be privy to information about how the ICO is discharging its regulatory functions. As a result, I am pleased to support clause 38.
Finally, the amendments to clause 42 are of a consequential nature, and I am happy to proceed without asking any further questions about them.
I am most grateful to the hon. Lady for welcoming the vast majority of the provisions within these clauses. She did express some concern about the breadth of the powers available to the Information Commissioner, but I point out that they are subject to a number of safeguards defining how they can be used. The commissioner is required to publish how he will exercise his powers, and that will provide organisations with clarity on the circumstances in which they are to be used.
As the hon. Lady will be aware, like other regulators, the Information Commissioner is subject to the duty under the Legislative and Regulatory Reform Act to exercise their functions
“in a way which is transparent, accountable, proportionate and consistent”,
and,
“targeted only at cases in which action is needed.”
There will also be a right of appeal, which is consistent with the commissioner’s existing powers. On that basis, I hope that the hon. Lady is reassured.
Question put and agreed to.
Clause 34 accordingly ordered to stand part of the Bill.
Clauses 35 to 38 ordered to stand part of the Bill.
Clause 39
Complaints to controllers
Question proposed, That the clause stand part of the Bill.
With this it will be convenient to discuss the following:
Clauses 40 and 41 stand part.
That schedule 8 be the Eighth schedule to the Bill.
These three clauses, together with schedule 8, streamline and clarify complaint routes for data subjects by making the respective rights and responsibilities of data controllers and data subjects clear in legislation. The measures will reduce the volume of premature complaints to the Information Commissioner, and give controllers an opportunity to resolve complaints before they are escalated to the regulator.
Clause 39 enables data subjects to complain to a data controller if they believe that there has been an infringement of their data protection rights, and creates a duty for data controllers to facilitate the making of complaints by taking appropriate steps, such as providing a complaints form. The requirement will encourage better conversations and more dialogue between data subjects and data controllers. It will formalise best practice, and align with the standard procedures of other ombudsman services, which require complainants to seek to resolve an issue with the relevant organisation before escalation. The clause also introduces a regulation-making power for the Secretary of State to require controllers to notify the Information Commissioner of the number of complaints made to them in circumstances specified in the regulations.
Clause 40 provides the Information Commissioner with a new power to refuse to act on certain data protection complaints if certain conditions are met, specifically if the complaint has not been made to the relevant controller; the controller has not finished handling the complaint and less than 45 days have elapsed since it was made; or the complaint is considered vexatious or excessive, as defined in the Bill. For example, that could be the case with a complaint that repeats a previous complaint made by the data subject to the commissioner. The power is in addition to the discretion that the commissioner can already exercise to “take appropriate steps” to respond to a complaint and investigate it “to the extent appropriate.” The clause requires the Information Commissioner to publish guidance about how it will respond to complaints and exercise its power to refuse to act on complaints. Finally, the clause also outlines the process for appeals if the commissioner refuses to act on a data protection complaint.
Clause 41 introduces schedule 8, which contains miscellaneous minor and consequential amendments to the UK General Data Protection Regulation and the Data Protection Act relating to complaints by data subjects.
Schedule 8 makes consequential amendments to the UK GDPR and the DPA relating to complaints by data subjects, which will ensure consistency across data protection legislation in relation to the changes to the complaints framework under clauses 39 and 40.
I will focus most of my remarks on the group on clauses 39 and 40, as clause 41 and schedule 8 contain mostly consequential provisions, as the Minister outlined.
There are two major sections to the clauses. First, they require a complainant to issue their complaint to the controller directly, through allowing the commissioner to refuse to process their complaint otherwise. Secondly, they require the commissioner to refuse any complaint that is vexatious or excessive. I will speak to both in turn.
As the ICO grows and its remit expands, given the rapidly growing use of data in our society, it makes sense that its resources should be focused where they are most needed. Indeed, when giving evidence to the Committee, the Information Commissioner and Paul Arnold of the ICO stated that their current duty to investigate all complaints is creating a burden on their resources. Therefore, the proposal to require that complainants reach out to their data controller first, before contacting the ICO, seems to make sense, as it will allow the regulator to move away from handling low-level complaints, or complaints that are under way but not yet resolved. Instead, it would be able to refocus resources into handling complaints that have been mishandled or that offer a serious threat to data rights and public trust in data use.
Though that may be seen by some businesses and controllers as shifting an extra requirement on to them, the move should be viewed overall as a positive one, as it will require controllers to have clear processes in place for handling complaints and hopefully discourage the kind of unlawful processing that prompts complaints in the first place. Indeed, the ICO already encourages that type of best practice, with complainants often encouraged to speak directly with the relevant data controller first before seeking help from the regulator. The clause would therefore simply formalise the arrangement, providing clarity on three levels. First, it would ensure that data subjects are clear on their right to complain directly to the controller. Secondly, it would ensure that controllers are clear on their duty to respond to such complaints. Finally, the ICO would be certain of its ability to refuse a request if the complainant refuses to comply with that model.
Although it is vital that the ICO is able to modernise and direct efforts where they are most needed, it is also vital that a healthy relationship is kept between the public—as data and decision subjects—and the ICO. The public must feel that the commissioner is there to support them in exercising their rights or seeking redress where necessary, not least because lodging a complaint can already be a difficult and distressing process. Indeed, even the commissioner himself said, when he first assumed his role, that he wanted to
“make it easy for people to access remedies if things go wrong.”
As such, it is pleasing to see safeguards built into the clause that ensure a complainant can still escalate their complaint to the ICO, and appeal any refusal from the commissioner to a tribunal.
Data rights groups, such as the Open Rights Group, hold much more serious concerns about the ability to refuse vexatious and excessive requests. Indeed, they worry that the new power will allow the ICO to ignore widespread and systemic abuses of data rights. As was the case with subject access requests, a complaint made in anger—which is quite likely, given that the complainant believes they have suffered an abuse of their rights—must be clearly distinguished from a vexatious one. The ICO should not be able to reject complaints of data abuses simply because a complainant, acting out of distress, expresses their complaint forcefully.
As the response of the Government to their consultation reveals, only about half of respondents agreed with the proposal to set out criteria by which the ICO can decide not to investigate a complaint. The safeguard to appeal any refusal from the commissioner is therefore crucial in ensuring that there is a clear pathway for data subjects and decision subjects to dispute the decision of the ICO. It is also right that they should be informed of that safeguard, as well as told why their complaint has been refused, and given the opportunity to complain again with a more complete picture of information.
Overall, the clauses seem to strike the right balance between ensuring safeguards for data and decision subjects while helping the ICO to modernise. However, terms such as “vexatious” and “excessive” must be clearly defined to ensure that the ICO is able to exercise this new power of refusal proportionately and sensibly.
I am looking for some clarification from the Minister. Clause 39 says:
“A controller must facilitate the making of complaints…such as providing a complaint form which can be completed electronically and by other means.”
Can the Minister clarify whether every data controller will have to provide an electronic means of making a complaint? For many small data controllers, which would include many of us in the room, providing an electronic means of complaint might require additional expertise and cost that they may not have. If it said, “and/or by other means”, which would allow a data controller to provide a paper copy, that might provide a little more reassurance to data controllers.
Let me address the point of the hon. Member for Glasgow North West first. The intention of the clause is to ensure that complainants go first to the data controller, and the data controller makes available a process whereby complaints can be considered. I certainly fully understand the concern of the hon. Lady that it should not prove burdensome, particularly for small firms, and I do not believe that it would necessarily require an electronic means to do so. If that is not the case, I will tell her, but it seems to me that the sensible approach would be for data controllers to have a process that the Information Commissioner will accept is available to complainants first, before a complaint is possibly escalated to the next stage.
With regard to the point of the hon. Member for Barnsley East, we have debated previously the change in the threshold to “vexatious” and “excessive”, and we may continue to disagree on that matter.
Question put and agreed to.
Clause 39 accordingly ordered to stand part of the Bill.
Clauses 40 and 41 ordered to stand part of the Bill.
Schedule 8 agreed to.
Clause 42
Consequential amendments to the EITSET Regulations
Amendment made: 47, in clause 42, page 72, line 12, at end insert—
“(7A) In paragraph 13 (modification of section 155 (penalty notices)), in sub-paragraph (3)(c), for “for “data subjects”” there were substituted “for the words from “data subjects” to the end”.”.—(Sir John Whittingdale.)
This amendment inserts an amendment of Schedule 2 to the EITSET Regulations which is consequential on the amendment of section 155(3)(c) of the Data Protection Act 2018 by Schedule 4 to the Bill.
Clause 42, as amended, ordered to stand part of the Bill.
Clause 43
Protection of prohibitions, restrictions and data subject’s rights
Question proposed, That the clause stand part of the Bill.
Clause 43 is a technical measure that creates a presumption that our data protection laws should not be overridden by future laws that relate to the processing of personal data, but it respects parliamentary sovereignty by ensuring that Parliament can depart from this presumption in particular cases if it deems it appropriate to do so. For example, if new legislation permitted or required an organisation to share personal data with another for a particular purpose, the default position in the absence of any specific indication to the contrary would be that the data protection legislation would apply to the new arrangement.
Will my right hon. Friend confirm that the provision will also apply to trade agreements? Certainly in the early stages of the negotiations for a UK-US trade agreement, the United States Government sought to include various provisions relating to tech policy. In such a scenario, would this legislation take precedence over anything written into a trade agreement?
That would certainly be my interpretation. I do not see that a trade agreement could possibly overturn an Act of Parliament unless Parliament specifically sets out that it intends that that should be the case. This is a general protection, essentially saying that in all future cases data protection legislation applies unless Parliament specifically indicates that that should not be the case.
Until now, ensuring that any new data protection measures are read consistently with the data protection legislation has relied either on inclusion of express provision to that effect in new data processing measures, or on general rules of interpretation. There are risks to that situation. Including a relevant provision in each and every new data processing measure is onerous, and such a provision could be inadvertently omitted. General rules of interpretation can be applied differently by courts, particularly in the light of legal challenges following our exit from the European Union. That creates the potential for legal uncertainty and, as a result, could lead to a less effective and comprehensive data protection legislative framework.
Clause 43 creates a presumption that any future legislation permitting the processing of personal data will be subject to the key requirements of the UK’s data protection legislation unless clear provisions are made to the contrary. This is a technical but necessary measure and I commend it to the Committee.
I understand that the clause contains legal clarifications relating to the interaction of data protection laws with other laws. On that basis, I am happy to proceed.
Question put and agreed to.
Clause 43 accordingly ordered to stand part of the Bill.
Clause 44
Regulations under the UK GDPR
Question proposed, That the clause stand part of the Bill.
The clause outlines the process and procedure for making regulations under powers in the UK GDPR. Such provision is needed because the Bill introduces regulation-making powers into the GDPR. There is an equivalent provision in section 182 of the Data Protection Act. Among other things, the clause makes it clear that, before making regulations, the Secretary of State must consult the Information Commissioner and such other persons as they consider appropriate, other than when the made affirmative procedure applies. In such cases, the regulations can be made before Parliament has considered them, but cannot remain as law unless approved by Parliament within a 120-day period.
I am sure that the Committee will be pleased to learn that we have now completed part 1 of the Bill. [Hon. Members: “Hear, hear!”]
Clause 46 provides an overview of the provisions in part 2 that are aimed at securing the reliability of digital verification services through a trust framework, a public register, an information gateway and a trust mark.
Clause 47 will require the Secretary of State to prepare and publish the digital verification services trust framework, a set of rules, principles, policies, procedures and standards that an organisation that wishes to become a certified and registered digital verification service provider must follow. The Secretary of State must consult the Information Commissioner and other appropriate persons when preparing the trust framework; that consultation requirement can be satisfied ahead of the clause coming into force. The Secretary of State must review the trust framework every 12 months and must consult the Information Commissioner and other appropriate persons when carrying out the review. I commend both clauses to the Committee.
Clause 46 defines digital verification services. Central to the definition, and to the framing of the debate on part 2, is the clarification that they are
“services that are provided at the request of an individual”.
That is a crucial distinction: digital verification services and the kinds of digital identity that they enable are not the same as any kind of Government-backed digital ID card, let alone a compulsory one. As we will discuss, it is important that any such services are properly regulated and can be relied on. However, the clause seems to set out a sensible definition that clarifies that all such services operate at individual request and are entirely separate from universal or compulsory digital identities.
I will speak in more depth about clause 47. As we move towards an increasingly digitally focused society, it makes absolute sense that someone should be able, at their own choice, to prove their identity online as well as in the physical world. Providing for a trusted set of digital verification services would facilitate just that, allowing people to prove with security and ease who they are for purposes including opening a bank account or moving house, akin to using physical equivalents like a passport or a proof of address such as a utility bill. It is therefore understandable that the Government, building on their existing UK digital identity and attributes trust framework, want to legislate so that the full framework can be brought into law when it is ready.
In evidence to the Committee, Keith Rosser highlighted the benefits that a digital verification service could bring, using his industry of work and employment as a live case study. He said:
“The biggest impact so far has been on the speed at which employers are able to hire staff”––[Official Report, Data Protection and Digital Information (No. 2) Public Bill Committee, 10 May 2023; c. 52, Q112.]
In a study of 70,000 hires, the digital identity route took an average time of three minutes and 30 seconds, saving about a week compared with having to meet with an employer in person to provide physical documents. That has benefits not only to the individuals, who can start work a week earlier, but to the wider economy, since the same people will start contributing to taxation and their local economy a week earlier too.
Secondly, Keith identified that digital verification could open up remote jobs to people living in areas where employment opportunities are harder to come by. In theory, someone living in my constituency of Barnsley East could be hired in a role that would previously have been available only in London, thanks to their ability to prove who they are without ever having to meet their employer in person.
In the light of those benefits, as well as the potential reduction in fraud from cutting down on the usability of fake documents, in principle it seems only logical to support a framework that would allow trusted digital verification services to flourish. However, the key is to ensure that the framework breeds the trust necessary to make it work. In response to the digital identity call for evidence in 2019, the Government identified that a proportion of respondents were concerned about their privacy when it came to digital verification, saying that without assurances on privacy protections it would be hard to build trust in those systems. It is therefore curious that the Government have not accompanied their framework with any principles to ensure that services are designed and implemented around user needs and that they reflect important privacy and data protection principles.
Can the Minister say why the Government have not considered placing the nine identity assurance principles on the statute book, for example, to be considered when legislating for any framework? Those principles were developed by the Government’s own privacy and consumer advisory group back in 2014; they include ensuring that identity assurance can take place only where consent, transparency, multiplicity of choice, data minimisation and dispute resolution procedures are in place. That would give people the reassurance to trust that the framework is in keeping with their needs and rights, as well as those of industry.
Furthermore, can the Minister explain how the Government intend to ensure that digital verification does not become the only option in any circumstance, making it effectively mandatory? As Big Brother Watch points out, digital identity is not a practical or desired option for everyone, particularly vulnerable or marginalised groups. Elderly people may not be familiar with such technology, while others might be priced out of it, especially given the recent inflation-linked rises in broadband and mobile bills. Although we must embrace the opportunities that technology can provide in identity verification, there must also be the ability to opt out and use offline methods of identification where needed, or we risk excluding people from key activities such as jobseeking.
Finally, I look forward to hearing more about the governance of digital verification services and the framework. The Bill does not provide a statutory basis for the new office for digital identities and attributes, and there is therefore no established body for the functions related to the framework. It is important that when the new office is established, there is good communication from Government about its powers, duties, functions and funding model. After all, the framework and the principles it supports are only as strong as their enforcement.
Overall, I do not wish to stand in the way of this part of the Bill, with the caveat that I am keen to hear from the Minister on privacy protections, on the creation of the new office and on ensuring that digital verification is the beginning of a new way of verifying one’s identity, not the end of any physical verification options.
It is a pleasure to follow my hon. Friend the Member for Barnsley East. I have some general comments, which I intend to make now, on the digital verification services framework introduced and set out in clause 46. I also have some specific comments on subsequent clauses; I will follow your guidance, Mr Hollobone, if it is your view that my comments relate to other clauses and should be made at a later point.
Like my hon. Friend, I recognise the importance of digital verification services and the many steps that the Government are taking to support them, but I am concerned about the lack of coherence between the steps set out in the Bill and other initiatives, consultations and activities elsewhere in Government.
As my hon. Friend said, the Government propose to establish an office for digital identities and attributes, which I understand is not a regulator as such. It would be good to have clarity on the position, as there is no discussion in the Bill of the duties of the new office or any kind of mechanisms for oversight or appeal. What is the relationship between the office for digital identities and attributes and this legislation? The industry has repeatedly called for clarity on the issue. I think we can all agree that a robust and effective regulatory framework is important, particularly as the Bill confers broad information-gathering powers on the Secretary of State. Will the Minister set out his vision and tell us how he sees the services being regulated, what the governance model will be, how the office—which will sit, as I understand it, in the Department for Science, Innovation and Technology—will relate to this legislation, and whether it will be independent of Government?
Will the Minister also help us to understand the relationship between the digital verification services set out in the Bill and other initiatives across Government on digital identity, such as the Government Digital Service’s One Login service, which we understand will be operated across Government services, and the initiatives of the Home Office’s fraud strategy? Is there a relationship between them, or are they separate initiatives? If they are separate, might that be confusing for the sector? I am sure the Minister will agree that we in the UK are fortunate to have world leaders in digital verification, including iProov, Yoti and Onfido. I hope the Minister agrees that for those organisations to continue their world-leading role, they need clarification and understanding of the direction of Government and how this legislation relates to that direction.
Finally, I hope the Minister will agree that digital identity is a global business. Will he say a few words about how he has worked with, or is working with, other countries to ensure that the digital verification services model set out in this legislation is complementary to other services and interoperable as appropriate, and that it builds on the learnings of other digital verification services?
I am grateful to the hon. Member for Barnsley East for setting out the Opposition’s general support for the principle of moving towards the facilitation of digital verification services. She set out some of the benefits that such services can provide, and I completely echo her points on that score. I reiterate the central point that none of this is mandatory: people can choose to use digital verification services, but there is no intention to make them compulsory.
The trust framework has been set out with a wide number of principles and standards, to which privacy is central. The hon. Member for Barnsley East is right that that will be necessary to obtain trust from people seeking to use the services. She and the hon. Member for Newcastle upon Tyne Central have both set out detailed questions about the operation of the new office and the work alongside other Government Departments. I would like to respond to their points but, given that we are about to break, we could accept the general principle of this clause and then discuss them, no doubt in greater detail, in the debate on subsequent clauses. Will the Committee accept this clause with the assurance that we will address a lot of the issues just raised as we come to subsequent clauses in this part of the Bill?
Question put and agreed to.
Clause 46 accordingly ordered to stand part of the Bill.
Ordered, That further consideration be now adjourned. —(Steve Double.)
(1 year, 6 months ago)
Clauses 48 to 52 provide the Secretary of State with powers and duties relating to the governance and oversight of digital identities in the UK. Those functions will be carried out by the office for digital identities and attributes. I can tell the hon. Member for Newcastle upon Tyne Central that the office is a team of civil servants in the Department for Science, Innovation and Technology. The office will oversee certified organisations that provide trusted digital verification services, to ensure that the purpose of the legislation is being upheld as the market develops.
I appreciate the Minister’s clarification that the office will be a group of civil servants, but I do not see that set out in the Bill, in the clause that we are currently debating. Am I wrong?
As the office is an internal body, within the Department, I do not think that it would necessarily be specifically identified in the legislation in that way. If there is any more information on that, I will be happy to provide it to the hon. Lady in a letter, but the office is not a separate body to the Department.
I thank the Minister for providing greater clarification, but if the office is not a separate body, it cannot be claimed to be independent of Government, which means that the governance of digital verification services is not independent. Will he confirm that?
This is a function that will operate within Government. I do not think that it is one where there is any specific need for particular independence, but as I said, I am happy to supply further details about precisely how it will operate if that is helpful to the hon. Lady.
Let me move on from the precise operation of the body. Clause 53 sets out requirements for certified digital verification service providers in relation to obtaining top-up certificates where the Secretary of State revises and republishes the DVS trust framework.
Clause 48 provides that the Secretary of State must establish and maintain a register of digital verification service providers. The register must be made publicly available. The Secretary of State is required to add a digital verification service provider to the register, provided that it has met certain requirements. To gain a place on the register, the provider must first be certified against the trust framework by an accredited conformity assessment body. Secondly, the provider must have applied to be registered in line with the Secretary of State’s application requirements under clause 49. Thirdly, the provider must pay any fee set by the Secretary of State under the power in clause 50.
The United Kingdom Accreditation Service accredits conformity assessment bodies as competent to assess whether a digital verification service meets the requirements set out in the trust framework. That, of course, is an arm’s length body. Assessment is by independent audits, and successful DVS providers are issued with a certificate.
The Secretary of State is prohibited from registering a provider if it has not complied with the registration requirements. An application must be rejected if it is based on a certificate that has expired, has been withdrawn by the issuing body, or is required to be ignored under clause 53 because the trust framework rules have been amended and the provider has not obtained a top-up certificate in time. The Secretary of State must also refuse to register a DVS provider if the provider was removed from the register through enforcement powers under clause 52 and reapplies for registration while still within the specified removal period.
Clause 48(7) provides definitions for “accredited conformity assessment body”, “the Accreditation Regulation”, “conformity assessment body” and “the UK national accreditation body”.
Clause 49 makes provision for the Secretary of State to determine the form of an application for registration in the digital verification services register, the information that an application needs to contain, the documents to be provided with an application and the manner in which an application is to be submitted.
Clause 50 allows the Secretary of State to charge providers a fee on application to be registered in the DVS register. The fee amount is to be determined by the Secretary of State. The clause also allows the Secretary of State to charge already registered providers ongoing fees. The amount and timing of those fees are to be determined by the Secretary of State.
Clauses 51 and 52 confer powers and duties on the Secretary of State in relation to the removal of persons from the register. Clause 51 places a duty on the Secretary of State to remove a provider from the register if certain conditions are met. That will keep the register up to date and ensure that only providers that hold a certificate to prove that they adhere to the standards set in the framework are included in the register. Clause 52 provides a power to the Secretary of State to remove a provider from the register if the Secretary of State is satisfied that the provider is failing to provide services in accordance with the trust framework, or if it has failed to provide the Secretary of State with information as required by a notice issued under clause 58. Clause 52 also contains safeguards in respect of the use of that power.
Clause 53 applies where the Secretary of State revises and republishes the DVS trust framework to include a new rule or to change an existing rule and specifies in the trust framework that a top-up certificate will be required to show compliance with the new rule from a specified date.
I hope that what I have set out is reasonably clear, and on that basis I ask that clauses 48 to 53 stand part of the Bill.
As has been mentioned, a publicly available register of trusted digital verification services is welcome; as a result, so is this set of clauses. A DVS register of this kind will improve transparency for anyone wanting to use a digital verification service, as they will be able to confirm easily and freely whether the organisation that they hope to use complies with the trust framework.
However, the worth of the register relies on the worth of the trust framework, because only by getting the trust framework right will we be able to trust those that have been accredited as following it. That will mean including enough in the framework to assure the general public that their rights are protected by it. I am thinking of things such as data minimisation and dispute resolution procedures. I hope that the Department will consider embedding principles of data rights in the framework, as has been mentioned.
As with the framework, the detail of these clauses will come via secondary legislation, and careful attention must be paid to the detail of those measures when they are laid before Parliament. In principle, however, I have no problem with the provisions of the clauses. It seems sensible to enable the Secretary of State to determine a fee for registration, to remove a person from the register upon a change in circumstances, or to remove an organisation if it is failing to comply with the trust framework. Those are all functions that are essential to the register functioning well, although any fees should of course be proportionate to keep market barriers low and ensure that smaller players continue to have access. That facilitates competition and innovation.
Similarly, the idea of top-up certificates seems sensible. Members on both sides of the House have agreed at various points on the importance of future-proofing a Bill such as this, and the digital verification services framework should have space for modernisation and adaptation where necessary. Top-up certificates will allow for the removal of any organisation that is already registered but fails to comply with new rules added to the framework.
The detail of these provisions will be analysed as and when the regulations are introduced, but I will not object to the principle of an accessible and transparent register of accredited digital verification services.
I thank the Minister for clarifying the role of the office for digital identities and attributes. Some of the comments I made on clause 46 are probably more applicable here, but I will not repeat them, as I am sure the Committee does not want to hear them a second time. However, I ask the Minister to clarify the process. If a company objects to not being approved for registration or says that it has followed the process set out by the Secretary of State but the Secretary of State does not agree, or if a dispute arises for whatever reason, what appeal process is there, if any, and who is responsible for resolving disputes? That is just one example of the clarity that is necessary for an office of this kind.
Will the Minister clarify the dispute resolution process and whether the office for digital identities and attributes will have a regulatory function? Given the lack of detail on the office, I am concerned about whether it will have the necessary powers and resources. How many people does the Minister envisage working for it? Will they be full-time employees of the office, or will they be job sharing with other duties in his Department?
My other questions are about something I raised earlier, to which the Minister did not refer: international co-operation and regulation. I imagine there will be instances where companies headquartered elsewhere want to offer digital verification services. Will there be compatibility issues with digital verification that is undertaken in other jurisdictions? Is there an international element to the office for digital identities and attributes?
Everyone on the Committee agrees that this is a very important area, and it will only get more important as digital verification becomes even more essential for our everyday working lives. What discussions is the Minister having with the Department for Business and Trade about the kind of market that we might expect to see in digital verification services and ensuring that it is competitive, diverse and across our country?
I look forward to debating the detail of the framework with the hon. Member for Barnsley East when it comes forward, but the hon. Member for Newcastle upon Tyne Central raised a couple of specific points. As I said, the new office for digital identities and attributes will be in the Department for Science, Innovation and Technology, and it will work on a similar basis to that of the office for product safety and standards, which operates within the Department for Business and Trade.
However, I should make it clear that the office for digital identities and attributes is not a regulator, because the use of digital identities is not mandatory, so it does not have investigatory or enforcement powers. It is not our intention for it to be able to levy fines or resolve individual complaints. Further down the line, as the market develops, it may be decided that it should be housed permanently in an independent body or as an arm’s length body, but that is for consideration in due course. It will start off within the Department.
I will come back to the hon. Member for Newcastle upon Tyne Central with more detail about dispute resolution. I take her point; I am not sure how often what she describes is likely to happen, but clearly it is sensible at least to take account of it.
With this it will be convenient to discuss the following:
Clauses 55 and 56 stand part.
Government amendments 6 and 7.
Government new clause 3—Information disclosed by the Welsh Revenue Authority.
Government new clause 4—Information disclosed by Revenue Scotland.
Clause 54 creates a permissive power to enable public authorities to share information relating to an individual with registered digital verification service providers. That the power is permissive means that public authorities are not under any obligation to disclose information. The power applies only where a digital verification service provider is registered in the DVS register and the individual has requested the digital verification service from that provider. Information disclosed using the power does not breach any duty of confidentiality or other restrictions relating to the disclosure of information, but the power does not enable the disclosure of information if disclosure would breach data protection legislation. The clause also gives public authorities the power to charge fees for disclosing information.
All information held by His Majesty’s Revenue and Customs is subject to particular statutory safeguards relating to confidentiality. Clause 55 establishes particular safeguards for information disclosed to registered digital verification service providers by His Majesty’s Revenue and Customs under clause 54. The Government will not commence measures to enable the disclosure of information held by HMRC until the commissioners for HMRC are satisfied that the technology and processes for information sharing uphold the particular safeguards relating to taxpayer confidentiality and therefore allow information sharing by HMRC to occur without adverse effect on the tax system or any other functions of HMRC.
Clause 56 obliges the Secretary of State to produce and publish a code of practice about the disclosure of information under clause 54. Public authorities must have regard to the code when disclosing information under this power. Publication of the first version of the code is subject to the affirmative resolution procedure. Publication of subsequent versions of the code is subject to the negative resolution procedure. We will work with the commissioners for HMRC to ensure that the code meets the needs of the tax system.
New clauses 3 and 4 and Government amendments 6 and 7 establish safeguards for information that reflect those already in the Bill under clause 55 for HMRC. Information held by tax authorities in Scotland and Wales—Revenue Scotland and the Welsh Revenue Authority—is subject to similar statutory safeguards relating to confidentiality. These safeguards ensure that confidence and trust in the tax system are maintained. Under these provisions, registered DVS providers may not further disclose information provided by Revenue Scotland or the Welsh Revenue Authority unless they have the consent of that revenue authority to do so. The addition of these provisions will provide an equivalent level of protection for information shared by all three tax authorities in the context of part 2 of the Bill, avoiding any disparity in the treatment of information held by different tax authorities in this context. A similar provision is not required for Northern Irish tax data, as HMRC is responsible for the collection of devolved taxes in Northern Ireland.
Many digital verification services will, to some extent, rely on public authorities being able to share information relating to an individual with an organisation on the DVS register. To create a permissive gateway that allows this to happen, as clause 54 does, is therefore important for the functioning of the entire DVS system, but there must be proper legal limits placed on these disclosures of information, and as ever, any disclosures involving personal data must abide by the minimisation principle, with only the information necessary to verify the person’s identity or the fact about them being passed on. As such, it is pleasing to see in clause 54 the clarification of some of those legal limits, as contained in the likes of data protection legislation and the Investigatory Powers Act 2016. Similarly, clause 55 and the Government new clauses apply the necessary limits on sharing of personal data from HMRC and devolved revenue authorities under clause 54.
Finally, clause 56, which seeks to ensure that a code of practice is published regarding the disclosure of information under clause 54, will be a useful addition to the previous clauses and will ensure that the safety of such disclosures is properly considered in comprehensive detail. The Information Commissioner, with their expertise, will be well placed to help with this, so it is pleasing to see that they will be consulted during the process of designing this code. It is also good to see that this consultation will be able to occur swiftly—before the clause even comes into force—and that the resulting code will be laid before both Houses.
In short, although some disclosures of personal data from public authorities to organisations providing DVS are inevitable, as they are necessary for the very functioning of a verification service, careful attention should be paid to how this is done safely and legally. These clauses, alongside a well-designed framework—as already discussed—will ensure that that is the case.
Question put and agreed to.
Clause 54 accordingly ordered to stand part of the Bill.
Clauses 55 and 56 ordered to stand part of the Bill.
Clause 57
Trust mark for use by registered persons
Question proposed, That the clause stand part of the Bill.
Clause 57 makes provision for the Secretary of State to designate a trust mark for use by registered DVS providers. The trust mark is essentially a kitemark that shows that the provider complies with the rules and standards set out in the trust framework, and has been certified by an approved conformity assessment body. The trust mark must be published by the Secretary of State and may be used only by registered digital verification service providers. The clause gives the Secretary of State powers to enforce that restriction in civil proceedings.
Trust marks are useful tools that allow organisations and the general public alike to recognise immediately whether a product or service has met a certain standard or criterion. That is especially valuable online, where, owing to misinformation and the prevalence of scams such as phishing, trust in online services can be lower than in the physical world.
The TrustedSite certification, for example, offers online businesses an earned certification programme that helps them to demonstrate that they are compliant with good business practices and maintain high safety standards. This is a benefit not only to the business itself, which is able to convert more users into clicks and sales, but to the users, who do not have to spend time researching each individual business and can explore pages and shop with immediate certainty. A trust mark for digital verification services would serve a similar purpose, enabling certified organisations that meet the trust framework criteria to be immediately recognisable, offering them the opportunity to be used by more people and offering the public assurance that their personal data is being handled by a verified source.
Of course, as is the case with this entire section of the Bill, the trust mark is only worth as much as the framework around it. Ministers should again think carefully about how to ensure that the framework supports the rights of the individual. Furthermore, the trust mark is useful only if people recognise it; otherwise, it cannot provide the immediate reassurance that it is supposed to. When the trust mark is established, what measures will the Department take to raise public awareness of it? In the same vein, to know the mark’s value, the public must also be aware of the trust framework that the mark is measured against, so what further steps will the Department take to increase knowledge and understanding of digital verification services and frameworks? Finally, will the Department publish the details of any identified unlawful use of the trust mark, so that public faith in the reliability of the trust mark remains high?
Overall, the clause is helpful in showing that we take seriously the need to ensure that people do not use digital verification services that may mishandle their data.
I am grateful to the hon. Lady for her support. I entirely take her point that a trust mark only really works if people know what it is and can look for it when seeking a DVS provider.
Regarding potential abuse, obviously that is something we will monitor and potentially publicise in due course. All I would say at this stage is that she raises valid points that I am sure we will consider as the new system is implemented.
Question put and agreed to.
Clause 57 accordingly ordered to stand part of the Bill.
Clause 58
Power of Secretary of State to require information
Amendments made: amendment 6, in clause 58, page 84, line 5, after “55” insert
“or (Information disclosed by the Welsh Revenue Authority)”
This amendment prevents the Secretary of State requesting a disclosure of information which would contravene the new clause inserted by NC3.
Amendment 7, in clause 58, page 84, line 5, after “55” insert
“or (Information disclosed by Revenue Scotland)”—(Sir John Whittingdale.)
This amendment prevents the Secretary of State requesting a disclosure of information which would contravene the new clause inserted by NC4.
Question proposed, That the clause, as amended, stand part of the Bill.
Clauses 58 to 60 set out powers and duties conferred upon the Secretary of State in relation to the exercise of her governance and oversight functions under part 2.
Clause 58 enables the Secretary of State to issue a written notice that requires accredited conformity assessment bodies or registered DVS providers to provide information reasonably required by the Secretary of State to exercise functions under part 2. The notice must state why the information is required. It may also state what information is required, the form in which it should be provided, when it should be provided and the place to which it should be provided. Any notice given to a provider must also inform the provider that they may be removed from the DVS register if they fail to comply with the notice.
The power is subject to certain safeguards. Information does not have to be disclosed if to do so would breach clause 55 in relation to HMRC data or data protection legislation, or if disclosure is prohibited by the relevant parts of the Investigatory Powers Act 2016. Nor does information need to be disclosed if doing so would reveal an offence that would expose a person to criminal proceedings, although that protection does not apply to the offences relating to false statements mentioned in the clause.
Clause 59 gives the Secretary of State the power to make regulations specifying that another person is able to exercise her functions under part 2. This clause enables us to move the governance and oversight functions of the Secretary of State to a third party if appropriate.
I thank the Minister for giving way. Before he moves on to clause 60, can he set out, perhaps giving an example, where it might be appropriate to use the power in clause 59 to make arrangements for another person to take on these functions, or in what circumstances he envisages it being used?
We are obviously at a very early stage in the development of this market. At the moment, it is felt right that oversight should rest with the Secretary of State, but it may be that, as the market grows and develops, there will need to be oversight by a separate body. The clause keeps the power available to the Secretary of State to delegate the function if he or she chooses to do so.
Clause 60 requires the Secretary of State to publish an annual report on the functioning of this part. The first report must be published within 12 months of clause 47, the DVS trust framework clause, coming into force. The reports will help to ensure that the market continues to meet the needs of DVS providers, public authorities, regulators, civil society and individuals. I commend the clauses to the Committee.
To oversee the DVS register, it is understandable that the Secretary of State may in some cases need to require information from registered bodies to ensure that they are complying with their duties under the framework. It is good that clause 58 provides for that power, and places reasonable legal limits on it, so that disclosures of information do not disrupt legal professional privilege or other important limitations. Likewise, it is sensible that the Secretary of State be given the statutory power to delegate some oversight of the measures in this part in a paid capacity, as is ensured by clause 59.
As I have mentioned many times throughout our scrutiny of the Bill, the Secretary of State may not always have the level of expertise needed to act alone in exercising the powers given to them by such regulations. The input of those with experience and time to commit to ensuring the quality of the regulations will therefore be vital to the success of these clauses. Again, however, we will need more information about the establishment of the OfDIA and the governance of digital identities overall to be able to interpret fully both the delegated powers and the power to require information, and how they will be used. Once again, therefore, I urge transparency from the Government as those governance structures emerge.
That leads nicely to clause 60, which requires the Secretary of State to prepare and publish yearly reports on the operation of this part. A report of that nature will offer the chance to periodically review the functioning of the trust framework, register, trust mark and all other provisions contained in this part, thereby providing an opportunity to identify and rectify any recurring issues that the system may face. That is sensible for any new project, particularly one whose transparency will make the Government accountable to the general public, who will be able to read the published reports. In short, there are no major concerns regarding any of the three clauses, though further detail on the governance of digital identity services will need proper scrutiny.
Question put and agreed to.
Clause 58 accordingly ordered to stand part of the Bill.
Clauses 59 and 60 ordered to stand part of the Bill.
Clause 61
Customer data and business data
I beg to move amendment 46, in clause 61, page 85, line 24, after “supplied” insert “or provided”.
The definition of “business data” in clause 61 refers to the supply or provision of goods, services and digital content. For consistency with that, this amendment amends an example given in the definition so that it refers to what is provided, as well as what is supplied.
We move on to part 3 of the Bill, concerning smart data usage, which I know is of interest to a number of Members. Before I discuss the detail of clause 61 and amendment 46, I will give a brief overview of this part and the policy intention behind it. The provisions in part 3 allow the Secretary of State or the Treasury to make regulations that introduce what we term “schemes” that compel businesses to share data that they hold on customers with the customer or authorised third parties upon the customer’s request, and to share or publish data that they hold about the services or products that they provide. Regulations under this part will specify what data is in scope within the parameters set out by the clauses, and how it should be shared.
The rest of the clauses in this part permit the Secretary of State or the Treasury to include in the regulations the measures that will underpin these data sharing schemes and ensure that they are subject to proper safeguards—for example, relating to the enforcement of regulations; the accreditation of third party businesses wanting to facilitate data sharing; and how these schemes can be funded through levies and charging. Regulations that introduce schemes, or significantly amend existing schemes, will be subject to prior consultation and parliamentary approval through the affirmative procedure.
The policy intention behind the clauses is to allow for the creation of new smart data schemes, building on the success of open banking in the UK. Smart data schemes establish the secure sharing of customer data and contextual information with authorised third parties on the customer’s request. The third parties can then be authorised by the customer to act on their behalf. The authorised third parties can therefore provide innovative services for the customer, such as analysing spending to identify cost savings or displaying data from multiple accounts in a single portal. The clauses replace existing regulation-making powers relating to the supply of customer data in sections 89 to 91 of the Enterprise and Regulatory Reform Act 2013; those powers are not sufficient for new smart data schemes to be effective.
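To make the single-portal example concrete, the sketch below shows the kind of multi-account aggregation an authorised third party might perform under such a scheme. It is only an illustration: every endpoint, field and token name here is hypothetical, and not drawn from the open banking standard or any real provider's API.

```typescript
// Hypothetical sketch of a smart data aggregator: fetch the customer's
// accounts from each data holder the customer has authorised, then
// combine them into a single portal view. All names are illustrative.

interface Account {
  provider: string;
  balancePence: number; // money held in pence to avoid floating-point errors
}

async function aggregateAccounts(
  providerBaseUrls: string[],
  consentToken: string, // stands in for an OAuth-style authorisation grant
): Promise<Account[]> {
  const perProvider = await Promise.all(
    providerBaseUrls.map(async (baseUrl) => {
      const res = await fetch(`${baseUrl}/accounts`, {
        headers: { Authorization: `Bearer ${consentToken}` },
      });
      if (!res.ok) {
        throw new Error(`Data holder at ${baseUrl} refused: ${res.status}`);
      }
      return (await res.json()) as Account[];
    }),
  );
  // Flatten the per-provider lists into one combined view.
  return perProvider.flat();
}
```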
Clause 61 defines the key terms and concepts for the powers in part 3. We have tabled a minor Government amendment to the clause, which I will explain. The definitions of data holder and trader in subsection (2) explain who may be required to provide data under the regulations. The definitions of customer data and business data deal with the two kinds of data that suppliers may be required to provide. Customer data is information relating to the transactions between the customer and supplier, such as a customer’s consumption of the relevant good or service and how much the customer has paid. Business data is wider contextual data relating to the goods or services supplied or provided by the relevant supplier. Business data may include standard prices, charges or tariffs and information relating to service performance. That information may allow customers to understand their customer data. Government amendment 46 clarifies that a specific example of business data—information about location—refers to the supply or provision of goods or services. It corrects a minor inconsistency in the list of examples of business data in subsection (2)(b).
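To ground the distinction between the two kinds of data, a minimal sketch of how they might be modelled follows; the field names are invented for illustration and carry no legal weight.

```typescript
// Hypothetical shapes for the two kinds of data defined in clause 61.
// Field names are illustrative only.

// Customer data: information about the transactions between a
// particular customer and the trader.
interface CustomerData {
  customerId: string;
  unitsConsumed: number; // e.g. kWh of energy or GB of broadband used
  amountPaidPence: number;
}

// Business data: wider contextual data about the goods or services the
// trader supplies or provides, not tied to any one customer.
interface BusinessData {
  standardTariffs: { name: string; pricePence: number }[];
  averageComplaintResolutionDays: number; // a service performance metric
}
```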
Subsection (3) concerns who is a customer of the supplying trader, and who can therefore benefit from smart data. Customers may include both consumers and businesses. Subsection (4) enables customers to exercise smart data rights in relation to contracts they have already entered into, and subsection (5) allows the schemes to function through provision of access to data, as opposed to sending data as a one-off transfer.
The clause defines key terms in this part of the Bill, such as business data, customer data and data holder, as well as data regulations, customer and trader. These are key to the regulation-making powers on smart data in part 3, and I have no specific concerns to raise about them at this point.
I note the clarification made by the Minister in his amendment to the example given. As he outlined, that will ensure there is consistency in the definition and understanding of business data. It is good to see areas such as that being cleaned up so that the Bill can be interpreted as easily as possible, given its complexity to many. I am therefore happy to proceed with the Bill.
I rise to ask the Minister a specific question about the use of smart data in this way. A lot of users will be giving away data at a device level, rather than just accessing individual accounts. People are simply going to a particular account they are signed into and making transactions, or doing whatever they are doing in that application, on a particular device, but there will be much more gathering of data at the device level. We know that many companies—certainly some of the bigger tech companies—use their apps to gather data not just about what their users do on a particular app, but across the whole device. One of the complaints of Facebook customers is that if they seek to remove their data from Facebook and get it back, the company’s policy is to give them back data only for things they have done while using its applications—Instagram, Facebook or whatever. It retains any device-level data that it has gathered, which could be quite significant, on the basis of privacy—it says that it does not know whether someone else was using the device, so it is not right to hand that data back. Companies are exploiting this anomaly to retain as much data as possible about things that people are doing across a whole range of apps, even when the customer has made a clear request for deletion.
I will be grateful if the Minister can say something about that. If he cannot do so now, will he write to me or say something in the future? When considering the way that these regulations work, particularly in the era of smart data when it will be far more likely that data is gathered across multiple applications, it should be clear what rights customers have to have all that data deleted if they request it.
I share my hon. Friend’s general view. Customers can authorise that their data be shared through devices with other providers, so they should equally have the right to take back that data if they so wish. He invites me to come back to him with greater detail on that point, and we would be very happy to do so.
Amendment 46 agreed to.
Clause 61, as amended, ordered to stand part of the Bill.
Clause 62
Power to make provision in connection with customer data
I beg to move amendment 112, in clause 62, page 87, line 2, at end insert—
“(3A) The Secretary of State or the Treasury may only make regulations under this section if—
(a) the Secretary of State or the Treasury has conducted an assessment of the impact the regulations may have on customers, businesses, or industry,
(b) the assessment mentioned in paragraph (a) has been published, and
(c) the assessment concludes that the regulations achieve their objective without imposing disproportionate, untargeted or unnecessary cost on customers or businesses.”
With this it will be convenient to discuss the following:
Amendment 113, in clause 62, page 87, line 12, at end insert—
“(5) The Secretary of State or the Treasury may invite a relevant sectoral regulator to contribute to, or to conduct, any impact assessment conducted in order to enable the Secretary of State or the Treasury to fulfil their obligation under subsection (4).”
This amendment would allow the Secretary of State or the Treasury to enable a relevant sectoral regulator to contribute to, or conduct, any impact assessments on smart data regulations.
Amendment 114, in clause 62, page 87, line 12, at end insert—
“(5) The Secretary of State or the Treasury must consult representatives of the relevant business or industry sector to inform their decision whether to make regulations under this section.”
This amendment would require the Secretary of State or the Treasury to consult representatives of the relevant business or industry sector before making smart data regulations.
Amendment 115, in clause 62, page 87, line 12, at end insert—
“(5) Within six months of the passage of this Act, the Secretary of State must—
(a) publish a target date for the coming into force of the first regulations under this section, and
(b) make arrangements for the completion of an assessment of the impact of those regulations.”
This amendment would require Government to identify a target for a first smart data scheme within 6 months, and make arrangements for an impact assessment for these regulations.
Of all the provisions in the Bill, the ones on smart data are those that I am most excited about and pleased to welcome. The potential of introducing smart data schemes is immense: they can bring greater choice to consumers, enable innovation, increase competition and result in the delivery of better products and services. I will address amendments 112 and 113, but I look forward to the opportunity to speak in support of this part more widely.
Most of the detail on how and where smart data regimes will be regulated in practice through this Bill will follow in secondary legislation and regulation. That is deliberate and welcome, as it ensures that smart data schemes are built around the realities of the sectors to which they apply. Given that they cannot be included on the face of the Bill, however, it is important that the regulations are prepared in the way that any good data-related law is. There must be a commitment to consultation to ensure that the outcome works effectively for consumers and businesses, with the appropriate data protection safeguards.
Indeed, there may be certain sectors in which the costs simply outweigh the benefits of introducing such a regime. Sky believes that there is currently no evidence that a smart data scheme in the communications sector would bring clear and tangible additional benefits to customers. Ofcom consulted on the proposal in 2020 and came to a similar conclusion. Sky argues that the communications sector already has
“a very high bar for supporting consumers to use data to find the best deal for them. For example, in 2020 Ofcom introduced End of Contract Notifications”,
which tell customers when their current contract is ending and what they could save by signing up to another deal. Sky says that Ofcom is
“also in the process of introducing One Touch Switching for fixed broadband which will make it easier for customers to move between providers who operate on different networks”.
As BT identifies, smart data initiatives require significant time and investment to implement. The Government’s impact assessment estimates that the implementation cost for the telecoms sector of a smart data initiative could be anywhere between £610 million and £732 million. That is not to say that the cost outweighs the potential benefits for all industries, including telecoms, but it is important that the Government weigh that up before making any regulations, particularly given that large costs may be passed on to consumers, or result in less investment in other areas. In the telecoms industry, it could lead to a reduction in investment in full-fibre broadband and 5G. It is imperative, therefore, to ensure that all costs remain targeted, proportionate and necessary to bring about an overall benefit that outweighs them. An impact assessment would provide assurance that this has been taken into consideration before any new schemes are introduced.
When conducting such an assessment, sectoral regulators, which can provide expert insight into the impact of smart data in any particular industry, will be well placed to assess the costs and benefits in the detail needed. That is something the Government themselves recognise, as they have placed a requirement in the Bill to consult those regulators. The amendments I propose would strengthen that commitment, allowing relevant sectoral regulators the opportunity, where appropriate, to be formally involved in the process of conducting an impact assessment.
I assure the hon. Lady that I and, no doubt, the whole Committee share her excitement about the potential offered by smart data, and I have sympathy for the intention behind her amendments. However, taking each one in turn, we feel amendment 112 is unnecessary because the requirements are already set by the better regulation framework, the Small Business, Enterprise and Employment Act 2015 and, indeed, these clauses. Departments will conduct an impact assessment in line with the better regulation framework and Green Book guidance when setting up a new smart data scheme, and must demonstrate consideration of their requirements under the Equality Act 2010. That will address the proportionality, targeting and necessity of the scheme.
Moreover, the clauses require the Government to consider the effect of the regulations on matters including customers, businesses and competition. An impact assessment would be an effective approach to meeting those requirements. However, there is a risk that prescribing exactly how a Department should approach the requirements could unnecessarily constrain the policymaking process.
I turn to amendment 113. Clause 74(5) already requires the Secretary of State or the Treasury to consult with relevant sector regulators as they consider appropriate. As part of the process, sector regulators may be asked to contribute to the development of regulatory impact assessments, so we do not believe the amendment is necessary.
On amendment 114, we absolutely share the view of the importance of Government consulting businesses before making regulations. That is why, under clause 74(6), the Secretary of State or the Treasury must, when introducing a smart data scheme, consult such persons as are likely to be affected by the regulations and such sectoral regulators as they consider appropriate. Those persons will include businesses relevant to the envisaged scheme.
On amendment 115, we absolutely share the ambition to grab whatever opportunities smart data offers. In particular, I draw the hon. Lady’s attention to the commitments made last month by the Economic Secretary to the Treasury, who set out the Treasury’s plans to use the smart data powers to provide open banking with a sustainable regulatory framework, while the Under-Secretary of State for Business and Trade, my hon. Friend the Member for Thirsk and Malton (Kevin Hollinrake), chaired the inaugural meeting of the Smart Data Council last month. That council has been established to support and co-ordinate the development of smart data schemes in a timely manner.
With respect to having a deadline for schemes, we should recognise that implementation of the regulations requires careful consideration. The hon. Member for Barnsley East clearly recognises the importance of consultation and of properly considering the impacts of any new scheme. We are committed to that, and there is a risk that a statutory deadline for making the regulations would jeopardise our due diligence. I assure her that all her concerns are ones that we share, so I hope that she will accept that the amendments are unnecessary.
I am grateful to the Minister for those assurances. I am reassured by his comments, and I am happy to beg to ask leave to withdraw the amendment.
Amendment, by leave, withdrawn.
Question proposed, That the clause stand part of the Bill.
Clause 62 provides the principal regulation-making power to establish smart data schemes in relation to customer data. The clause enables the Secretary of State or the Treasury to make regulations that require data holders to provide customer data either directly to a customer, or to a person they have authorised, at their request. Subsection (3) of the clause also allows an authorised person who receives the customer data to exercise the customer’s rights in relation to their data on their behalf. We call that “action initiation”.
An illustrative example could be in open banking, where customers can give authorised third parties access to their data to compare the consumer’s current bank account with similar offers, or to group the contracts within a household together for parents or guardians to better manage children’s accounts. Subsection (3) could allow the authorised third party to update the customer’s contact details across the associated accounts, for example if an email address changes.
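As a rough illustration of action initiation, the sketch below shows an authorised third party propagating a new email address across a customer’s linked accounts. The endpoints and payload are invented for the example and do not reflect any real scheme or banking API.

```typescript
// Hypothetical sketch of "action initiation": an authorised third party
// exercising a right on the customer's behalf, here updating an email
// address across every account the customer has linked.

async function updateEmailEverywhere(
  accountEndpoints: string[],
  consentToken: string, // stands in for the customer's authorisation
  newEmail: string,
): Promise<void> {
  for (const endpoint of accountEndpoints) {
    const res = await fetch(`${endpoint}/contact-details`, {
      method: "PATCH",
      headers: {
        Authorization: `Bearer ${consentToken}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ email: newEmail }),
    });
    if (!res.ok) {
      // Surface failures rather than silently leaving some accounts stale.
      throw new Error(`Update failed for ${endpoint}: ${res.status}`);
    }
  }
}
```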
Clause 63 outlines the provisions that smart data scheme regulations may contain when relating to customer data. The clause establishes much of the critical framework that smart data schemes will be built on. On that basis, I commend clauses 62 and 63 to the Committee.
As previously mentioned, and with the caveats that I expressed when I was discussing my amendments, I am extremely pleased to be able to welcome this part of the Bill. In essence, clauses 62 and 63 enable regulations that will allow for customer data to be provided to a third party on request. I will take the opportunity to highlight why that is the case by looking at some of the benefits that smart data can provide.
Since 2018, open banking—by far the most well known and advanced version of smart data in operation—has demonstrated over and over again what smart data can deliver. For the wider economy, the benefits have been remarkable, with the total value to the UK economy now amounting to more than £4.1 billion, according to Coadec, the Coalition for a Digital Economy. For consumers who have consented of their own accord to let third-party applications access their financial data, the experience of banking has been revolutionised.
Indeed, a whole host of money management tools and apps can now harness people’s financial data to create personalised recommendations based on their spending habits, including how to budget or save. During a cost of living crisis, some of those tools have been extremely valuable in helping people to manage new bills and outgoings. Furthermore, online retailers can now connect directly to someone’s bank so that, rather than spending the time filling in their card details each time they make a purchase, an individual can approve the transaction via their online banking system.
It is important to reiterate that open banking is based on consent, so consumers participate only if they feel it is right for them. As it happens, millions of people have capitalised on the benefits. More than seven million consumers and 50% of small and medium-sized enterprises have used open banking services to gain a holistic view of their finances, to support applications for credit and to pay securely, quickly and cheaply.
Though open banking has brought great success for both consumers and the wider economy, it is also important that the Government learn lessons from its implementation. We must pay close attention to how the introduction of open banking has impacted both the industry and consumers and ensure that any takeaways are factored in when considering an expansion of smart data into new industries.
Further, given that the Government clearly recognise the value of open data, as shown by this section of the Bill, it is a shame that the Bill does not go further in exploring the possibilities of opening datasets in other settings. Labour has explicitly set out to do that in its industrial strategy. For example, we have identified that better, more open datasets on jobs could help us to understand where skills shortages are, allowing jobseekers, training providers and Government to better fill those gaps.
The provisions in clauses 62 and 63 to create new regimes of smart data are therefore welcome, but the Bill unfortunately remains a missed opportunity to fully capitalise on the opportunities of open, secure data flows.
Question put and agreed to.
Clause 62 accordingly ordered to stand part of the Bill.
Clause 63 ordered to stand part of the Bill.
Clause 64
Power to make provision in connection with business data
Question proposed, That the clause stand part of the Bill.
Clause 64 provides the principal regulation-making power for the creation of smart data schemes relating to business data. Regulations created through this clause allow for business data to be provided to the customer of a trader or a third-party recipient. Business data may also be published to be more widely available.
These regulations relating to business data will increase the transparency around the pricing of goods and services, which will increase competition and benefit both consumers and smaller businesses. To give just one example, the Competition and Markets Authority recently highlighted the potential of an open data scheme that compared the prices of fuel at roadside stations, increasing competition and better informing consumers. It is that kind of market intervention that the powers provide for.
Clause 65 outlines provisions that regulations relating to business data may contain. Those provisions are non-exhaustive. The clause largely mirrors clause 63, extending the same protections and benefits to schemes that make use of business data, whether exclusively or in tandem with customer data. The clause differs from clause 63 in subsection (2), where an additional consideration is made as to who may make a request for business data. As action initiation relates only to an authorised person exercising a customer’s rights relating to their data, clause 65 does not include the references to it that are made in subsections (7) and (8) of clause 63.
The measures in these clauses largely mirror clauses 62 and 63, but they refer to business data rather than customer data. I therefore refer back to my comments on clauses 62 and 63 and the benefits that new regulations such as these might be able to provide. Those remarks provide context as to why I am pleased to support these measures, which will allow the making of regulations that require data holders to share business data with third parties.
However, I would like clarification from the Minister on one point. The explanatory notes explain that the powers will likely be used together with those in clauses 62 and 63, but it would be good to hear confirmation from the Minister on whether there may be circumstances in which the Department envisages using the powers regarding business data distinctly. If there are, will he share examples of those circumstances? It would be good for both industry and Members of this House to have insight into how these clauses, and the regulatory powers they provide, will actually be used.
I think it is probably sensible if I come back to the hon. Lady on that point. I am sure we would be happy to provide examples if there are ones that we can identify.
Question put and agreed to.
Clause 64 accordingly ordered to stand part of the Bill.
Clause 65 ordered to stand part of the Bill.
Clause 66
Decision-makers
Clauses 66 to 72 contain a number of provisions that will allow smart data regulations to function effectively. They are provisions on decision makers who approve and monitor third parties that can access the data, provisions on enforcement of the regulations and provisions on the funding of smart data schemes. It is probably sensible that I go through each one in more detail.
Clause 66 relates to the appointment of persons or accrediting bodies referred to as decision makers. The decision makers may approve the third parties that can access customer and business data, and act on behalf of customers. The decision makers may also revoke or suspend their accreditation, if that is necessary. An accreditation regime provides certainty about the expected governance, security and conduct requirements for businesses that can access data. Customers can be confident their chosen third party meets an appropriate standard. Clause 66 allows the decision maker to monitor compliance with authorisation conditions, subject to safeguards in clause 68.
Clause 67 enables regulations to confer powers of enforcement on a public body. The public body will be the enforcer, responsible for acting upon any breaches of the regulations. We envisage that the enforcer for a smart data scheme is likely to be an existing sectoral regulator, such as the Financial Conduct Authority in open banking. While the clause envisages civil enforcement of the regulations, subsection (6) allows for criminal offences in the case of falsification of information or evidence. Under subsections (3) and (10), the regulations may confer powers of investigation on the enforcer. That may include powers to require the provision of information and powers of entry, search and seizure. Those powers are subject to statutory restrictions in clause 68.
Clause 68 contains provisions limiting the investigatory powers given to enforcers. The primary restriction is that regulations may not require a person to give an enforcer information that would infringe the privileges of Parliament or undermine confidentiality, legal privilege and, subject to the exceptions in subsection (7), the privilege against self-incrimination. Subsection (8) prevents any written or oral statement given in response to a request for information in the course of an investigation from being used in evidence against that person in a prosecution for an offence, other than one created by the data regulations.
Clause 69 contains provisions relating to financial penalties and the relevant safeguards. It sets out what regulations must provide for if enabling the use of financial penalties. Subsection (2) requires that the amount of a financial penalty is specified in, or determined in accordance with, the regulations. For example, the regulations may set a maximum financial penalty that an enforcer can impose and they may specify the methodology to be used to determine a specific financial penalty.
Clause 70 enables actors in smart data schemes to require the payment of fees. The circumstances and conditions of the fee charging process will be specified in the regulations. The purpose of the clause, along with clause 71, is to seek to ensure that the costs of smart data schemes, and of bodies exercising functions under them, can be met by the relevant sector.
It is intended that fees may be charged by accrediting bodies and enforcers. For example, regulations could specify that an accrediting body may charge third parties to cover the cost of an accreditation process and ongoing monitoring. Enforcers may also be able to charge to cover or contribute to the cost of any relevant enforcement activities. The regulations may provide for payment of fees only by persons who are directly affected by the performance of duties, or exercise of powers, under the regulations. That includes data holders, customers and those accessing customer and business data.
Clause 71 will enable the regulations to impose a levy on data holders or allow a specified public body to do so. That is to allow arrangements similar to those in section 38 of the Communications Act 2003, which enables the fixing of charges by Ofcom. Together with the provision on fees, the purpose of the levy is to meet all or part of the costs incurred by enforcers and accrediting bodies, or persons acting on their behalf. The intention is to ensure that expenses can be met without incurring a cost to the taxpayer. Levies may be imposed only in respect of data holders that appear to be capable of being directly affected by the exercise of the functions.
Clause 72 provides statutory authority for the Secretary of State or the Treasury to give financial assistance, including to accrediting bodies or enforcers. Subsection (2) provides that the assistance may be given on terms and conditions that are deemed appropriate by the regulation maker. Financial assistance is defined to include both actual and contingent assistance, such as a grant, loan, guarantee or indemnity. It does not include the purchase of shares. I commend clauses 66 to 72 to the Committee.
Clauses 66 to 72 provide for decision makers and enforcers to help with the operation and regulation of new smart data regimes. As was the case with the digital verification services, where I agreed that there was a need for the Secretary of State to have limited powers to ensure compliance with the trust framework, powers will be needed to ensure that any regulations made under this part of the Bill are followed. The introduction in clause 67 of enforcers—public bodies that will, by issuing fines, penalties and compliance notices, ensure that organisations follow regulations made under part 3—is therefore welcome.
As ever, it is pleasing to see that the relevant restrictions on the powers of enforcers are laid out in clause 68, to ensure that they cannot infringe upon other, more fundamental rights. It is also right, as is ensured by clause 69, that there are safeguards on the financial penalties that an enforcer is able to issue. Guidance on the amount of any penalties, as well as a formalised process for issuing notices and allowing for appeal, will provide uniformity across the board so that every enforcer acts proportionately and consistently.
Decision makers allowed for by clause 66 will be important, too, in conjunction with enforcers. They will ensure there is sufficient oversight of the organisations that are enabled to have access to customer or business data through any particular smart data regimes. Clauses 70, 71 and 72, which finance the activities of decision makers and enforcers, follow the trend of sensible provisions that will be required if we are to have confidence that regulations made under this part of the Bill will be adhered to. In short, the measures under this grouping are largely practical, and they are necessary to support clauses 62 to 65.
Question put and agreed to.
Clause 66 accordingly ordered to stand part of the Bill.
Clauses 67 to 72 ordered to stand part of the Bill.
Clause 73
Confidentiality and data protection
Question proposed, That the clause stand part of the Bill.
Clauses 73 to 77 relate to confidentiality and data protection; various provisions connected with making the regulations, including consultation, parliamentary scrutiny and a duty to conduct periodic reviews of regulations; and the repeal of the existing regulation-making powers that these clauses replace.
Clause 73(1) allows the regulations to provide that there are no contravening obligations of confidence or other restrictions on the processing of information. Subsection (2) ensures that the regulations do not require or authorise processing that would contravene the data protection legislation. The provisions are in line with the approach taken towards pension dashboards, which are electronic communications services that allow individuals to access information about their pensions.
Clause 74(1) allows the regulation-making powers to be used flexibly. Subsection (1)(f) allows regulations to make provision by reference to specifications or technical requirements. That is essential to allow for effective and safe access to customer data, for instance the rapid updating of IT and security requirements, and it mirrors the powers enacted in relation to pensions dashboards, which I have mentioned. Clause 74(2) provides for limited circumstances in which it may be necessary for regulations to modify primary legislation to allow the regulations to function effectively. For instance, it may be necessary to extend a statutory alternative dispute resolution scheme in a specific sector to cover the activities of a smart data scheme.
Clause 74(3) states that affirmative parliamentary scrutiny will apply to the first regulations made under clauses 62 or 64; that is, affirmative scrutiny will apply to regulations that introduce a scheme. Affirmative parliamentary scrutiny will also be required where primary legislation is modified, where regulations make requirements more onerous for data holders and where the regulations confer monitoring or enforcement functions or make provisions for fees or a levy. Under clause 74(5), prior to making regulations that will be subject to affirmative scrutiny, the Secretary of State or the Treasury must consult persons who are likely to be affected by the regulations, and relevant sectoral regulators, as they consider appropriate.
The Government recognise the importance of enabling the ongoing scrutiny of future regulations, so clause 75 requires the regulation maker to review the regulations at least at five-yearly intervals. Clause 76 repeals the regulation-making powers in sections 89 to 91 of the Enterprise and Regulatory Reform Act 2013, which are no longer adequate to enable the introduction of effective smart data schemes. Those sections are replaced by the clauses in part 3 of the Bill. Clause 77 defines, or refers to definitions of, terms used in part 3 and is essential to the functioning and clarity of part 3. I commend the clauses to the Committee.
Many of the clauses in this grouping are supplementary to the provisions that we have already discussed, or they provide clarification as to which regulations under part 3 are subject to parliamentary scrutiny. I have no further comments to add on the clauses, other than to welcome them as fundamental to the wider part. However, I specifically welcome clause 75, which requires that the regulations made under this part be periodically reviewed at least every five years.
I hope that such regulations will be under constant review on an informal basis to assess how well they are working, but it is good to see a formal mechanism to ensure that that is the case over the long term. It would have been good, in fact, to see more such provisions throughout the Bill, to ensure that regulations that are made under it work as intended. Overall, I hope it is clear that I am very supportive of this part’s enabling of smart data regimes. I look forward to it coming into force and unlocking the innovation and consumer benefits that such schemes will provide.
Question put and agreed to.
Clause 73 accordingly ordered to stand part of the Bill.
Clauses 74 to 77 ordered to stand part of the Bill.
Ordered, That further consideration be now adjourned. —(Steve Double.)
(1 year, 6 months ago)
Public Bill Committees
On a point of order, Mr Paisley. I would like to correct the record regarding my comments on clause 13, which appear in column 148 of the Committee proceedings in Hansard for Tuesday 16 May. I referred to the views of Lexology and included a quote, which I attributed to that organisation, when in fact the views and quote in question were those of an organisation named Prighter, which were simply published by Lexology.
I beg to move amendment 5, in clause 78, page 100, line 30, after “86” insert “and [Codes of conduct]”.
This amendment is consequential on NC2.
With this it will be convenient to discuss Government new clause 1 and Government new clause 2.
It is a pleasure to serve under your chairmanship, Mr Paisley. Welcome to the Committee.
The Privacy and Electronic Communications (EC Directive) Regulations 2003 place specific requirements on organisations in relation to use of personal data in electronic communications. They include, for example, rules on the use of emails, texts and phone calls for direct marketing purposes and the use of cookies and similar technologies.
Trade associations have told us that sometimes their members need guidance on complying with the legislation that is more bespoke than the general regulatory guidance from the Information Commissioner’s Office. New clause 2 will allow representative bodies to design codes of conduct on complying with the PEC regulations that reflect their specific processing operations. There are already similar provisions in articles 40 and 41 of the UK General Data Protection Regulation to help organisations in particular sectors to comply.
Importantly, codes of conduct prepared under these provisions can be contained in the same document as codes of conduct under the UK GDPR. That will be particularly beneficial to representative bodies that are developing codes for processing activities that are subject to the requirements of both the UK GDPR and the PEC regulations. New clause 2 envisages that representative bodies will draw up voluntary codes of conduct and then seek formal approval of them from the Information Commissioner. The Information Commissioner will approve a code only if it contains a mechanism for the representative body to monitor its members’ compliance with the code.
New clause 1 makes a related amendment to article 41 of the UK GDPR to clarify that bodies accredited to monitor compliance with codes of conduct under the GDPR are required to notify the Information Commissioner only if they suspend or exclude a person from a code. Government amendment 5 is a minor and technical amendment necessary as a consequence of new clause 2.
These provisions are being put into the Bill at the suggestion of business organisations. We hope that they will allow organisations to comply more easily with the requirements.
It is a pleasure to serve under your chairship, Mr Paisley, and I too welcome you to the Committee.
As I have said more than once in our discussions, in many cases the burden of following regulations can be eased just as much by providing clarification, guidance and support as by removing regulation altogether. I advocated for codes of practice in more detail in the discussion of such codes in the public sector, under clause 19, and during our debates on clauses 29 and 30, when we were discussing ICO codes more generally. New clauses 1 and 2 seem to recognise the value of codes of practice too, and both seek to provide either clarification or the sharing of best practice in terms of following the PEC regulations. I have no problem with proceeding with the Bill with these inclusions.
Amendment 5 agreed to.
I beg to move amendment 48, in clause 78, page 100, line 30, after “86” insert “and [Pre-commencement consultation]”.
This amendment is consequential on NC7.
New clause 7 clarifies that the consultation requirements imposed by the Bill in connection with or under the PEC regulations can be satisfied by consultation that takes place before the relevant provision of the Bill comes into force. That ensures that the consultation work that supports development of policy before the Bill is passed can continue and is not paused unnecessarily. A similar provision was included in section 182 of the Data Protection Act 2018. Government amendment 48 is a minor and technical amendment which is necessary as a consequence of new clause 7. I commend the new clause and amendment to the Committee.
The new clause and accompanying amendment seek to expedite work on consultation in relation to the measures in this part. It makes sense that consultation can begin before the Bill comes into force, to ensure that regulations can be acted on promptly after its passing. I have concerns about various clauses in this part, but no specific concerns about the overarching new clause, and am happy to move on to discussing the substance of the clauses to which it relates.
Amendment 48 agreed to.
Question proposed, That the clause, as amended, stand part of the Bill.
Clause 78 introduces part 4 of the Bill, which amends the Privacy and Electronic Communications (EC Directive) Regulations 2003. Clauses 79 to 86 refer to them as “the PEC Regulations” for short. They sit alongside the Data Protection Act and the UK GDPR. We will debate some of the more detailed provisions in the next few clauses.
Question put and agreed to.
Clause 78, as amended, accordingly ordered to stand part of the Bill.
Clause 79
Storing information in the terminal equipment of a subscriber or user
I beg to move amendment 116, in clause 79, page 101, line 15, leave out
“making improvements to the service”
and insert
“making changes to the service which are intended to improve the user’s experience”.
Cookies are small text files that are downloaded on to somebody’s computer or smartphone when they access a website; they allow the website to recognise the person’s device, and to store information about the user’s preferences or past actions. The current rules around using cookies, set out in regulation 6 of the PEC regulations, dictate that organisations must tell people that the cookies are there, explain what the cookies are doing and why, and finally get the person’s freely given, specific and informed consent to store cookies on their device. However, at the moment there is almost universal agreement that the system is not working as intended.
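For readers less familiar with the mechanics, here is a minimal sketch of a website setting a preference cookie on a first visit and reading it back on later visits; the cookie name and contents are purely illustrative.

```typescript
import { createServer } from "node:http";

// Minimal sketch: set a preference cookie on the first visit, then
// recognise the device and recover the stored preference on return
// visits. The cookie name and value are invented for the example.
const server = createServer((req, res) => {
  // Parse the Cookie header ("a=1; b=2") into a simple lookup object.
  const cookies = Object.fromEntries(
    (req.headers.cookie ?? "")
      .split(";")
      .map((c) => c.trim().split("=") as [string, string])
      .filter(([name]) => name),
  );

  if (!cookies["site_prefs"]) {
    // First visit: store a preference for 30 days (Max-Age in seconds).
    res.setHeader(
      "Set-Cookie",
      "site_prefs=theme%3Ddark; Max-Age=2592000; Path=/",
    );
    res.end("Welcome - preference cookie set.");
  } else {
    res.end(
      `Welcome back - stored preference: ${decodeURIComponent(cookies["site_prefs"])}`,
    );
  }
});

server.listen(8080);
```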
To comply with the legislation, most websites have adopted what is known as a cookie banner—a notice that pops up when a user first visits the site, prompting them to indicate which cookies they are happy with. However, due to the sheer volume of those banners, in many cases people are no longer giving consent because they are informed or because they freely wish to give it, but simply because the banners stop them from using the website as they wish.
In their communications regarding the Bill, the Government have focused on reducing cookie fatigue, branding it one of the headline achievements of the legislation. Unfortunately, as I will argue throughout our debates on clause 79, I do not believe that the Bill will fix the problem in the way that users hope. The new exemptions to the consent requirement for purposes that present a low risk to privacy may reduce the number of circumstances in which permission is required, but a wide-ranging list of circumstances in which consent is still required will remain.
If the aim is to reduce cookie fatigue for users, as the Government have framed the clause, the exemptions must centre on the experience of users. If they do not, the clause is not about reducing consent fatigue, but rather about legitimising large networks of online surveillance of internet users. With that in mind, amendment 116 would narrow the exemption for collecting statistical information with a view to improving a service so that it is clear that any such improvements are exclusively considered to be those from the user’s perspective. That would ensure that the term “improvements” cannot be interpreted as including sweeping changes for commercial benefit, but is instead focused only on benefits to users.
I will speak to proposed new regulation 6B when we debate later amendments, but I reiterate that I have absolute sympathy for the intention behind the clause and want as much as anyone to see an end to constant cookie banners where possible. However, we must place the consumer and user experience at the heart of any such changes. That is what we hope to ensure through the amendment, with respect to the list of exemptions.
I am grateful to the hon. Lady for making it clear that the Opposition share our general objective in the clause. As she points out, the intention of cookies has been undermined by their ubiquity when they are placed as banners right at the start. Clause 79 removes the requirement to seek consent for the placement of audience measurement cookies. That means, for example, that a business could place cookies to count the number of visitors to its website without seeking the consent of web users via a cookie pop-up notice. The intention is that the organisation could use the statistical information collected to understand how its service is being used, with a view to improving it. Amendment 116 would mean that “improvements to the service” would be narrowed in scope to mean improvements to the user’s experience of the service, but while that is certainly one desirable outcome of the new exception, we want it to enable organisations to make improvements for their own purposes, and these may not necessarily directly improve the user’s experience of the service.
Organisations have repeatedly told us how important the responsible use of data is for their growth. For example, a business may want to use information collected to improve navigation of its service to improve sales. It could use the information collected to make improvements to the back-end IT functionality of its website, which the user may not be aware of. Or it could even decide to withdraw parts of its service that had low numbers of users; those users could then find that their experience was impaired rather than improved, but the business could invest the savings gained to improve other parts of the service. We do not think that businesses should be prevented from improving services in this way, but the new exception provides safeguards to prevent them from sharing the collected data with anyone else, except for the same purpose of making improvements to the service. On that basis, I hope the hon. Lady will consider withdrawing her amendment.
I am grateful for the Minister’s answer. I beg to ask leave to withdraw the amendment.
Amendment, by leave, withdrawn.
I beg to move amendment 49, in clause 79, page 102, leave out lines 21 to 23.
Clause 79 amends regulation 6 of the PEC Regulations to create new exceptions from the prohibition on storing and accessing information in terminal equipment. New paragraph (2C) contains an exception for software updates that satisfy specified requirements. This amendment removes a requirement that the subscriber or user can object to the update and does not object.
Clause 79 reforms regulation 6 of the Privacy and Electronic Communications (EC Directive) Regulations 2003, which sets the rules on when an organisation can store information or gain access to information stored on a person’s device—for example, their computer, phone or tablet. This is commonly described as the cookies rule, but it includes similar technologies such as tracking pixels and device fingerprinting. Currently, organisations do not have to seek a user’s consent to place cookies that are strictly necessary to provide a service requested by the user—for example, to detect fraud or remember items in a user’s online shopping basket.
To reduce the number of cookie pop-up notices that can spoil web users’ enjoyment of the internet, clause 79 will remove the requirement for organisations to seek consent for several low privacy risk purposes, including the installation of software updates necessary for the security of the device. Government amendments 49 and 51 remove the user’s right to opt out of the software security update and the right to remove an update after it has taken effect. Government amendment 50 removes the right to disable an update before it takes effect.
Although these measures were initially included in the Bill to give web users a choice about whether security updates were installed, stakeholders have subsequently advised us that the failure to install certain updates could result in a high level of risk to the security of users’ devices and personal information. We have been reflecting on the provisions since the Bill was introduced, and have concluded that removing them is the right thing to do, in the interests of security of web users. Even if these provisions are omitted, organisations will still need to provide users with clear and comprehensive information about the purpose of software security updates. Web users will also still have the right to postpone an update for a limited time before it takes effect.
Government amendment 54 concerns the regulation-making powers under the new PEC regulations. One of the main aims is to ensure that web users are empowered to use automated technology such as browsers and apps to select their choices regarding which cookies they are willing to accept. The Secretary of State could use powers under these provisions to require consent management tools to meet certain standards or specifications, so that web users can make clear, meaningful choices once and have those choices respected throughout their use of the internet.
The Committee will note that new regulation 6B already requires the Secretary of State to consult the Information Commissioner and other interested parties before making any new regulations on consent management tools. Government amendment 54 adds the Competition and Markets Authority as a required consultee. That will help ensure that any competition impacts are properly considered when developing new regulations that set standards of design.
Finally, Government amendments 52 and 53 make minor and technical changes that will ensure that future regulations made under the reformed PEC regulations can include transitional, transitory or savings provisions. These will simply ensure there is a smooth transition to the new regime if the Secretary of State decides to make use of these new powers. I commend the amendments to the Committee.
I understand that amendments 49 to 51 primarily remove the option for subscribers or users to object to, or to disable, software updates made for security reasons. As techUK has highlighted, the PEC regulations already contain an exemption from cookie consent for things that are strictly necessary, and it was widely accepted that security purposes met this exemption. This is reflected by its inclusion in the list of things that meet the criteria in new paragraph (5).
However, in the Bill the Government also include security updates in the stand-alone exemption list. This section introduces additional conditions that are not present in the existing law, including the requirement to offer users an opt-out from the security update and the ability to disable or postpone it. The fact that this overlap has been clarified by removing the additional conditions seems sensible. Although user choice has value, it is important that we do not leave people vulnerable to known security flaws.
In principle, Government amendment 54 is a move in the right direction. I will speak to regulation 6B in more detail when we discuss amendment 117 and explain why we want to remove it. If the regulation is to remain, it is vital that the Competition and Markets Authority be consulted before regulations are made due to the impact they will likely have in entrenching power in the hands of browser owners. That the Government have recognised that it was an oversight not to involve the CMA in any consultations is really pleasing. I offer my full support to the amendment in that context, though I do not believe it goes far enough and will advocate the removal of regulation 6B entirely in due course.
Amendment 49 agreed to.
Amendments made: 50, in clause 79, page 102, line 25, leave out “disable or”.
Clause 79 amends regulation 6 of the PEC Regulations to create new exceptions from the prohibition on storing and accessing information in terminal equipment. New paragraph (2C) contains an exception for software updates that satisfy specified requirements. This amendment removes a requirement for subscribers and users to be able to disable, not just postpone, the update.
Amendment 51, in clause 79, page 102, leave out lines 27 to 29.
Clause 79 amends regulation 6 of the PEC Regulations to create new exceptions from the prohibition on storing and accessing information in terminal equipment. New paragraph (2C) contains an exception for software updates that satisfy specified requirements. This amendment removes a requirement that, where the update takes effect, the subscriber or user can remove or disable the software.
Amendment 52, in clause 79, page 104, line 20, leave out “or supplementary provision” and insert
“, supplementary, transitional, transitory or saving provision, including provision”.—(Sir John Whittingdale.)
This amendment provides that regulations under the new regulation 6A of the PEC Regulations, inserted by clause 79, can include transitional, transitory or saving provision.
I beg to move amendment 117, in clause 79, page 104, line 32, leave out from the beginning to end of line 38 on page 105.
I begin by re-emphasising my overarching support for exploring ways to reduce consent fatigue and cookie banners. However, because of the direction that new regulation 6B takes us in, it requires far more consultation before entering the statute book. My amendment seeks to remove it. Regulation 6B aims, at some point in the future, to enable users to express any consent they wish to give or objections they wish to make regarding cookies to an operator of a website—commonly a browser—so that this can be done automatically on visiting the website. The three main concerns I have with this must be addressed and consulted on before such a regulation becomes law.
I am concerned that it will create problems for competition if browsers, often owned by powerful global tech companies, are given centralised control over, and access to, data surrounding cookies across the entire internet. That concern was echoed by the Advertising Association and the CEO of the Data and Marketing Association during an oral evidence session. When asked whether there was any concern that centralising cookies by browser would entrench power in the hands of the larger tech companies that own the browsers, Chris Combemale answered:
“It certainly would give even greater market control to those companies.”––[Official Report, Data Protection and Digital Information (No. 2) Public Bill Committee, 10 May 2023; c. 21, Q43.]
He said:
“If anything, we need more control in the hands of the people who invest in creating the content”––[Official Report, Data Protection and Digital Information (No. 2) Public Bill Committee, 10 May 2023; c. 21, Q42.]
online.
As the hon. Lady sets out, amendment 117 would remove new regulation 6B from the Bill, but we see this as an important tool for reducing frequent cookie consent banners and pop-ups that can, as we have debated already, interfere with people’s use of the internet. Members will be aware, as has already been set out, that clause 79 removes the need for organisations to seek consent to place cookies for certain non-intrusive purposes. One way of further reducing the need for repeated cookie pop-up notices is by blocking them at source—in other words, allowing web users to select which cookies they are willing to accept and which they are not comfortable with by using browser-level settings or similar technologies. These technologies should allow users to set their online preferences once and be confident that those choices will be respected throughout their use of the internet.
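By way of a concrete, if hypothetical, illustration of that approach, the sketch below shows how a site could honour a browser-level opt-out signal before deciding whether to set a non-essential cookie. It uses the Global Privacy Control header (Sec-GPC) as one existing example of such a signal; regulation 6B itself names no particular technology, so this is only a sketch of the pattern.

```typescript
import { createServer } from "node:http";

// Sketch of a server respecting a browser-level preference signal
// instead of showing its own cookie banner. Global Privacy Control
// (the Sec-GPC request header, sent by some browsers) is used purely
// as an illustration of the kind of signal such regulations could
// standardise.
const server = createServer((req, res) => {
  // Node lower-cases incoming header names; "Sec-GPC: 1" means opt out.
  const optedOut = req.headers["sec-gpc"] === "1";

  if (optedOut) {
    // Respect the browser-level choice: serve the page with no
    // non-essential cookies and no consent pop-up.
    res.end("Page served without tracking cookies.");
  } else {
    // No opt-out signal received: the site may still seek consent for,
    // or place, non-essential cookies under whatever rules apply.
    res.setHeader("Set-Cookie", "analytics_id=abc123; Max-Age=86400; Path=/");
    res.end("Page served with an analytics cookie.");
  }
});

server.listen(8080);
```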
We will continue to work with the industry and the Information Commissioner to improve take-up and effectiveness of browser-based and similar solutions. Retaining the regulation-making powers at 6B is important to this work because it will allow the Secretary of State to require relevant technologies to meet certain standards or specifications.
Without regulations, there could be an increased risk of companies developing technologies that did not give web users sufficient choice and control over the types of cookies they are willing to accept. We will consult widely before making any new regulations under regulation 6B, and new regulations will be subject to the affirmative resolution procedure. We have listened to stakeholders and intend to amend regulation 6B to provide an explicit requirement for the Secretary of State to consult the Competition and Markets Authority before making new regulations.
Is this something the Department has considered? For example, Google Chrome has a 77% share of the web browser market on desktop computers, and over 60% for all devices including mobile devices. Although we want to improve the use of the internet for users and get rid of unwanted cookies, the consequence would be the consolidation of power in the hands of one or two companies with all that data.
I entirely agree with my hon. Friend. He accurately sums up the reason that the Government decided it was important that the Competition and Markets Authority would have an input into the development of any facility to allow browser users to set their preferences at the browser level. We will see whether, with the advent of other browsers, AI-generated search engines and so on, the dominance is maintained, but I think he is absolutely right that this will remain an issue that the Competition and Markets Authority needs to keep under review.
That is the purpose of Government amendment 54, which will ensure that any competition impacts are considered properly. For example, we want any review of regulations to be relevant and fair to both smaller publishers and big tech. On that basis, I hope that the hon. Member for Barnsley East will consider withdrawing her amendment.
I appreciate the Minister’s comments and the Government change involving the CMA, but we simply do not believe that that is worth putting into law. We just do not know the full implications, as echoed by the hon. Member for Folkestone and Hythe. I will therefore press my amendment to a Division.
Question put, That the amendment be made.
I shall not repeat all that has been said about the purpose of the clause. To recap quickly, consent is required for any non-essential functions, such as audience measurement, design optimisation, presentation of adverts and tracking across websites but, clearly, the current system is not working well. Researchers found that people often click yes to cookies to make the banner go away and because they want to access the service quickly.
The clause will remove the requirement for organisations to seek consent to cookies placed for several low privacy risk purposes. As a result of the new exceptions we are introducing, web users should know that if they continue to see cookie pop-up messages, it is because they relate to more intrusive uses of cookies. We may identify additional types of non-intrusive cookies in the future, so the clause permits the Secretary of State to make regulations amending the exceptions to the consent requirement or introducing new exceptions.
The changes will not completely eliminate cookie pop-ups. However, we are committed to working with tech companies and consumer groups to promote technologies that help people to set their online preferences at browser level or by using apps. Such technology has the potential to reduce further the number of pop-ups that appear on websites. Alongside the Bill, we will take forward work to discuss what more can be done to develop and raise awareness of possible technological solutions. On that basis, I commend the clause to the Committee.
I spoke in detail about my issues with the clause during our debates on amendments 116 and 117, but overall I commend the Government’s intention to explore ways to end cookie fatigue. Although I unfortunately do not believe that these changes will solve the issues, it is pleasing that the Government are looking at ways to reduce the need for consent where the risk for privacy is low. I will therefore not stand in the way of the clause, beyond voicing my opposition to regulation 6B.
Question put and agreed to.
Clause 79, as amended, accordingly ordered to stand part of the Bill.
Clause 80
Unreceived communications
Question proposed, That the clause stand part of the Bill.
Clause 80 provides an additional power for the Information Commissioner when investigating unsolicited direct marketing through telephone calls, texts and emails—more commonly known as nuisance calls or nuisance communications.
Some unscrupulous direct marketing companies generate hundreds of thousands of calls to consumers who have not consented to be contacted. That can affect the most vulnerable in our society, some of whom may agree to buy products or services that they did not want or cannot afford. Successive Governments have taken a range of actions over the years—for example, by banning unsolicited calls from claims management firms and pensions providers—but the problem persists and further action is needed.
Under the Privacy and Electronic Communications (EC Directive) Regulations 2003, the Information Commissioner can investigate and take enforcement action against rogue companies where there is evidence that unsolicited marketing communications have been received by the recipient. The changes we are making in clause 80 will enable the Information Commissioner to take action in relation to unsolicited marketing communications that have been generated, as well as those received or connected.
Not every call that is generated reaches its intended target. For example, an individual may be out or may simply not pick up the phone. However, the potential for harm should be a relevant factor in any enforcement action by the Information Commissioner’s Office. Applying the regulations to generated communications, through the changes in clause 80, will more accurately reflect the level of intent to cause disturbance.
Clause 81 is a minor and technical clause that should improve the readability of the PEC regulations. The definition of “direct marketing”, which the PEC regulations rely on, is currently found in the Data Protection Act 1998. To help the reader quickly locate the definition, the clause adds the definition to the PEC regulations themselves.
Under the current PEC regulations, businesses can already send direct marketing to existing customers, subject to certain safeguards. That is sometimes known as the soft opt-in rule. Clause 82 applies the same rule to non-commercial organisations, such as charities. The changes will mean that charitable, political and non-commercial organisations will be able to send direct marketing communications to persons who have previously expressed an interest in the organisation’s aims and ideals.
The current soft opt-in rules for business are subject to certain safeguards. We have applied the same safeguards to these new provisions for non-commercial organisations. We think these changes will help non-commercial organisations, including charities and political parties, to build ongoing relationships with their supporters. There is no good reason why the soft opt-in rule should apply to businesses but not to non-commercial organisations. I hope Members will see the benefit of these measures in ensuring the balance between protecting the most vulnerable in society and supporting organisations. I commend clauses 80 to 82 to the Committee.
As I have said many times during our discussion of the Bill, I believe that the Information Commissioner should be given proportionate powers to investigate and take action where that is needed to uphold our regulations. That is no less the case with clause 80, which introduces measures that allow the Information Commissioner to investigate organisations responsible for generating unsolicited direct marketing communications, even if they are not received by anyone.
Clause 81 simply lifts the definition of “direct marketing” from the Data Protection Act 1998 and places it into the PEC regulations to increase the readability of that legislation. I have no issues with that.
Clause 82 extends the soft opt-in rules to charities and non-commercial organisations. It is only right that the legislation is consistent in offering non-profits the opportunity to send electronic marketing communications in the same way as for-profit organisations. It might, however, be worth raising the public’s awareness of the rule and of the ability to opt out at any point, so that people who suddenly find themselves on the receiving end of such communications have a clear understanding of why that is the case and know that consent may be withdrawn if they so wish.
Question put and agreed to.
Clause 80 accordingly ordered to stand part of the Bill.
Clauses 81 and 82 ordered to stand part of the Bill.
Clause 83
Direct marketing for the purposes of democratic engagement
I beg to move amendment 55, in clause 83, page 107, line 41, leave out ‘or transitional’ and insert ‘, transitional, transitory or saving’.
This amendment provides that regulations under clause 83 can make transitory or saving provision.
With this it will be convenient to discuss the following:
Clauses 83 and 84 stand part.
Before I speak to the amendment, I will set out the provisions of clause 83, which gives the Secretary of State the power to make exceptions to the PEC regulations’ direct marketing provisions for communications sent for the purposes of democratic engagement. We do not intend to use the powers immediately because the Bill contains a range of other measures that will facilitate a responsible use of personal data for the purposes of political campaigning, including the extension of the soft opt-in rule that we have just debated. However, it is important we keep the changes we are making in the Bill under review to make sure that elected representatives and parties can continue to engage transparently with the electorate and are not unnecessarily constrained by data protection and privacy rules.
The Committee will note that if the Secretary of State decided to exercise the powers, there are a number of safeguards in the clause that will maintain a sensible balance between the need for healthy interaction with the electorate and any expectations that an individual might have with regard to privacy rights. Any new exceptions would be limited to communications sent by the individuals and organisations listed in clause 83, including elected representatives, registered political parties and permitted participants in referendum campaigns.
Before laying any regulations under the clause, the Secretary of State will need to consult the Information Commissioner and other interested parties, and have specific regard for the effect that further exceptions could have on the privacy of individuals. Regulations will require parliamentary approval via the affirmative resolution procedure. Committee members should also bear in mind that the powers will not affect an individual’s right under the UK GDPR to opt out of receiving communications.
We have also tabled two technical amendments to the clause to improve the way it is drafted. Government amendment 55 will make it clear that regulations made under this power can include transitory or savings provisions in addition to transitional provisions. Such provisions might be necessary if, for example, new exceptions were only to apply for a time-limited period. Clause 84 is also technical in nature and simply sets out the meaning of terms such as “candidate”, “elected representative” and “permitted participant” for the purposes of clause 83.
The clauses somewhat mirror the inclusion of democratic engagement purposes in the recognised legitimate interests list. Here, however, rather than giving elected representatives and the like an exemption from completing a balancing test when processing under this purpose, the Bill paves the way for them to be exempt from certain direct marketing provisions in future.
The specific content of any future changes, however, should be properly scrutinised. As such, it is disappointing that the Government have not indicated how they intend to use such regulations in future. I appreciate that the Minister has just said that they do not intend to use them right now, but does he have in mind any examples of exemptions that he might like to make from the direct marketing provisions for democratic engagement purposes? That is not to say that such exemptions will not be justified; just that their substance should be openly discussed and democratically scrutinised.
As I have set out, the existing data protection provisions remain under the GDPR. In terms of specific exemptions, I have said that the list will be subject to future regulation making, which will also be subject to parliamentary scrutiny. We will be happy to supply a letter to the hon. Lady to set out specific examples of where that might be the case.
Amendment 55 agreed to.
Clause 83, as amended, ordered to stand part of the Bill.
Clause 84
Meaning of expressions in section 83
Amendment made: 31, in clause 84, page 110, line 31, leave out “fourth day after” and insert
“period of 30 days beginning with the day after”.—(Sir John Whittingdale.)
Clauses 83 and 84 enable regulations to make exceptions from direct marketing rules in the PEC Regulations, including for certain processing by elected representatives. This amendment increases the period for which former members of the Westminster Parliament and the devolved legislatures continue to be treated as “elected representatives” following an election. See also NC6 and Amendment 30.
Clause 84, as amended, ordered to stand part of the Bill.
Clause 85
Duty to notify the Commissioner of unlawful direct marketing
I beg to move amendment 56, in clause 85, page 112, line 35, at end insert—
“(13A) Regulations under paragraph (13) may make transitional provision.
(13B) Before making regulations under paragraph (13), the Secretary of State must consult—
(a) the Commissioner, and
(b) such other persons as the Secretary of State considers appropriate.”
This amendment enables regulations changing the amount of a fixed penalty under regulation 26B of the PEC Regulations to include transitional provision. It also requires the Secretary of State to consult the Information Commissioner and such other persons as the Secretary of State considers appropriate before making such regulations.
With this it will be convenient to discuss the following:
Amendment 118, in clause 85, page 113, line 3, at end insert—
“(1A) Guidance under this section must—
(a) make clear that a provider of a public electronic communications service is not obligated to monitor the content of individual electronic communications in order to determine whether those communications contravene the direct marketing regulations; and
(b) include illustrative examples of the grounds on which a provider may reasonably suspect that a person is contravening or has contravened any of the direct marketing regulations.”
Government amendment 33.
Clause stand part.
Before I speak to Government amendment 56, it might be helpful to set out the provisions of clause 85. The clause will help to ensure that there is better co-operation between the industry and the regulator in tackling the problem of nuisance communications. It places a duty on public electronic communications service and network providers to notify the Information Commissioner within 28 days if they have “reasonable grounds” for suspecting that unlawful direct marketing communications are transiting their services or networks. Once notified, the ICO will investigate whether a breach of the PEC regulations has occurred and take appropriate action where necessary.
We cannot expect network and service providers to know for certain whether a customer has agreed to receive a marketing call, which is why the new requirement is predicated on the organisation having reasonable grounds for suspecting that something unlawful is occurring. For example, there might be cases where a communications network or service provider notices a large volume of calls being generated in quick succession, with only one digit in the telephone number changing each time. That might suggest that calls are being made indiscriminately, without regard to whether the customer has registered with the telephone preference service or previously advised the caller that they did not want to be contacted.
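Purely as an illustration of the kind of metadata pattern just described, the sketch below flags a long run of calls in which each dialled number differs from the previous one by a single digit. The threshold and function names are assumptions for the example, not anything specified in the Bill or in ICO guidance, and the check looks only at dialled numbers, never at the content of any call.

```python
def one_digit_apart(a: str, b: str) -> bool:
    """True if two equal-length numbers differ in exactly one digit."""
    return len(a) == len(b) and sum(x != y for x, y in zip(a, b)) == 1

def looks_like_indiscriminate_dialling(dialled_numbers: list[str],
                                       min_run: int = 100) -> bool:
    """Flag a batch of outbound calls in which each number is one digit
    away from the one before, suggesting the caller is walking through
    the number space rather than calling consenting customers.
    Inspects dialled numbers only, never the content of any call.
    """
    run, longest = 1, 1
    for prev, cur in zip(dialled_numbers, dialled_numbers[1:]):
        run = run + 1 if one_digit_apart(prev, cur) else 1
        longest = max(longest, run)
    return longest >= min_run
```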
We do not envisage that the provision will place significant new burdens on the network and service providers. It does not require them to put new systems in place to monitor for suspicious activities. However, where they have that capability already and have reasonable grounds to believe that unlawful activity is going on, we would like them to share that information with the ICO. The clause also requires the ICO to produce and publish guidance for network and service providers to help them to understand what intelligence information could reasonably be shared.
I shall respond to amendment 118 after the hon. Member for Barnsley East has spoken to it, but it might be helpful for me briefly to explain Government amendment 56. The fixed penalty for failure to comply with the duty, which is currently set at £1,000, is being kept under review. Where appropriate, the Secretary of State can use regulations to change the fine amount. The amendment will ensure that those regulation-making powers are consistent with similar powers elsewhere in the Bill. The regulations could include transitional provisions, and the amendment will also require the Secretary of State to consult the Information Commissioner and other persons they consider appropriate before making such regulations.
Government amendment 33 is a minor and technical change designed to improve the readability of the legislation.
The amount is fixed in the Bill at £1,000, Minister. That is stated in clause 85, in proposed new regulation 26B. The Bill states:
“The amount of a fixed monetary penalty under this regulation shall be £1,000.”
That does not indicate any flexibility. I draw that to the attention of the Committee.
The ambition of the clause is broadly welcome, and we agree that there is a need to tackle unwanted calls, but the communications sector, including Vodafone and BT, as well as techUK, has shared concerns that the clause, which will place a new duty on telecoms providers to report to the commissioner whenever they have “reasonable grounds” for suspecting a breach of direct marketing regulations, might not be the best way to solve the issue.
I will focus my remarks on highlighting those concerns, and how amendment 118 would address some of them. First, though, let me say that the Government have already made it clear in their explanatory notes that it is not the intention of the Bill to require providers to monitor communications. However, that has not been included in the Bill, which has caused some confusion in the communications sector.
Amendment 118 would put that confusion to rest by providing for the explicit inclusion of the clarification in the clause itself. That would provide assurances to customers who would be sure their calls and texts would not be monitored, and to telecoms companies, which would be certain that such monitoring of content was absolutely not required of them.
Secondly, the intent of the clause is indeed not to have companies monitoring communications, but many relevant companies have raised concerns around the technological feasibility of identifying instances of unlawful and unsolicited direct marketing. Indeed, the new duty will require telecommunications providers to be able to identify whether a person receiving a direct marketing call has or has not given consent to receive the call from the company making it. However, providers have said they cannot reliably know that, and have warned that there is no existing technology to conduct that kind of monitoring accurately and at scale. In the absence of communication monitoring and examples of how unsolicited direct marketing is to be identified, it is therefore unclear how companies will fulfil their duties under the clause.
That is not to say the industry is not prepared to commit significant resources to tackling unwanted calls. BT, for example, has set up a range of successful tools to help customers. That includes BT Call Protect, which is used by 4.4 million BT customers and now averages 2.35 million calls diverted per week. However, new measures must be feasible, and our amendment 118 would therefore require that guidance around the implementation of the clause include illustrative examples of the grounds on which a provider may reasonably suspect that a person is contravening, or has contravened, any of the direct marketing regulations.
If the Minister does not intend to support the amendment, I would like to hear such examples from him today, so that the communications sector is absolutely clear about how to fulfil its new duties, given the technology available.
As the hon. Lady has said, amendment 118 would require the commissioner to state clearly in the guidance that the new duty does not oblige providers to intercept or monitor the content of electronic communications in order to determine whether there has been a contravention of the rules. It would also require the guidance to include illustrative examples of the types of activity that may cause a provider reasonably to suspect that there had been a contravention of the requirements.
I recognise that the amendment echoes concerns that have been raised by communications service providers, and that there has been some apprehension about exactly what companies will have to do to comply with the duty. In response, I would emphasise that “reasonable grounds” does mean reasonable in all circumstances.
The hon. Lady has asked for an example of the kind of activity that might give reasonable grounds for suspicion. I direct her to the remarks I made in moving the amendment and the example of a very large number of calls being generated in rapid succession in which, in each case, the telephone number is simply one digit away from the number before. The speed at which that takes place does provide reasonable grounds to suspect that the requirement to, for instance, check with the TPS is not being fulfilled.
There are simple examples of that kind, but I draw the attention of the hon. Lady and the Committee to the consultation requirements that will apply to the ICO’s guidance. In addition to consulting providers of public electronic communications networks and services on the development of the guidance, the ICO will be required to consult the Secretary of State, Ofcom and other relevant stakeholders to ensure that the guidance is as practical and useful to organisations as possible.
Does my right hon. Friend agree that, if amendment 118 were made, it could be used as a general get-out-of-jail-free card by companies? Let us consider, for example, a situation where a company could easily and obviously have spotted a likely breach of the regulations and should have intervened. When the commissioner discovered that the company had failed in its duty to do so, the company could turn around and say, “Well, yes, we missed that, but we were not under any obligation to monitor.” It is therefore important that there is a requirement for companies to use their best endeavours to monitor where possible.
I completely agree; my hon. Friend is right to make that distinction. Companies should use their best endeavours, but it is worth repeating that the guidance does not expect service and network providers to monitor the content of individual calls and messages to comply with the duty. There is more interest in patterns of activity on networks, such as where a rogue direct marketing firm behaves in the manner that I set out. On that basis, I ask the hon. Lady not to press her amendment to a vote.
I appreciate the Minister’s comments and those of the hon. Member for Folkestone and Hythe. We have no issue with the monitoring of patterns; we wanted clarification on the content. I am not sure that the Minister addressed the concerns about the fact that, although the Government have provided a partial clarification in the explanatory notes, this is not in the Bill. For that reason, I will press my amendment to a vote.
Amendment 56 agreed to.
Amendment proposed: 118, in clause 85, page 113, line 3, at end insert—
“(1A) Guidance under this section must—
(a) make clear that a provider of a public electronic communications service is not obligated to monitor the content of individual electronic communications in order to determine whether those communications contravene the direct marketing regulations; and
(b) include illustrative examples of the grounds on which a provider may reasonably suspect that a person is contravening or has contravened any of the direct marketing regulations.”—(Stephanie Peacock.)
Question put, That the amendment be made.
I beg to move amendment 57, in clause 86, page 113, line 38, at end insert—
“(13A) Regulations under paragraph (13) may make transitional provision.
(13B) Before making regulations under paragraph (13), the Secretary of State must consult—
(a) the Information Commissioner, and
(b) such other persons as the Secretary of State considers appropriate.”
This amendment enables regulations changing the amount of a fixed penalty under regulation 5C of the PEC Regulations to include transitional provision. It also requires the Secretary of State to consult the Information Commissioner and such other persons as the Secretary of State considers appropriate before making such regulations.
With this it will be convenient to discuss the following:
Clause stand part.
Government amendments 32 and 58.
That schedule 10 be the Tenth schedule to the Bill.
Before turning specifically to the provisions of the amendment, I will set out the provisions of clause 86 and schedule 10. Clause 86 updates the ICO’s powers in respect of enforcing the PEC regulations. Currently, the ICO has to rely mainly on outdated powers in the Data Protection Act 1998 to enforce breaches of the PEC regulations. The powers were not updated when the UK GDPR and the Data Protection Act came into force in 2018. That means that some relatively serious breaches of the PEC regulations, such as nuisance calls being generated on an industrial scale, cannot be investigated as effectively or punished as severely as breaches under the data protection legislation.
The clause will therefore give the ICO the same investigatory and enforcement powers in relation to breaches of the PEC regulations as currently apply to breaches of the UK GDPR and the 2018 Act. That will result in a legal framework that is more consistent and predictable for organisations, particularly for those with processing activities that engage both the PEC regulations and the UK GDPR.
Clause 86 and schedule 10 add a new schedule to the PEC regulations, which sets out how the investigatory and enforcement powers in the 2018 Act will be applied to the PEC regulations. Among other things, that includes the power for the Information Commissioner to impose information notices, assessment notices, interview notices and enforcement and penalty notices. The maximum penalty that the Information Commissioner can impose for the most serious breaches of the PEC regulations will be increased to the same levels that can be imposed under the UK GDPR and the Data Protection Act. That is up to 4% of a company’s annual turnover or £17.5 million, whichever is higher.
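As a quick worked example of that penalty ceiling (a simplified reading of the figures quoted in debate, with a hypothetical function name):

```python
def maximum_pecr_penalty(annual_worldwide_turnover_gbp: int) -> float:
    """Ceiling for the most serious PECR breaches as quoted in debate:
    the higher of GBP 17.5 million or 4% of annual worldwide turnover.
    A simplified illustration, not the statutory drafting.
    """
    return max(17_500_000, annual_worldwide_turnover_gbp * 4 / 100)

# A firm with GBP 1bn turnover faces a ceiling of GBP 40m; a smaller
# firm remains exposed to the flat GBP 17.5m figure.
assert maximum_pecr_penalty(1_000_000_000) == 40_000_000
assert maximum_pecr_penalty(100_000_000) == 17_500_000
```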
Relevant criminal offences under the Data Protection Act, such as the offence of deliberately frustrating an investigation by the Information Commissioner by destroying or falsifying information, are also applied to the PEC regulations. The updated enforcement provisions in new schedule 1 to the PEC regulations will retain some pre-existing powers that are unique to the previous regulations.
Clause 86 also updates regulation 5C of the PEC regulations, which sets out the fixed penalty amount for a failure to report a personal data breach under regulation 5. Currently, the fine level is set at £1,000. The clause introduces a regulation-making power, which will be subject to the affirmative procedure, for the Secretary of State to increase the fine level. We have tabled Government amendment 57 to provide an explicit requirement for the Secretary of State to consult the Information Commissioner and any other persons the Secretary of State considers appropriate before making new regulations. The amendment also confirms that regulations made under the power can include transitional provisions.
Finally, we have tabled two further minor amendments to schedule 10. Government amendment 58 makes a minor correction by inserting a missing schedule number. Government amendment 32 adjusts the provision that applies section 155(3)(c) of the Data Protection Act for the purposes of the PEC regulations. That is necessary as that section is being amended by schedule 4. Without making those corrective amendments, the provisions will not achieve the intended effect.
Clause 86 and schedule 10 insert and clarify the commissioner’s enforcement powers with regards to privacy and electronic communications regulation. Particularly of note within the proposals is the move to increase fines for nuisance calls and messages to a higher maximum penalty of £17.5 million or 4% of the undertaking’s total annual worldwide turnover, whichever is higher. That is one of the Government’s headline commitments in the Bill and should create tougher punishments for those who are unlawfully pestering people through their phones.
We are in complete agreement that more must be done to stop unwanted communications. However, to solve the problem as a whole, we must take stronger action on scam calling as well as on instances of unsolicited direct marketing. Labour has committed to going further than Ofcom’s new controls on overseas scam calls and has proposed the following to close loopholes: first, no phone call made from overseas using a UK telephone number should have that number displayed when it appears on a UK mobile phone or digital landline; and secondly, all mobile calls from overseas using a UK number should be blocked unless the network provider confirms that the known bill payer for the number is currently roaming. To mitigate the fact that some legitimate industries rely on overseas call centres that handle genuine customer service requests, we will also require Ofcom to register those legitimate companies and their numbers as exceptions to the blocking.
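Purely as a sketch of the decision logic in that proposal, and not of any network’s actual systems, the blocking rule might look like the following; the parameter names are hypothetical, and the inputs are assumed to be known to the network operator.

```python
def should_block_call(originates_overseas: bool,
                      presents_uk_number: bool,
                      bill_payer_roaming: bool,
                      on_ofcom_exception_register: bool) -> bool:
    """Sketch of the proposed rule: block overseas-originated calls that
    present a UK number, unless the known bill payer for that number is
    roaming or the number is a registered legitimate exception.
    """
    if not (originates_overseas and presents_uk_number):
        # Domestic calls, and overseas calls showing a foreign number,
        # are outside the scope of the proposal.
        return False
    return not (bill_payer_roaming or on_ofcom_exception_register)
```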
As the clause and schedule seek to take strong action against unwanted communications, I would be pleased to hear from the Minister whether the Government would consider going further and matching our commitments on overseas scam calling, too.
I say to the hon. Lady that the provisions deal specifically with nuisance calls, not necessarily scam calls. As she will know, the Government have a comprehensive set of policies designed to address fraud committed through malicious or scam calls, and those are being taken forward through the fraud prevention strategy. I accept that more needs to be done, but I say to her that that work is already under way.
Amendment 57 agreed to.
Clause 86, as amended, ordered to stand part of the Bill.
Schedule 10
Privacy and electronic communications: Commissioner’s enforcement powers
Amendments made: 32, in schedule 10, page 180, line 25, leave out “for “data subjects”” and insert
“for the words from “data subjects” to the end”.
This amendment adjusts provision applying section 155(3)(c) of the Data Protection Act 2018 (penalty notices) for the purposes of the PEC Regulations to take account of the amendment of section 155(3)(c) by Schedule 4 to the Bill.
Amendment 58, in schedule 10, page 183, line 5, at end insert “15”.—(Sir John Whittingdale.)
This amendment inserts a missing Schedule number, so that the provision refers to Schedule 15 to the Data Protection Act 2018.
Schedule 10, as amended, agreed to.
Clause 87
The eIDAS Regulation
Question proposed, That the clause stand part of the Bill.
Clauses 87 to 91 make changes to the UK’s eIDAS regulation to support the effective functioning of the UK’s trust services market into the future. Clause 87 states that when clauses 88 to 91 talk about the eIDAS regulation, this refers to regulation 910/2014, on electronic identification and trust services for electronic transactions in the internal market, which was adopted by the European Parliament and the Council on 23 July 2014.
There is potential for confusion between the UK eIDAS regulation and the EU eIDAS regulation from which it stems and which shares the same title. I can confirm that all references to the eIDAS regulation in clauses 88 to 91 refer to the regulation as it was retained and modified on EU exit to apply within the UK.
Clause 88 amends the UK eIDAS regulation so that conformity assessment reports issued by an accredited EU conformity assessment body can be recognised and used to grant a trust service provider qualified status under the regulation. UK-qualified trust services are no longer legally recognised within the EU, which has meant that qualified trust service providers who wish to operate within both the UK and the EU need to meet two sets of auditing requirements. That is not cost effective and creates regulatory barriers in the nascent UK trust services market. Unilateral recognition of EU conformity assessment bodies will remove an unnecessary regulatory barrier for qualified trust service providers wishing to operate within both the UK and EU markets.
Clause 89 provides the Secretary of State with a power to revoke articles 24A and 24B of the UK eIDAS regulation in the future, should the continued unilateral recognition of EU-qualified trust services, and the recognition of conformity assessment reports issued by EU conformity assessment bodies, no longer meet the needs of the UK market. Clause 89 also provides a power to amend article 24A in order to wind down the recognition of EU-qualified trust services, by removing the recognition of certain elements of EU-qualified trust service standards only.
For example, it will be possible to continue to recognise EU-qualified electronic time stamps and delivery services while ending the recognition of EU-qualified electronic signatures and seals, which will give the UK eIDAS regulation flexibility to adapt to future changes. The clause provides that any regulations made under this power will be subject to the negative resolution procedure.
“Trust services” refers to services including those relating to electronic signatures, electronic seals, timestamps, electronic delivery services and website authentication. As has been mentioned, trust services are required to meet certain standards and technical specifications for operation across the UK economy, which are outlined under eIDAS regulations. These clauses seek to make logistical adjustments to that legal framework for trust service products and services within the UK.
Although we understand that the changes are intended to enable flexibility should EU regulations no longer be adequate, and we absolutely agree that we must future-proof regulations so that standards are always kept high, we must also ensure that any changes are genuinely necessary rather than made simply for their own sake. It is vital that any alterations are genuinely intended to improve current practices and have been thoroughly considered, so that they make positive and meaningful change.
Question put and agreed to.
Clause 87 accordingly ordered to stand part of the Bill.
Clauses 88 to 91 ordered to stand part of the Bill.
Clause 92
Disclosure of information to improve public service delivery to undertakings
Question proposed, That the clause stand part of the Bill.
The clause will amend the Digital Economy Act 2017 to extend the powers under section 35 to include businesses. Existing powers enable public authorities to share data to support better services to individuals and households. The Government believe that businesses too can benefit from responsive, joined-up public services across the digital economy. The clause introduces new data sharing powers allowing specified public authorities to share data with other specified public authorities for the purposes of fulfilling their functions.
The sharing of data will also provide benefits for the public in a number of ways. It will pave the way for businesses to access Government services more conveniently, efficiently and securely—for example, by using digital verification services, accessing support when starting up new businesses, completing import and export processes, or applying for Government grants such as rural grants. Any data sharing will of course be carried out in accordance with the requirements of the Data Protection Act and the UK GDPR.
Being able to share data about businesses will bring many benefits. For example, by improving productivity while keeping employment high we can earn more, raising living standards, providing funds to support our public services and improving the quality of life for all citizens. Now that we have left the EU, businesses that take action to improve their productivity will increase their resilience to changing market conditions and be more globally competitive. The Minister will be able to make regulations to add new public authorities to those already listed in schedule 4 to the Digital Economy Act. However, any regulations would be made by the affirmative procedure, requiring the approval of both Houses. I commend the clause to the Committee.
The clause amends section 35 of the Digital Economy Act to enable specified public authorities to share information to improve the delivery of public services to businesses with other specified persons. That echoes the existing legal gateway that allows for the sharing of information on improving the delivery of public services to individuals and households.
I believe that the clause is a sensible extension, but would have preferred the Minister and his Department to have considered public service delivery more broadly when drafting the Bill. While attention has rightly been paid throughout the Bill to making data protection regulation work in the interests of businesses, far less attention has gone towards how we can harness data for the public good and use it to the benefit of our public services. That is a real missed opportunity, which Labour would certainly have taken.
Question put and agreed to.
Clause 92 accordingly ordered to stand part of the Bill.
Clause 93
Implementation of law enforcement information-sharing agreements
I beg to move amendment 8, in clause 93, page 119, line 18, leave out first “Secretary of State” and insert “appropriate national authority”.
This amendment, Amendment 10 and NC5 enable the regulation-making power conferred by clause 93 to be exercised concurrently by the Secretary of State and, in relation to devolved matters, by Scottish Ministers and Welsh Ministers.
With this it will be convenient to discuss the following:
Government amendments 9 to 16.
Government new clause 5—Meaning of “appropriate national authority”.
Clause 93 creates a delegated power for the Secretary of State, and a concurrent power for Welsh and Scottish Ministers, to make regulations to implement international agreements relating to the sharing of information for law enforcement purposes. The concurrent power for Welsh and Scottish Ministers has been included in an amendment to the clause. While international relations are a reserved matter, the domestic implementation of the provisions likely to be contained in future international agreements may be devolved, given that law enforcement is a devolved matter to various extents in each devolved Administration.
In the light of introducing a concurrent power for Welsh and Scottish Ministers, amendments to clauses 93 and 108 have been tabled, as has new clause 5. Together they specifically detail the appropriate national authority that will have the power to make regulations in respect of clause 93. The Government amendments make it clear that the appropriate national authority may make the regulations. New clause 5 then defines who is an appropriate national authority for those purposes. I therefore commend new clause 5 and the related Government amendments to the Committee.
It is right that the powers conferred by clause 93 can be exercised by devolved Ministers where appropriate. I therefore have no objections to the amendments or the new clause.
Amendment 8 agreed to.
Amendments made: 9, in clause 93, page 119, line 18, leave out second “Secretary of State” and insert “authority”.
This amendment is consequential on Amendment 8.
Amendment 10, in clause 93, page 119, line 36, at end insert—
‘“appropriate national authority” has the meaning given in section (Meaning of “appropriate national authority”);’.—(Sir John Whittingdale.)
See the explanatory statement for Amendment 8.
Question proposed, That the clause, as amended, stand part of the Bill.
As I have already set out, clause 93 creates a delegated power for the Secretary of State, along with a concurrent power for Welsh and Scottish Ministers, to make regulations to implement international agreements relating to the sharing of information for law enforcement purposes. The legislation will provide powers to implement technical aspects of such international agreements via secondary legislation once the agreements have been negotiated.
Clause 93 stipulates that regulations can be made in connection with implementing an international agreement only in so far as it relates to the sharing of information for law enforcement purposes, and that any data sharing must comply with data protection legislation. These measures will enable the implementation of new international agreements designed to help keep the public safe from the threat posed by international criminality and cross-border crime, as well as helping to protect vulnerable people.
I believe the position is that at the present time, Northern Ireland does not have a functioning Assembly, so it is not possible, but that may change in due course.
The clause allows the Secretary of State to make regulations to enact an international agreement for the sharing of information for law enforcement purposes. The substance of any such agreement will therefore likely come through secondary legislation, and it will be appropriate at that point to scrutinise its contents. If the Minister and his Department have identified any targets for such agreements at this stage, I am sure that the Committee would be grateful to hear of them. If not, I expect that he will update the House on that through the usual channels.
Question put and agreed to.
Clause 93, as amended, accordingly ordered to stand part of the Bill.
Clause 94
Form in which registers of births and deaths are to be kept
Question proposed, That the clause stand part of the Bill.
With this it will be convenient to discuss the following:
Clauses 95 to 98 stand part.
That schedule 11 be the Eleventh schedule to the Bill.
Clauses 94 to 98 amend the Registration Service Act 1953 and the Births and Deaths Registration Act 1953—which I will refer to as the Act —and introduce schedule 11, which contains minor and consequential amendments. Currently, under the Act, the Registrar General for England and Wales provides the local registration service with paper live birth, stillbirth and death registers and with paper forms for making certified copies of the register entries—for example, birth and death certificates. Since 2009, registrars in England and Wales also record birth and death registration information electronically, in parallel with the paper-based systems. That is a duplication of effort for registrars.
Clause 94(2) amends the Act, replacing the existing section 25 with a new section 25. The new section will allow the Registrar General to determine in which form registers of live births, stillbirths and deaths are to be kept, and contains additional provision appropriate for the keeping of registers in an electronic form only. New section 25(2) of the Act allows the Registrar General to require that registrars keep information in a form that will allow the Registrar General and the superintendent registrar to have immediate access to all live birth and death entries as soon as the registrar has entered the details in the register. In the case of stillbirths, new section 25(2)(b) allows the Registrar General to have immediate access to the entries in the register.
New section 25(3) provides that where a register is kept in such form as determined under new section 25(2) —for example, an electronic form—any information in that register made available to the Registrar General or superintendent registrar is deemed to be held by that person, as well as the registrar, when carrying out that person’s functions—for example, the issue of certified copies.
Clause 94(3)(a) and (b) omit sections 26 and 27 of the Act, which set out the requirements for the quarterly returns made by a registrar and superintendent registrar. These returns will no longer be needed, as the superintendent registrar and the Registrar General will have immediate access to the records as provided for by new section 25 of the Act.
Clause 94(3)(c) omits section 28 of the Act, which sets out how paper registers must be stored by registrars, superintendent registrars and the Registrar General. With the introduction of new section 25, that provision is no longer necessary as it would not be relevant to an electronic register.
Proposed new section 25(4) of the Act provides that anything that is required for the purposes of creating and maintaining the registers—for example, providing registrars with the electronic system—is the responsibility of the Registrar General. Proposed new section 25(5) of the Act places a responsibility on the Registrar General to provide the required forms that the local registration service will need to produce certified copies of entries—for example, birth and death certificates.
Clauses 94 to 98 amend the Births and Deaths Registration Act, with the overall effect of removing the provision for birth and death records to be kept on paper, and allowing them to be held in an online database. This is a positive move, with the potential to bring many benefits. First, it will improve the functioning of the registration system—for example, it will allow the Registrar General and the superintendent registrar to have immediate access to all birth and death entries as soon as they have been entered into the system. The changes will undoubtedly be important to families who are experiencing joy or loss, because they make registrations easier and more likely to be correct in the first instance, minimising unnecessary clarifications at what can often be a very difficult time. Indeed, one of the recommendations of the 2022 UK Commission on Bereavement’s landmark report, which looked at the key challenges facing bereaved people in this country, was that it should be possible to register deaths online.
It is great that the Government have chosen to pursue this change. However, despite it being the recommendation listed right next to online death registration, the Government have not used this opportunity to explore the potential of extending the Tell Us Once service, which is disappointing. Indeed, the existing Tell Us Once service has proved very helpful to bereaved people in reducing the administrative burden they face, by enabling them to inform a large number of Government and public sector bodies in one process, rather than forcing them to go through the same process time and again. However, private organisations are not included, and loved ones are still tasked with contacting organisations such as employers, energy and electricity companies, banks, telephone and internet providers, and more. At a time of emotional struggle, this is a huge administrative burden to place on the bereaved and leaves them vulnerable to other unsettling variables, such as communication barriers and potentially insensitive customer service.
The commission found that 61% of adult respondents reported experiencing practical challenges when notifying the organisations that need to be made aware of the death of a loved one. We are therefore disappointed that the Government have not explored whether the Bill could extend the policy to the private sector in order to further reduce the burden on grieving friends and families, and make the inevitably difficult process a little easier. Overall, however, the clauses will mark a positive change for families up and down the country, and we are pleased to see them implemented.
I merely say to the hon. Lady that, having used the Tell Us Once service myself in relation to the death of my mother not that long ago, I absolutely hear what she says about the importance of making the process as easy as possible. We will certainly consider what she says.
Question put and agreed to.
Clause 94 accordingly ordered to stand part of the Bill.
Congratulations to the hon. Member for Solihull.
Clauses 95 to 98 ordered to stand part of the Bill.
Schedule 11 agreed to.
Clause 99
Information standards for health and adult social care in England
Question proposed, That the clause stand part of the Bill.
With this it will be convenient to discuss the following:
That schedule 12 be the Twelfth schedule to the Bill.
Schedule 12 makes it clear that information standards published under section 250 of the Health and Social Care Act 2012, as amended by the Health and Care Act 2022, can include standards relating to information technology or IT services that are used or intended to be used in connection with the processing of information. The schedule extends the potential application of information standards to the providers of IT products and services to the health and adult social care sector for England. It also introduces mechanisms for monitoring and enforcing compliance by IT providers with information standards, and allows for the establishment of an accreditation scheme for IT products and services.
It is absolutely right that health and care information can flow in a standardised way between different IT systems and across organisational boundaries in the health and adult social care system in England, for the benefit of individuals and their healthcare outcomes. Information standards are vital to enabling that, alongside joint working between everyone involved in the processing of health and care information.
These changes will support the efficient and effective operation of the health and adult social care system by making it easier for people delivering care to access accurate and complete information when they need it, improve clinical decision making and, ultimately, improve clinical outcomes for patients. The clause is a crucial enabler for the creation of a modern health and care service with systems that are integrated and responsive to the needs of patients and users. I therefore commend it to the Committee.
Information standards govern how data can be shared and compared across a sector. They are important in every sector in which they operate, but particularly in health, where they are critical to enabling the information sharing and interoperability necessary for good patient outcomes across health and social care services. For many reasons, however, we do not have a standard national approach to health data; as such, patients receive a far from seamless experience between different healthcare services. The Bill’s technical amendments and clarifications of existing rules on information standards in health, and how they interact with IT and IT services, are small but good steps in the journey towards trying to resolve that.
Tom Schumacher of Medtronic told us in oral evidence that one of the problems faced by his organisation and NHS trusts is
“variability in technical and IT security standards.”
He suggested that harmonising those standards would be a “real opportunity,” since it would mean that
“each trust does not have to decide for itself which international standard to use and which local standard to use.”––[Official Report, Data Protection and Digital Information (No. 2) Public Bill Committee, 10 May 2023; c. 42, Q90.]
However, it is unclear how much headway these IT-related changes will make in providing that harmonisation, let alone the seamless service that patients so often call for.
I have one query that I hope the Minister can help with. MedConfidential has shared with us a concern that new section 251ZE of the Health and Social Care Act 2012 on accreditation of information technology, which is introduced by schedule 12, seems to imply that the Department of Health and Social Care and NHS England will have the power to set data standards in social care. MedConfidential says that would be a major policy shift, and that it seems unusual to implement such a shift through an otherwise unrelated Bill. Will the Minister write to me to clarify whether it is the Government’s intention to have DHSC and NHS England take over the information infrastructure of social care—and, if so, why they have come to that decision?
I am grateful to the hon. Lady for her support in general. I hear the concern that she expressed on behalf of the firm that has been in contact with her. We will certainly look into that, and I will be happy to let her have a written response in due course.
Mr Paisley, might I beg the Committee’s indulgence to correct the record? I incorrectly credited the hon. Member for Solihull with the private Member’s Bill, but it was in fact my hon. Friend the Member for Meriden (Saqib Bhatti). I apologise to him for getting his constituency wrong—
So we will take the congratulations away from Solihull and pass them elsewhere.
I am afraid that congratulations have been removed from Solihull and transferred to Meriden.
Better luck next time, Solihull! Thank you, Minister, for the correction.
Question put and agreed to.
Clause 99 accordingly ordered to stand part of the Bill.
Schedule 12 agreed to.
Ordered, That further consideration be now adjourned. —(Steve Double.)
(1 year, 6 months ago)
Public Bill Committees
With this it will be convenient to discuss the following:
Government amendments 44 and 45.
That schedule 13 be the Thirteenth schedule to the Bill.
Clauses 101 to 103 stand part.
We now turn to part 5 of the Bill. Clauses 100 to 103 and schedule 13 will establish a body corporate, the Information Commission, to replace the existing regulator, the Information Commissioner, which is currently structured as a corporation sole. I should make it clear that the clauses will make no changes to the regulator’s role and responsibilities; all the functions that rest with the Information Commissioner will continue to sit with the new Information Commission.
Clause 100 will establish a body corporate, the Information Commission, to replace the existing regulator, the Information Commissioner. The commission will be governed by an independent board, with chair and chief executive roles, thereby spreading the responsibilities of the Information Commissioner across a larger number of people.
Clause 101 will abolish the office of the Information Commissioner and amend the Data Protection Act 2018 accordingly. To ensure an orderly transfer of functions, the Information Commissioner’s Office will not be abolished until the new body corporate, the Information Commission, is established.
Clause 102 provides for all regulatory and other functions of the Information Commissioner to be transferred to the new body corporate, the Information Commission, once it is established. The clause also provides for references to the Information Commissioner in enactments or other documents to be treated as references to the Information Commission, where appropriate, as a result of the transfer of functions to the new Information Commission.
Clause 103 will allow the Secretary of State to make a scheme for the transfer of property, rights and liabilities, including rights and liabilities relating to employment contracts, from the commissioner to the new commission. The scheme may transfer property such as IT equipment or office furniture, or transfer staff currently employed by the commissioner to the commission. The transfer scheme will be designed to ensure continuity and facilitate a seamless transition to the new Information Commission.
Schedule 13 will insert a new schedule 12A to the Data Protection Act 2018, which describes the nature, form and governance structure of the new body corporate, the Information Commission. The commission will be governed by an independent statutory board, which will consist of a chair and other non-executive members, as well as executive members including a chief executive. The new structure formalises aspects of the existing governance arrangements of the Information Commissioner’s Office and brings the ICO in line with how other UK regulators, such as Ofcom and the Financial Conduct Authority, are governed. The chair of the new commission will be appointed by His Majesty by letters patent on the recommendation of the Secretary of State, as is currently the case for the commissioner.
Schedule 13 also provides for the current Information Commissioner to transfer to the role of chair of the Information Commission for the remainder of their term. I put on record the Government’s intention to preserve the title of Information Commissioner in respect of the chair, in acknowledgment of the fact that the commissioner’s brand is recognised and valued both domestically and internationally. Other non-executive members will be appointed by the Secretary of State, and the chief executive will be appointed by the non-executive members in consultation with the Secretary of State.
Government amendment 45 will allow the chair to appoint the first chief executive on an interim basis for a term of up to 24 months, which will minimise any delay in the transition from the commissioner to the new commission. As drafted, the Bill provides that the chief executive of the commission will be appointed by the non-executive members once they are in place, in consultation with the Secretary of State. The transition from the commissioner to the new Information Commission cannot take place until the board is properly constituted, with, as a minimum, a chair, another non-executive member and a chief executive in place. That requirement would be likely to delay the transition, as the appointments of the non-executive members by the Secretary of State and then of the chief executive would need to take place consecutively.
Amendment 44 is a minor consequential amendment to paragraph 3(3)(a) of proposed new schedule 12A, making it clear that the interim chief executive is appointed as an executive member.
The amendments seek to minimise any delay in the transfer of functions to the new commission by enabling the appointment of the chief executive to take place in parallel with the appointments process for non-executive members. The appointment of the interim chief executive will be made on the basis of fair and open competition and in consultation with the Secretary of State. I commend clauses 100 to 103, schedule 13 and Government amendments 44 and 45 to the Committee.
It is a pleasure to serve under your chairship once again, Mr Hollobone. The clauses that restructure the Information Commissioner’s Office are among those that the Opposition are pleased to welcome in the Bill.
The Information Commissioner is the UK’s independent regulator for data protection and freedom of information under the Data Protection Act 2018 and the Freedom of Information Act 2000. Under the current system, as the Minister outlined, the Information Commissioner’s Office is a corporation sole, meaning that one person has overall responsibility for data protection and freedom of information, with a group of staff supporting them. However, as the use of data in our society has grown, so too has the ICO, from a team of 10 in 1984 to an organisation with more than 500 staff.
In that context, the corporation sole model is obviously not fit for purpose. Clauses 100 to 103 recognise that: they propose changes that will modernise the Information Commissioner’s Office, turning it into the Information Commission by abolishing the corporation sole and replacing it with a body corporate. It is absolutely right that those changes be made, transforming the regulator into a commission with a broader governance structure and a board of executive and non-executive members, among other key changes. That will bring the ICO in line with other established UK regulators such as Ofcom and the Financial Conduct Authority, reflect the fact that the ICO is not just a small commissioner’s office, and ensure that it is equipped to deal with the volume of work for which it has responsibility.
It is essential that the ICO remains independent and fair. We agree that moving from an individual to a body will ensure greater integrity, although the concerns that I have raised about the impact of earlier clauses on the ICO’s independence certainly remain. Overall, however, we are pleased that the Government recognise that the ICO must be brought in line with other established regulators and are making much-needed changes, which we support.
Question put and agreed to.
Clause 100 accordingly ordered to stand part of the Bill.
Schedule 13
The Information Commission
Amendments made: 44, in schedule 13, page 195, line 21, after “members” insert
“or in accordance with paragraph 23A”.
This amendment is consequential on Amendment 45.
Amendment 45, in schedule 13, page 204, line 6, at end insert—
“Transitional provision: interim chief executive
23A (1) The first chief executive of the Commission is to be appointed by the chair of the Commission.
(2) Before making the appointment the chair must consult the Secretary of State.
(3) The appointment must be for a term of not more than 2 years.
(4) The chair may extend the term of the appointment but not so that the term as extended is more than 2 years.
(5) For the term of appointment, the person appointed under sub-paragraph (1) is “the interim chief executive”.
(6) Until the expiry of the term of appointment, the powers conferred on the non-executive members by paragraph 11(2) and (3) are exercisable in respect of the interim chief executive by the chair (instead of by the non-executive members).
(7) In sub-paragraphs (5) and (6), the references to the term of appointment are to the term of appointment described in sub-paragraph (3), including any extension of the term under sub-paragraph (4).”—(Sir John Whittingdale.)
The Bill establishes the Information Commission. This new paragraph enables the chair of the new body, in consultation with the Secretary of State, to appoint the first chief executive (as opposed to the appointment being made by non-executive members). It also enables the chair to determine the terms and conditions, pay, pensions etc relating to the appointment.
Schedule 13, as amended, agreed to.
Clauses 101 to 103 ordered to stand part of the Bill.
Clause 104
Oversight of retention and use of biometric material
Question proposed, That the clause stand part of the Bill.
Clause 104 will repeal the role of the Biometrics Commissioner and transfer the casework functions to the Investigatory Powers Commissioner. There is an extensive legal framework to ensure that the police can make effective use of biometrics, for example as part of an investigation to quickly and reliably identify suspects, while maintaining public trust. That includes the Police and Criminal Evidence Act 1984, which sets out detailed rules on DNA and fingerprints, and the Data Protection Act 2018, which provides an overarching framework for the processing of all personal data.
The oversight framework is complicated, however, and there are overlapping responsibilities. The Biometrics Commissioner currently has specific oversight responsibilities just for police use of DNA and fingerprints, while the Information Commissioner’s Office regulates the use of all personal data, including biometrics, by any organisation, including the police. Clause 104 will simplify the framework by removing the overlap, leaving the ICO to provide independent oversight and transferring the casework functions to another existing body.
The casework involves extending retention periods in certain circumstances, particularly on national security grounds, and is quasi-judicial in nature. That is why clause 104 transfers those functions to the independent Investigatory Powers Commissioner, who has the necessary expertise, and avoids the conflict of interest that could occur if the functions were transferred to the ICO as regulator. Transparency in police use of biometrics is essential to retaining public trust and will continue through the annual reports of the Forensic Information Databases Service strategy board, the Investigatory Powers Commissioner and the ICO. I commend clause 104 to the Committee.
I will speak in more detail about my more general views on the oversight of biometrics, particularly their private use, when we come to new clauses 13, 14 and 15. However, as I look specifically at clauses 104 and 105, which seek to abolish the currently combined offices of Biometrics Commissioner and Surveillance Camera Commissioner, I would like to draw on the direct views of the Information Commissioner. In his initial response to “Data: a new direction”, which proposed absorbing the functions of the Biometrics Commissioner and Surveillance Camera Commissioner into the ICO, the commissioner said that there were some functions that,
“if absorbed by the ICO, would almost certainly result in their receiving less attention”.
Other functions, he said,
“simply do not fit with even a reformed data protection authority”
with there being
“far more intuitive places for them to go.”
That was particularly so, he said, with biometric casework.
It is therefore pleasing that as a result of the consultation responses the Government have chosen to transfer the commissioner’s biometric functions not to the ICO but to the Investigatory Powers Commissioner, acknowledging the relevant national security expertise that it can provide. However, in written evidence to this Committee, the commissioner reiterated his concern about the absorption of his office’s functions, saying that work is currently being undertaken within its remit that, under the Bill’s provisions, would be unaccounted for.
Given that the commissioner’s concerns clearly remain, I would be pleased if the Minister provided in due course a written response to that evidence and those concerns. If not, the Government should at the very least undertake their own gap analysis to identify areas that will not be absorbed under the current provisions. It is important that this Committee and the office of the Biometrics and Surveillance Camera Commissioner can be satisfied that all the functions will be properly delegated and given the same degree of attention wherever they are carried out. Equally, it is important that those who will be expected to take on these new responsibilities are appropriately prepared to do so.
I am happy to provide the further detail that the hon. Lady has requested.
Question put and agreed to.
Clause 104 accordingly ordered to stand part of the Bill.
Clause 105
Removal of provision for regulation of CCTV etc
I beg to move amendment 123, in clause 105, page 128, line 22, leave out subsections (2) and (3).
With this it will be convenient to discuss the following:
Clause stand part.
New clause 17—Transfer of functions to the Investigatory Powers Commissioner’s Office—
“The functions of the Surveillance Camera Commissioner are transferred to the Investigatory Powers Commissioner.”
Society is witnessing an unprecedented acceleration in the capability and reach of surveillance technologies. Such an acceleration calls for protections and safeguards. Clause 105, however, does the opposite and seeks to abolish both the office of the Surveillance Camera Commissioner and its functions. The explanatory notes to the Bill state that the functions of the office of the Surveillance Camera Commissioner are duplicated and covered by the Information Commissioner’s Office and its CCTV code of practice. That is not the case: the code is advisory only and is primarily concerned with data processes, not with actual surveillance.
Amendment 123 and new clause 17 would retain the functions of the Surveillance Camera Commissioner but transfer them to the Investigatory Powers Commissioner’s Office, thus preserving those necessary safeguards. The IPCO already scrutinises Government activity and deals with the covert use of surveillance cameras, so dealing with overt cameras as well would be a natural extension of its function.
Having outlined my broad concerns about clause 105 when I spoke to clause 104, I will focus briefly on the specific concern raised by the hon. Member for Glasgow North West, which is that the Surveillance Camera Commissioner’s functions will not be properly absorbed.
In evidence to the Committee, the commissioner outlined a number of non-data protection functions in relation to public space surveillance that their office currently carries out, but that, they believe, the Bill does not make provision to transfer. They cite the significant work that their office has undertaken to ensure that Government Departments are able
“to cease deploying visual surveillance systems onto sensitive sites where they are produced by companies subject to the National Intelligence Law of the People’s Republic of China”,
following a November 2022 instruction from the Chancellor of the Duchy of Lancaster. The commissioner says that such non-data protection work, which has received international acclaim, is not addressed in the Bill.
I am therefore hopeful that the explicit mention in amendment 123 that the functions of the Surveillance Camera Commissioner will be transferred provides a backstop to ensure that all the commissioner’s duties, including the non-data protection work, are accounted for. If the amendment is not accepted, an in-depth gap analysis should be conducted, as argued previously, with a full response issued to the commissioner’s evidence to ensure that every one of the functions is properly and appropriately absorbed.
I understand the argument that the Surveillance Camera Commissioner’s powers would be better placed with the Investigatory Powers Commissioner, rather than the ICO. Indeed, the commissioner’s evidence to the Committee referenced the interim findings of an independent report it had commissioned, as the hon. Member for Glasgow North West just mentioned. The report found that most of the gaps left by the Bill could be addressed if responsibility for the surveillance camera code moved under the IPCO, harmonising the oversight of traditional and remote biometrics.
I end by pointing to a recent example that shows the value of proper oversight of the use of surveillance. Earlier this year, following a referral from my hon. Friend the Member for Bristol North West (Darren Jones), the ICO found that a school in Bristol had unlawfully installed covert CCTV cameras at the edge of its playing fields. Since then, the Surveillance Camera Commissioner has been responding to freedom of information requests on the matter, with more information about the incident emerging as recently as yesterday. It is absolutely unacceptable that a school should film people without their knowledge. The Surveillance Camera Commissioner is a vital cog in the machinery of ensuring that such incidents are dealt with appropriately. For such reasons, we must preserve its functions.
In short, I am in no way opposed to the simplification of oversight in surveillance or biometrics, but I hope to see it done in an entirely thorough way, so that none of the current commissioner’s duties get left behind or go unseen.
I am grateful to the hon. Members for Glasgow North West and for Barnsley East for the points they have made. The hon. Member for Glasgow North West, in moving the amendment, was right to say that the clause as drafted abolishes the role of the Surveillance Camera Commissioner and the surveillance camera code that the commissioner promotes compliance with. The commissioner and the code, however, are concerned only with police and local authority use in England and Wales. Effective, independent oversight of the use of surveillance camera systems is critical to public trust. There is a comprehensive legal framework for the use of such systems, but the oversight framework is complex and confusing.
The ICO regulates the processing of all personal data by all UK organisations under the Data Protection Act; that includes surveillance camera systems operated by the police and local authorities, and the ICO has issued its own video surveillance guidance. That duplication is confusing for both the operators and the public and it has resulted in multiple and sometimes inconsistent guidance documents covering similar areas. The growing reliance on surveillance from different sectors in criminal investigations, such as footage from Ring doorbells, means that it is increasingly important for all users of surveillance systems to have clear and consistent guidance. Consolidating guidance and oversight will make it easier for the police, local authorities and the public to understand. The ICO will continue to provide independent regulation of the use of surveillance camera systems by all organisations. Indeed, the chair of the National Police Data Board, who gave evidence to the Committee, said that that will significantly simplify matters and will not reduce the level of oversight and scrutiny placed upon the police.
Amendment 123, proposed by the hon. Member for Glasgow North West, would retain the role of the Surveillance Camera Commissioner and the surveillance camera code. In our view, that would simply continue the complexity and duplication with the ICO’s responsibilities. Feedback that we received from our consultation showed broad support for simplifying the oversight framework, with consultees agreeing that the roles and responsibilities, in particular in relation to new technologies, were unclear.
The hon. Lady went on to talk about the oversight going beyond that of the Information Commissioner, but I point out that there is a comprehensive legal framework outside the surveillance camera code. That includes not only data protection, but equality and human rights law, to which the code cross-refers. The ICO and the Equality and Human Rights Commission will continue to regulate such activities. There are other oversight bodies for policing, including the Independent Office for Police Conduct and His Majesty’s inspectorate of constabulary, as well as the College of Policing, which provide national guidance and training.
The hon. Lady also specifically mentioned the remarks of the Surveillance Camera Commissioner about Chinese surveillance cameras. I will simply point out that the responsibility for oversight, which the ICO will continue to have, is not changed in any way by the Bill. The Information Commissioner’s Office continues to regulate all organisations’ use of surveillance cameras, and it has issued its own video surveillance guidance.
New clause 17 would transfer the functions of the commissioner to the Investigatory Powers Commissioner. As I have already said, we believe that that would simply continue to result in oversight resting in two different places, which is an unnecessary duplication. The Investigatory Powers Commissioner’s Office oversees activities that are substantially more intrusive than those relating to overt surveillance cameras. IPCO’s existing work requires it to oversee more than 600 public authorities, as well as the use of numerous powers under different pieces of legislation. That requires a high level of expertise and specialisation to ensure effective oversight.
For those reasons, we believe that the proposals in the clause to bring the oversight functions under the responsibility of the Information Commissioner’s Office will not result in any reduction in oversight, but will result in the removal of duplication and greater clarity. On that basis, I am afraid that I am unable to accept the amendment, and I hope that the hon. Lady will consider withdrawing it.
I thank the Minister for responding to my amendments. However, we are going from specialist oversight to somewhat more generalist oversight, which cannot be good when we are dealing with such fast-moving technology. I will withdraw my amendment for the moment, but I reserve the right to bring it back at a later stage. I beg to ask leave to withdraw the amendment.
Amendment, by leave, withdrawn.
Clause 105 ordered to stand part of the Bill.
Clause 106
Oversight of biometrics databases
I beg to move amendment 119, in clause 106, page 130, line 7, leave out
“which allows or confirms the unique identification of that individual”.
This amendment is intended to ensure that the definition of biometric data in the Bill includes cases where that data is used for the purposes of classification (and not just unique identification).
With this it will be convenient to discuss new clause 8—Processing of special categories of personal data: biometric data—
“(1) Article 9 of UK GDPR is amended as follows.
(2) In paragraph (1), after “biometric data”, omit “for the purpose of uniquely identifying a natural person.”
This new clause would extend the same protections that are currently in place for the processing of biometric data for the purposes of identification to the processing of all biometric data, including if the processing is for the purpose of classification (i.e. identification as part of a group, rather than identification as an individual).
Biometric data is uniquely personal. It captures our faces, fingerprints, walking style, tone of voice, expressions and all other data derived from measures of the human body. Under current UK law, biometric data is defined as
“personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person”.
Furthermore, biometric data counts as special category personal data only when it is used or collected for
“the purpose of uniquely identifying a natural person”.
However, as the use of biometrics grows, it is no longer confined to identification; indeed, there is a growing set of biometric technologies used to categorise or classify people on the basis of traits thought to be statistically related or correlated, however tenuously, with particular characteristics. For instance, biometric systems have been developed that attempt to infer people’s sexuality from their facial geometry, or to judge criminality from pictures of people’s faces. Other biometric classification systems attempt to judge people’s internal emotional state or intentions from their biometrics, such as tone of voice, gait or facial expressions, a practice known as emotion recognition. For example, employers have used facial expression and tone analysis to decide who should be selected for a job, using biometric technologies to score candidates on characteristics such as enthusiasm, willingness to learn, conscientiousness, responsibility and personal stability.
Members of the Citizens’ Biometrics Council, convened by the Ada Lovelace Institute in 2020 to build a deeper understanding of the British public’s views on biometric technologies, have expressed concerns about these use cases. Members suggest that these technologies classify people according to reductive, ableist and stereotypical characteristics, harming people’s wellbeing and risking their mischaracterisation in databases and data-driven systems. Further, these systems often rest on pseudoscientific assumptions that link external features to other traits, meaning that their underlying bases are often not valid, reliable or accurate. For example, significant evidence suggests that it is not possible accurately to infer emotion from facial expressions. Despite that, existing data protection law would not consider biometric data collected for those purposes to be special category data, and would therefore not give data subjects the highest level of safeguards in these contexts.
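By way of illustration only, the distinction at issue here, identification as an individual versus classification as part of a group, can be sketched in a few lines of code. Everything in the sketch below is invented for the example (the embeddings, the watchlist, the threshold and the trait weights) and does not correspond to any real system.

```python
# Illustrative only: the same biometric measurement (an abstract face
# "embedding" vector) can feed two very different uses. Only the first,
# 1:N matching against known individuals, is "unique identification" in
# the sense of the current legal definition; the second is classification.
import numpy as np

def identify(embedding, watchlist, threshold=0.8):
    """1:N identification: does this measurement match a known individual?"""
    sims = watchlist @ embedding / (
        np.linalg.norm(watchlist, axis=1) * np.linalg.norm(embedding))
    best = int(np.argmax(sims))
    return best if sims[best] >= threshold else None  # index of match, or None

def classify(embedding, trait_weights):
    """Classification: infer a group trait or score (for example a supposed
    'emotion' or suitability rating) from the very same measurement."""
    return float(trait_weights @ embedding)

rng = np.random.default_rng(0)
watchlist = rng.normal(size=(5, 128))                    # five enrolled people
probe = watchlist[2] + rng.normal(scale=0.05, size=128)  # noisy re-capture
print(identify(probe, watchlist))                        # 2: a unique match
print(classify(probe, rng.normal(size=128)))             # an arbitrary score
```

Both functions consume the same biometric data, yet under the definition quoted above only the first attracts the special category safeguards, which is precisely the gap the amendment seeks to close.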
Clause 106 makes changes to the national DNA database strategy board, which provides oversight of the operation of the national DNA database, including setting policies for access and use by the police. Amendment 119 would seem to extend the power to widen the board’s potential scope beyond biometrics databases for the purpose of identification, to include the purpose of classification.
The police can process data only for policing purposes. It is not clear what policing purpose there would be in being able to classify, for example, emotions or gender, even assuming it was proven to be scientifically robust, or what sort of data would be on such a database. Even if one were developed in the future, it is likely to need knowledge, skills and resources very different from what is needed to oversee a database that identifies and eliminates suspects based on biometric identification, so it would probably make sense for a different body to carry out any oversight.
New clause 8 aims to make changes in a similar way to amendment 119 in relation to the definition of biometric data for the purposes of article 9 of the GDPR. As the GDPR is not concerned with the police’s use of biometric data for law enforcement purposes, the new clause would apply to organisations that are processing biometric data for general purposes. The aim seems to be to ensure that enhanced protections afforded by GDPR to biometric data used for unique identification purposes also apply to biometric data that is used for classification or categorisation purposes.
The hon. Lady referred to the Ada Lovelace Institute’s comments on these provisions, and its 2022 “Countermeasures” report on biometric technologies, but we are not convinced that such a change is necessary. One example in the report was using algorithms to make judgments that prospective employees are bored or not paying attention, based on their facial expressions or tone of voice. Using biometric data to draw inferences about people, using algorithms or otherwise, is not as invasive as using biometric data uniquely to identify someone. For example, biometric identification could include matching facial images caught on closed circuit television to a centrally held database of known offenders.
Furthermore, using biometric data for classification or categorisation purposes is still subject to the general data protection principles in the UK GDPR. That includes ensuring that there is a lawful ground for the processing, that the processing is necessary and proportionate, and is fair and transparent to the individuals concerned. If algorithms are used to categorise and make significant decisions about people based on their biometric characteristics, including in an employment context, they will have the right to be given information about the decision, and to obtain human intervention, as a result of the measures we previously debated in clause 11.
We therefore see a distinction between the use of biometric information for identification purposes and the more general use for classification that the hon. Lady described. We believe that there are sufficient safeguards already in place regarding the possible use of biometric data for classification, so, given what I have said, I hope that she will consider withdrawing the amendment.
I am grateful to the Minister for his comments. We will be speaking about the private uses of biometric data later, so I beg to ask leave to withdraw my amendment.
Amendment, by leave, withdrawn.
Question proposed, That the clause stand part of the Bill.
DNA and fingerprints are key tools in helping the police to identify and eliminate suspects quickly and accurately by comparing evidence left at crime scenes with the appropriate files on the national databases. As I previously set out, clause 106 makes changes to the National DNA Database Strategy Board. The board provides oversight of the operation of the database, including setting policies for access and use by the police.
These reforms change the scope of the board to make it clear that it should provide similar oversight of the police fingerprint database, which operates under similar rules. The change brings the legislation up to date with the board’s recently published governance rules. Clause 106 also updates the name of the board to the Forensic Information Databases Strategy Board, to better reflect the broadened scope of its work. We are also taking this opportunity to simplify and future-proof oversight of national police biometric databases. While DNA and fingerprints are well established, biometrics is an area of rapid technological development, including, for example, the growing use of iris, face and voice recognition. Given the pace of technological change in this area and the benefits of consistent oversight, clause 106 also includes a power for the Secretary of State to make regulations changing the board’s scope, for example by adding new biometric databases to the board’s remit or removing databases that are no longer used. Such regulations would be subject to the affirmative procedure.
For these reasons, I commend the clause to the Committee.
Clause 106 will primarily increase the scope of the Forensic Information Databases Strategy Board to provide oversight of the national fingerprint database. However, there are also provisions enabling the Secretary of State to add or remove a biometric database that the board oversees, using the affirmative procedure. I would therefore like to ask the Minister whether they have any plans to use these powers regarding any particular databases—or whether this is intended as a measure for future-proofing the Bill in the case of changed circumstances?
I would also like to refer hon. Members to the remarks that I have made throughout the Bill that emphasise a need for caution when transferring the ability to change regulation further into the hands of the Secretary of State alone.
I would add only that this is an area where technology is moving very fast, as I referred to earlier. We think it is right to put in place this provision, to allow an extension if it becomes necessary—though I do not think we have any current plans. It is future-proofing of the Bill.
Question put and agreed to.
Clause 106 accordingly ordered to stand part of the Bill.
Clause 107
Regulations
Question proposed, That the clause stand part of the Bill.
Clause 107 will give the Secretary of State a regulation-making power to make consequential amendments to other legislation. The power enables amendments to this Bill itself where such amendments are consequential to the abolition of the Information Commissioner and his replacement by the new Information Commission. Such provision is needed because there are a number of areas where data protection legislation will need to be updated as a consequence of the Bill. This is a standard power, commonly included in Bills to ensure that wider legislation is updated where necessary as a result of new legislation. For example, references to “the Commissioner” in the Data Protection Act 2018 will no longer be accurate, given changes to the governance structure of the Information Commissioner’s Office within the Bill, so consequential amendments will be required to that Act.
Clause 108 outlines the form and procedure for making regulations under the powers in the Bill: they are to be made by statutory instrument. Where regulations in the Bill are subject to the affirmative resolution procedure, they may not be made unless a draft of the statutory instrument has been laid before Parliament and approved by a resolution of each House. That provision is needed because the Bill introduces new regulation-making powers, which are necessary to support the Bill’s policy objectives. For example, powers in part 3 of the Bill replace an existing statutory framework with a new, enhanced one.
Clause 109 explains the meaning of references to “the 2018 Act” and “the UK GDPR” in the Bill. Such provision is needed to explain the meaning of those two references. Clause 110 authorises expenditure arising from the Bill. That provision is needed to confirm that Parliament will fund any expenditure incurred under the Bill by the Secretary of State, the Treasury or a Government Department. It requires a money resolution and a Ways and Means resolution, both of which were passed in the House of Commons on 17 April.
Clause 111 outlines the territorial extent of the Bill. Specifically, the clause states that the Bill extends to England and Wales, Scotland and Northern Ireland, with some exceptions. Much of the Bill, including everything on data protection, is reserved policy. In areas where the Bill legislates on devolved matters, we are working with the devolved Administrations to secure legislative consent motions. Clause 112 gives the Secretary of State a regulation-making power to bring the Bill’s provisions into force. Some provisions, listed in subsection (2), come into force on the date of Royal Assent. Other provisions, listed in subsection (3), come into force two months after Royal Assent. Such provision is needed to outline when the Bill’s provisions will come into force.
Clause 113 gives the Secretary of State a regulation-making power to make transitional, transitory or saving provisions that may be needed in connection with any of the Bill’s provisions coming into force. For example, provision might be required to clarify that the Information Commissioner’s new power to refuse to act on complaints will not apply where such complaints have already been made prior to commencement of the relevant provision. Clause 114 outlines the short title of the Bill. That provision is needed to confirm the title once the Bill has been enacted. I commend clauses 107 to 114 to the Committee.
The clauses set out the final technical provisions necessary in order for the Bill to be passed and enacted effectively, and for the most part are standard. I will focus briefly on clause 107, however, as a number of stakeholders including the Public Law Project have expressed concern that, as a wide Henry VIII power, it may give the Secretary of State the power to make further sweeping changes to data protection law. Can the Minister provide some assurance that the clause will allow for the creation only of further provisions that are genuinely consequential to the Bill and necessary for its proper enactment?
It is my belief that this would not have been such a concern to civil society groups had there not been multiple occasions throughout the Bill when the Secretary of State made grabs for power, concentrating the ability to make further changes to data protection legislation in their own hands. I am disappointed, though of course not surprised, that the Government have not accepted any of my amendments to help to mitigate those powers with checks and balances involving the commissioner. However, keeping the clause alone in mind, I look forward to hearing from the Minister how the powers in clause 107 will be restricted and used.
We have previously debated the efficacy of the affirmative resolution procedure. I recognise that the hon. Lady is not convinced about how effective it is in terms of parliamentary scrutiny; we will beg to differ on that point. Although the power in clause 107 allows the Secretary of State to amend Acts of Parliament, I can confirm that that is just to ensure the legal clarity of the text. Without that power, data protection legislation would be harder to interpret, thereby reducing people’s understanding of the legislation and their ability to rely on the law.
Question put and agreed to.
Clause 107 accordingly ordered to stand part of the Bill.
Clause 108
Regulations
I beg to move, That the clause be read a Second time.
In order for the public to have trust in algorithmic decision making, particularly where used by the Government, they must be able to understand how and when it is being used as a basic minimum. That is something that the Government themselves previously recognised by including a proposal to make transparency reporting on the use of algorithms in decision making for public sector bodies compulsory in their “Data: a new direction” consultation. Indeed, the Government have already made good progress on bringing together a framework that will make that reporting possible. The algorithmic transparency recording standard they have built provides a decent, standardised way of recording and sharing information about how the public sector uses algorithmic tools. There is also full guidance to accompany the standard, giving public sector bodies a clear understanding of how to complete transparency reports, as well as a compilation of pilot reports that have already been published, providing a bank of examples.
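To make the idea concrete, a purely hypothetical transparency record under such a standard might look something like the sketch below; the field names and values are invented for illustration and are not the standard’s actual schema.

```python
# A hypothetical sketch of the kind of structured record a public body
# might publish about an algorithmic tool under a transparency standard.
# All field names and values here are invented for illustration.
transparency_record = {
    "tool_name": "Waiting List Prioritiser",
    "organisation": "Example Metropolitan Borough Council",
    "purpose": "Rank social housing applications using a points-based score",
    "role_in_decision": "Decision support only; a caseworker makes the final call",
    "data_used": ["household size", "current housing conditions", "time on list"],
    "review_route": "Applicants may request a manual reassessment",
    "contact": "transparency@example.gov.uk",  # invented address
}

for field, value in transparency_record.items():
    print(f"{field}: {value}")
```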
However, despite that, and despite the majority of consultation respondents agreeing with the proposed compulsory reporting for public sector bodies, citing benefits of increased trust, accountability and accessibility for the public, the Government chose not to go ahead with the legislative change. Relying on self-regulation in the early stages of the scheme is understandable, but now that successful pilots have been conducted, from the Cabinet Office to West Midlands police, it is unclear why the Government choose not to commit to the very standard they created. This is a clear missed opportunity, and the standard runs the risk of failing altogether if there is no legislative requirement to use it.
As the use of such algorithms grows, particularly considering further changes contained in clause 11, transparency around Government use of big data and automated decision-making tools will only increase in importance and value—people have a right to know how they are being governed. As the Public Law Project argues, transparency also has a consequential value; it facilitates democratic consensus building about the appropriate use of new technologies, and it allows for full accountability when things go wrong.
Currently, in place of that accountability, the Public Law Project has put together its own register called “Tracking Automated Government”, or TAG. Using mostly freedom of information requests, the register tracks the use of 42 algorithmic tools and rates their transparency. Of the 42, just one ranked as having high transparency. Among those with low transparency are asylum estates analysis, used to help the Home Office decide where asylum interviews should take place, given the geographical distribution of asylum seekers across the asylum estate; the general matching service and fraud referral and intervention management system, used as part of the efforts of the Department for Work and Pensions to combat benefit fraud and error—for example, by identifying claimants who may potentially have undisclosed capital or other income; and housing management systems, such as that in Wigan Metropolitan Borough Council, which uses a points-based system to prioritise social housing waiting lists.
We all want to see Government modernising and using new technology to increase efficiency and outcomes, but if an algorithmic tool impacts our asylum applications, our benefits system and the ability of people to gain housing, the people affected by those decisions deserve at the very least to know how they are being made. If the public sector sets the right example, private companies may choose to follow in the future, helping to improve transparency even further. The framework is ready to go and the benefits are clear; the amendment would simply make progress certain by bringing it forward as part of the legislative agenda. It is time that we gave people the confidence in public use of algorithms that they deserve.
I thank the hon. Member for Barnsley East for moving new clause 9. We completely share her wish to ensure that Government and public authorities provide transparency in the way they use algorithmic tools that process personal data, especially when they are used to make decisions affecting members of the public.
The Government have made it our priority to ensure that transparency is being provided through the publication of the algorithmic transparency recording standard. That has been developed to assist public sector organisations in documenting and communicating their use of algorithms in decision making that impacts members of the public. The focus of the standard is to provide explanations of the decisions taken using automated processing of data by an algorithmic system, rather than all data processing.
The standard has been endorsed by the Government’s Data Standards Authority, which recommends the standards, guidance and other resources that Government Departments should follow when working on data projects. Publishing the standard fulfils commitments made in both the national data strategy 2020 and the national artificial intelligence strategy. Since its publication, the standard has been piloted with a variety of public sector organisations across the UK, and the published records can be openly accessed via gov.uk. It is currently being rolled out more widely across the public sector.
Although the Government have made it a priority to advance work on algorithmic transparency, the algorithmic transparency recording standard is still a maturing standard that is being progressively promoted and adopted. It is evolving alongside policy thinking and Government understanding of the complexities, scope and risks around its use. We believe that enshrining the standard into law at this point of maturity could hinder the ability to ensure that it remains relevant in a rapidly developing technology field.
Therefore, although the Government sympathise with the intention behind the new clause, we believe it is best to continue with the current roll-out across the public sector. We remain committed to advancing algorithmic transparency, but we do not intend to take forward legislative change at this time. For that reason, I am unable to accept the new clause as proposed by the Opposition.
I am grateful to the Minister, but I am still confused about why, having developed the standard, the Government are not keen to put it into practice and into law. He just said that he wants to keep it relevant; he could use some of the secondary legislation that he is particularly keen on if he accepted the new clause. As I outlined, this issue has real-life consequences, whether for housing, asylum or benefits. In my constituency, many young people were affected by the exam algorithm scandal. For those reasons, I would like to push the new clause to a vote.
Question put, That the clause be read a Second time.
I beg to move, That the clause be read a Second time.
Overall, the aim of the GDPR is to ensure the effective and complete protection of data subjects. That protection cannot be considered effective or complete if people cannot seek justice, remedy and repair if an organisation processes personal data unlawfully. Therefore, there must be suitable methods of redress for all data and decision subjects in any suitable data protection regime. Bringing any kind of legal case is not something people take lightly. Cases can be lengthy, costly and, in many lower-level cases, seem disproportionate to the loss suffered or remedy available. That is no different in cases surrounding the misuse of personal data.
As the law stands, article 80(1) of the EU GDPR has been implemented in the UK, meaning a data subject has the right to mandate a not-for-profit body or organisation to lodge a complaint on their behalf. That means, for example, a charity can help an individual to bring forward a case where they have been materially impacted by a data breach. Such provisions help to ensure that those who have suffered an infringement can be supported in lodging a claim, and are not disincentivised by a lack of understanding, resources or cost. However, the UK has not yet adopted article 80(2), which goes one step further, allowing those same organisations to lodge a complaint independently of a data subject’s mandate.
I am grateful to the hon. Lady for setting out the purposes of the new clause. As she has described, it aims to require the Secretary of State to use regulation-making powers under section 190 of the Data Protection Act to implement article 80(2) of the UK GDPR. It would enable non-profit organisations with an expertise in data protection law to make complaints to the Information Commissioner and/or take legal action against data controllers without the specific authorisation of the individuals who have been affected by data breaches. Relevant non-profit organisations can already take such actions on behalf of individuals who have specifically authorised them to do so under provisions in article 80(1) of the UK GDPR.
In effect, the amendment would replace the current discretionary powers in section 190 of the Data Protection Act with a duty for the Secretary of State to legislate to bring those provisions into force soon after the Bill has received Royal Assent. Such an amendment would be undesirable for a number of reasons. First, as required under section 189 of the Data Protection Act, we have already consulted and reported to Parliament on proposals of that nature, and we concluded that there was not a strong enough case for introducing new legislation.
Although the Government’s report acknowledged that some groups in society might find it difficult to complain to the ICO or bring legal proceedings of their own accord, it pointed out that the regulator can and does investigate complaints raised by civil society groups even when they are not made on behalf of named individuals. Big Brother Watch’s recent complaints about the use of live facial recognition technology in certain shops in the south of England are an example of that.
Secondly, the response concluded that giving non-profit organisations the right to bring compensation claims against data controllers on behalf of individuals who had not authorised them to do so could prompt the growth of US-style lawsuits on behalf of thousands or even millions of customers at a time. In the event of a successful claim, each individual affected by the alleged breach could be eligible for a very small payout, but the consequences for the businesses could be hugely damaging, particularly in cases that involved little tangible harm to individuals.
Some organisations could be forced out of business or prompted to increase prices to recoup costs. The increase in litigation costs could also increase insurance premiums. A hardening in the insurance market could affect all data controllers, including those with a good record of compliance. For those reasons, we do not believe that it is right to extend the requirement on the Secretary of State to allow individuals to bring actions without the consent of those affected. On that basis, I ask the hon. Lady to withdraw the motion.
Data is increasingly used to make decisions about us as a collective, so it is important that GDPR gives us collective rights to reflect that, rather than the system being designed only for individuals to seek redress. For those reasons, I will press my new clause to a vote.
Question put, That the clause be read a Second time.
I beg to move, That the clause be read a Second time.
Privacy enhancing technologies are technologies and techniques that can help organisations to share and use people’s data responsibly, lawfully and securely. They work most often by minimising the amount of data used, maximising data security—for example by encrypting or anonymising personal information—or empowering individuals. One of the best-known examples of a PET is synthetic data: data that is modelled to reproduce the statistical properties of a real dataset when taken as a whole. That type of data could allow third-party researchers or processors to analyse the statistical outcomes of the data without having access to the original set of personal data, or any information about identifiable living individuals.
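A minimal sketch of that idea, assuming a toy two-column dataset and the simplest possible generator (a fitted multivariate Gaussian), might look as follows; real synthetic data systems use far more sophisticated generators and typically add formal privacy guarantees on top.

```python
# Toy illustration of synthetic data: fit a statistical model to the real
# records, then release samples from the model rather than the records.
# No synthetic row corresponds to any real person, but aggregate
# statistics (means, covariances) are preserved.
import numpy as np

def fit_and_sample(real_data: np.ndarray, n_synthetic: int, seed: int = 0):
    """Fit a multivariate Gaussian to the real dataset and draw synthetic
    records that reproduce its statistical properties in aggregate."""
    rng = np.random.default_rng(seed)
    mean = real_data.mean(axis=0)
    cov = np.cov(real_data, rowvar=False)
    return rng.multivariate_normal(mean, cov, size=n_synthetic)

# Pretend "real" records: 500 people with an age and an income column.
real = np.random.default_rng(1).normal(
    loc=[40, 25000], scale=[12, 8000], size=(500, 2))
synthetic = fit_and_sample(real, n_synthetic=500)
print(real.mean(axis=0), synthetic.mean(axis=0))  # aggregates closely match
```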
Other PETs minimise the amount of personal data that is shared without affecting the data’s utility. Federated learning, for example, allows an algorithm to be trained across multiple devices or datasets held on remote servers: if an organisation wants to train a machine-learning model but has limited training data available, it can send the model to a remote dataset for training, as the sketch below illustrates. The model returns having benefited from those datasets, while the sensitive data itself is never exchanged or put in the hands of those who own the algorithm. The use of PETs therefore does not necessarily exclude data from being defined as personal or from falling within the remit of the GDPR. They can, however, help to minimise the risk that arises from personal data breaches and provide an increased level of security.
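The federated pattern just described can be sketched in a similar spirit, assuming a toy linear model; in practice, federated learning frameworks add secure aggregation, differential privacy and much more, so this is an illustration of the data flow only.

```python
# Toy federated averaging: the coordinator only ever sees model weights;
# the raw datasets (X, y) never leave their holders.
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """Train a linear least-squares model on one holder's private data.
    Only the updated weights are returned, never X or y."""
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def federated_round(global_weights, private_datasets):
    """One round: each holder trains locally; the coordinator averages the
    returned weights, weighted by dataset size."""
    updates, sizes = [], []
    for X, y in private_datasets:
        updates.append(local_update(global_weights, X, y))
        sizes.append(len(y))
    return np.average(updates, axis=0, weights=np.array(sizes, dtype=float))

# Two "devices", each holding data the other (and the coordinator) never sees.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
datasets = []
for _ in range(2):
    X = rng.normal(size=(100, 2))
    datasets.append((X, X @ true_w + rng.normal(scale=0.1, size=100)))

w = np.zeros(2)
for _ in range(50):
    w = federated_round(w, datasets)
print(w)  # approaches true_w without the raw data ever being pooled
```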
The Government have positioned the Bill as one that seeks to strengthen the data rights of citizens while catalysing innovation. PETs could and should have been a natural area for the Bill to explore, because not only can such devices help controllers demonstrate an approach based on data protection by design and default, but they can open the door for new ways of collaborating, innovating and researching with data. The Royal Society has researched the role that PETs can play in data governance and collaboration in immense detail, with its findings contained in its 2023 report, which is more than 100 pages long. One of the report’s key recommendations was that the Government should develop a national PET strategy to promote their responsible use as tools for advancing scientific research, increasing security and offering new partnership possibilities, both domestically and across borders.
It is vital to acknowledge that working with PETs involves risks that must be considered. Some may not be robust enough against attacks because they are in the early stages of development, while others might require a significant amount of expertise to operate, without which their use may be counterproductive. It is therefore important to be clear that the amendment would not jump ahead and endorse any particular technology or device before it was ready. Instead, it would enshrine the European Union Agency for Cybersecurity definition of PETs in UK law and prompt the Government to issue a report on how that growing area of technology might play a role in data processing and data regulation in future.
That could include identifying the opportunities that PETs could provide, while also looking at the threats and potential harms involved in using the technologies without significant expertise or technological readiness. Indeed, in their consultation response, the Government mentioned that they were keen to explore opportunities around smart data, while promoting the understanding that PETs should not be seen as a substitute for reducing privacy risks at an organisational level. The report would allow the Government that exploration, indicating a positive acknowledgment of the potentially growing role that PETs might play in data processing and opening the door for further research in the area.
Even by their name, privacy enhancing technologies reflect exactly what the Bill should be doing: looking to the future to encourage innovation in tech and then using such innovation to protect citizens in return. I hope hon. Members will see those technologies’ potential value and the importance of analysing any harms, and look to place the requirement to analyse PETs on the statute book.
We absolutely agree with the Opposition about the importance of privacy enhancing technologies, which I will call PETs, since I spoke on them recently and was told that was the best abbreviation—it is certainly easier. We wish to see their use by organisations to help ensure compliance with data protection principles and we seek to encourage that. As part of our work under the national data strategy, we are already exploring the macro-impacts of PETs and how they can unlock data across the economy.
The ICO has recently published its draft guidance on anonymisation, pseudonymisation and PETs, which explains the benefits and different types of PETs currently available, as well as how they can help organisations comply with data protection law. In addition, the Centre for Data Ethics and Innovation has published an adoption guide to aid decision making around the use of PETs in data-driven projects. It has also successfully completed delivery of UK-US prize challenges to drive innovation in PETs that reinforce democratic values. Indeed, I was delighted to meet some of the participants in those prize challenges at the Royal Society yesterday and hear a little more about some of their remarkable innovations.
As the hon. Lady mentioned, the Royal Society has published reports on how PETs can maximise the benefit and reduce the harms associated with data use. Adding a definition of PETs to the legislation and requiring the Government to publish a report six months after Royal Assent is unlikely to have many advantages over the approach that the ICO, the CDEI and others are taking to develop a better understanding in the area. Furthermore, many PETs are still in the very early stages of their deployment and use, and have not been widely adopted across the UK or globally. A statutory definition could quickly become outdated. Publishing a comprehensive report on the potential impacts of PETs, which advocated the use of one technology or another, could even distort a developing market, and lead to unintended negative impacts on the development of what are promising technologies. For that reason, I ask the hon. Lady to withdraw the new clause.
I am grateful to the Minister for his clarification on the pronunciation of the acronym. I acknowledge the points he made. I beg to ask leave to withdraw the motion.
Clause, by leave, withdrawn.
New Clause 13
Oversight of biometric technology use by the Information Commission
‘(1) The Information Commission must establish a Biometrics Office.
(2) The Biometrics Office is to consist of a committee of three commissioners with relevant expertise, appointed by the Commission.
(3) The functions of the Biometrics Office are—
(a) to establish and maintain a public register of relevant entities engaged in processing biometric data;
(b) to oversee and review the biometrics use of relevant entities;
(c) to produce a Code of Practice for the use of biometric technology by registered parties, which must include—
(i) compulsory standards of accuracy and reliability for biometric technologies,
(ii) a requirement for the proportionality of biometrics use to be assessed prior to use and annually thereafter, and a procedure for such assessment, and
(iii) a procedure for individual complaints about the use of biometrics by registered parties;
(d) to receive and publish annual reports from all relevant entities, which must include the relevant entity’s proportionality assessment of their biometrics use;
(e) to enforce registration and reporting by the issuing of enforcement notices and, where necessary, the imposition of fines for non-compliance with the registration and reporting requirements;
(f) to ensure lawfulness of biometrics use by relevant entities, including issuing compliance and abatement notices where necessary.
(4) The Secretary of State may by regulations add to the responsibilities of the Biometrics Office.
(5) Regulations made under subsection (4) are subject to the affirmative resolution procedure.
(6) For the purposes of this Part—
“biometric data” has the meaning given by section 106 of this Act (see subsection 13);
“relevant entity” means any organisation or body corporate (whether public or private) which processes biometric data, other than where the biometric processing undertaken by the organisation or body corporate is otherwise overseen by the Investigatory Powers Commissioner, because it is—
(a) for the purposes of making or renewing a national security determination as defined by s.20(2) Protection of Freedoms Act 2012; or
(b) for the purposes set out in s.20(6) Protection of Freedoms Act 2012.’.—(Stephanie Peacock.)
This new clause, together with NC14 and NC15, are intended to form a new Part of the Bill which creates a mechanism for the Information Commission to oversee biometric technology use by private parties.
Brought up, and read the First time.
With this it will be convenient to discuss the following:
New clause 14—Requirement to register with the Information Commission—
‘(1) Any relevant entity intending to process biometric data for purposes other than those contained in section 20(2) and section 20(6) of the Protection of Freedoms Act 2012 must register with the Information Commission prior to the deployment of the biometric technology.
(2) An application for registration must include an explanation of the intended biometrics use, including an assessment of its proportionality and its extent.
(3) All relevant entities must provide an annual report to the Biometrics Office addressing their processing of biometric data in the preceding year and their intended processing of biometrics in the following year.
(4) Each annual report must contain a proportionality assessment of the relevant entity’s processing of biometric data in the preceding year and intended processing of biometric data in the following year.
(5) Any relevant entity which processes biometric data without having registered with the Information Commission, or without providing annual reports to the Biometrics Office, is liable to an unlimited fine imposed by the Information Commission.’
See explanatory statement to NC13.
New clause 15—Private biometrics use prior to entry into force of the Act—
‘Any relevant entity engaged in processing biometric data other than for the purposes contained in section 20(2) and section 20(6) of the Protection of Freedoms Act 2012 prior to the entry into force of this Part must register with the Information Commission in accordance with section [Requirement to register with the Information Commission] within six months of the date of entry into force of this Part; and subsection (5) of that section does not apply to such an entity during that period.’
See explanatory statement to NC13. This new clause would provide a transitional period of six months for entities which were already engaged in the processing of biometric data to register with the Commission.
A wider range of biometric data is now being collected than ever before. From data on the way we walk and talk to the facial expressions we make, biometric data is now being collected and used in a wide range of situations for many distinct purposes. Great attention has rightly been paid to police use of facial recognition technology to identify individuals, for example at football matches or protests. Indeed, to date, much of the regulatory attention has focused on those use cases, which are overseen by the Investigatory Powers Commissioner. However, the use of biometric technologies extends far beyond those examples, and there has been a proliferation of biometrics designed by private organisations to be used across day-to-day life—not just in policing.
We unlock smartphones with our faces or fingerprints, and companies have proposed using facial expression analysis to detect whether students are paying attention in online classes. Employers have used facial expression and tone analysis to decide who should be selected for a job, as was mentioned earlier in reference to new clause 8. As biometric technologies proliferate, a number of issues have been raised about their impact on people and society. Indeed, if people’s identities can be detected by both public and private actors at any given point, there is potential for significant infringement of their privacy and of their ability to move through the world exercising freedom of expression, association and assembly. Similarly, if people’s traits, characteristics or abilities can be automatically assessed on the basis of biometrics, often without any scientific basis, that may affect free expression and the development of personality.
Public attitudes research carried out by the Ada Lovelace Institute shows that the British public recognise the potential benefits of tools such as facial recognition in certain circumstances—for example, in smartphone locking systems and in airports—but often reject their use in others. Large majorities are opposed to the use of facial recognition in shops and schools, on public transport, and by human resources departments in recruitment. In all cases, the public expect the use of biometrics to be accompanied by safeguards and limitations, such as appropriate transparency and accountability measures.
Members of the citizens’ biometrics council, convened by the Ada Lovelace Institute in 2020 and made up of 50 members of the public, expressed the view that biometric technologies as currently used are lacking in transparency and accountability. In particular, safeguards are uneven across sectors. The private use of biometrics is not currently subject to the same level of regulatory oversight or due process as is afforded within the criminal justice system, despite also having the potential to create changes of life-affecting significance. As a result, one member of the council memorably asked:
“If the technology companies break their promises…what will the implications be? Who’s going to hold them to account?”
It is with those issues in mind that expert and legal opinion has come consistently to the same conclusion: at the moment, there is not a sufficient legal framework in place to manage the unique issues raised by the private proliferation of biometrics. An independent legal review, commissioned by the Ada Lovelace Institute and led by Matthew Ryder KC, found that current governance structures and accountability mechanisms for biometrics are fragmented, unclear and ineffective. Similar findings have been made by the Biometrics and Surveillance Camera Commissioner, and by Select Committees in this House and in the other place.
The Government, however, have not yet acted to deliver a legal framework governing the use of biometric technology by private corporations, meaning that the Bill is a missed opportunity. New clause 13 therefore seeks to move towards the creation of that framework, providing for the Information Commission to oversee the use of biometric technology by private parties and to ensure accountability around it. I hope that the Committee sees the value of the oversight it would provide and will support the new clause.
New clause 13 would require the Information Commission to establish a new separate statutory biometrics office with responsibility for the oversight and regulation of biometric data and technology. However, the Information Commissioner already has responsibility for monitoring and enforcing the processing of biometric data, as it falls within the definition of personal data. Under the Bill, the new body corporate—the Information Commission—will continue to monitor and enforce the processing of all personal data under the data protection legislation, including biometric data. Indeed, with its new independent board and governance structure, the commission will enjoy greater diversity in skills and decision making, ensuring that the regulator has the right blend of skills and expertise at the very top of the organisation.
Furthermore, the Bill allows the new Information Commission to establish committees, which may include specialists from outside the organisation with key skills and expertise in specialist areas. As such, the Government are of the firm view that the Information Commission is best placed to provide regulatory oversight of biometric data, rather than delegating responsibility and functions to a separate office. The creation of a new body would likely cause confusion for those seeking redress, by creating novel complaints processes for biometric-related complaints, as set out in new clause 13(3)(c)(iii). It would also complicate regulatory oversight and decision making by providing the new office with powers to impose fines, as per subsection (3)(e). For those reasons, I encourage the hon. Lady to withdraw her new clause.
New clauses 14 and 15 would require non-law enforcement bodies that process biometric data about individuals to register with the Information Commissioner before the processing begins. Where the processing started prior to passage of the Bill, the organisation would need to register within six months of commencement. As part of the registration process, the organisation would have to explain the intended effect of the processing and provide annual updates to the Information Commissioner’s Office on current and future processing activities. Organisations that fail to comply with these requirements would be subject to an unlimited fine.
I appreciate that the new clauses aim to make sure that organisations will give careful thought to the necessity and proportionality of their processing activities, and to improve regulatory oversight, but they could have significant unintended consequences. As the hon. Lady will be aware, there are many everyday uses of biometrics data, such as using a thumbprint to access a phone, laptop or other connected device. Such services would always ask for the user’s explicit consent and make alternatives such as passwords available to customers who would prefer not to part with their biometric data.
If every organisation that launched a new product had to register with the Information Commissioner to explain its intentions and complete annual reports, that could place significant and unnecessary new burdens on businesses and undermine the aims of the Bill. Where the use of biometric data is more intrusive, perhaps involving surveillance technology to identify specific individuals, the processing will already be subject to the heightened safeguards in article 9 of the UK GDPR. The processing would need to be necessary and proportionate on the grounds of substantial public interest.
The Bill will also require organisations to designate a senior responsible individual to manage privacy risks, act as a contact point for the regulator, undertake risk assessments and keep records in relation to high-risk processing activities. It would be open to the regulator to request to see these documents if members of the public expressed concern about the use of the technology.
I hope my response has helped to address the issues the hon. Lady was concerned about, and I would respectfully ask her not to press these new clauses.
It does indeed provide reassurance. On that basis, I beg to ask leave to withdraw the motion.
Clause, by leave, withdrawn.
We now come to the big moment for the hon. Member for Loughborough. Weeks of anticipation are now at an end. I call her to move new clause 16.
New Clause 16
Processing of data in relation to a case-file prepared by the police service for submission to the Crown Prosecution Service for a charging decision
‘(1) The 2018 Act is amended in accordance with subsection (2).
(2) In the 2018 Act, after section 40 insert—
“40A Processing of data in relation to a case-file prepared by the police service for submission to the Crown Prosecution Service for a charging decision
(1) This section applies to a set of processing operations consisting of the preparation of a case-file by the police service for submission to the Crown Prosecution Service for a charging decision, the making of a charging decision by the Crown Prosecution Service, and the return of the case-file by the Crown Prosecution Service to the police service after a charging decision has been made.
(2) The police service is not obliged to comply with the first data protection principle except insofar as that principle requires processing to be fair, or the third data protection principle, in preparing a case-file for submission to the Crown Prosecution Service for a charging decision.
(3) The Crown Prosecution Service is not obliged to comply with the first data protection principle except insofar as that principle requires processing to be fair, or the third data protection principle, in making a charging decision on a case-file submitted for that purpose by the police service.
(4) If the Crown Prosecution Service decides that a charge will not be pursued when it makes a charging decision on a case-file submitted for that purpose by the police service it must take all steps reasonably required to destroy and delete all copies of the case-file in its possession.
(5) If the Crown Prosecution Service decides that a charge will be pursued when it makes a charging decision on a case-file submitted for that purpose by the police service it must return the case-file to the police service and take all steps reasonably required to destroy and delete all copies of the case-file in its possession.
(6) Where the Crown Prosecution Service decides that a charge will be pursued when it makes a charging decision on a case-file submitted for that purpose by the police service and returns the case-file to the police service under subsection (5), the police service must comply with the first data protection principle and the third data protection principle in relation to any subsequent processing of the data contained in the case-file.
(7) For the purposes of this section—
(a) The police service means—
(i) constabulary maintained by virtue of an enactment, or
(ii) subject to section 126 of the Criminal Justice and Public Order Act 1994 (prison staff not to be regarded as in police service), any other service whose members have the powers or privileges of a constable.
(b) The preparation of, or preparing, a case-file by the police service for submission to the Crown Prosecution Service for a charging decision includes the submission of the file.
(c) A case-file includes all information obtained by the police service for the purpose of preparing a case-file for submission to the Crown Prosecution Service for a charging decision.”’ —(Jane Hunt.)
This new clause adjusts Section 40 of the Data Protection Act 2018 to exempt the police service and the Crown Prosecution Service from the first and third data protection principles contained within the 2018 Act so that they can share unredacted data with one another when making a charging decision.
Brought up, and read the First time.
I beg to move, That the clause be read a Second time.
It is a pleasure to speak before you today, Mr Hollobone, and to move my new clause. I recently met members of the Leicestershire Police Federation, who informed me of its concerns regarding part 3 of the Data Protection Act 2018, which imposes unnecessary and burdensome redaction obligations on the police, taking officers away from the frontline. I thank the Police Federation for providing me with the information I am going to discuss and for drafting the new clause I have tabled.
Part 3 of the 2018 Act implemented the law enforcement directive and made provision for data processing by competent authorities, including police forces and the Crown Prosecution Service, for “law enforcement purposes”.
Although recital (4) to the law enforcement directive emphasised that the
“free flow of personal data between competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences…should be facilitated while ensuring a high level of protection of personal data,”
part 3 of the 2018 Act contains no provision at all to facilitate the free flow of personal data between the police and the CPS. Instead, it imposes burdensome obligations on the police, requiring them to redact personal data from information transferred to the CPS. Those obligations are only delaying and obstructing the expeditious progress of the criminal justice system and were not even mandated by the law enforcement directive.
The problem has arisen due to chapter 2 of part 3 of the 2018 Act, which sets out six data protection principles that, as I have mentioned, apply to data processing by competent authorities for law enforcement purposes. Section 35(1) states:
“The first data protection principle is that the processing of personal data for any of the law enforcement purposes must be lawful and fair.”
Section 35(2) states:
“The processing of personal data for any of the law enforcement purposes is lawful only if and to the extent that it is based on law and either—
(a) the data subject has given consent to the processing for that purpose, or
(b) the processing is necessary for the performance of a task carried out for that purpose by a competent authority.”
The Police Federation has said that it is very unlikely that section 35(2)(a) will apply in this context. It has also said that, in the case of section 35(2)(b), the test of whether the processing is “necessary” is exacting, requiring a competent authority to apply its mind to the proportionality of processing specific items of personal data for the particular law enforcement purpose in question. Under sections 35(3) to (5), where the processing is “sensitive processing”, an even more rigorous test applies, requiring among other things that the processing is “strictly necessary” for the law enforcement purpose in question. Section 37 goes on to state:
“The third data protection principle is that personal data processed for any of the law enforcement purposes must be adequate, relevant and not excessive in relation to the purpose for which it is processed.”
For the purposes of the 2018 Act, the CPS and each police force are separate competent authorities and separate data controllers. Therefore, as set out in section 34(3), the CPS and each police force must comply with the data protection principles. A transfer of information by a police force to the CPS amounts to the processing of personal data.
The tests of “necessary” and “strictly necessary” under the first data protection principle and the third data protection principle require a competent authority to identify and consider each and every item of personal data contained within information that it is intending to process, and to consider whether it is necessary for that item of personal data to be processed in the manner intended.
The Police Federation has explained that, when the police prepare a case file for submission to the CPS for a charging decision, the practical effect is that they have to spend huge amounts of time and resources on doing so. They go through the information that has been gathered by investigating officers in order to identify every single item of personal data contained in that information; decide whether it is necessary—or, in many cases, strictly necessary—for the CPS to consider each item of personal data when making its charging decision; and redact every item of personal data that does not meet that test.
New clause 16 would amend section 40 of the Data Protection Act 2018, allowing police services to share unredacted data with the Crown Prosecution Service when it is making a charging decision. I am incredibly sympathetic to the aim that the hon. Member for Loughborough has set out, which is to get the police fighting crime on the frontline as much as possible. In oral evidence, Aimee Reed, director of data at the Metropolitan police, said that if the police could share information unredacted before charging decisions were made, it would be “of considerable benefit”. She said that that would
“enable better and easier charging decisions”
and
“reduce the current burden on officers”––[Official Report, Data Protection and Digital Information (No. 2) Public Bill Committee, 10 May 2023; c. 58, Q126.]
That would allow them to focus their time on other things. It is therefore good to see that concept being explored in a new clause.
To determine the value of the change, we would like to see a full impact assessment of the potential risks and harms associated with it. I hope that such an assessment could be conducted with the intention of weighing the change against the actual cost of the redaction burden that police currently face. Without such an assessment, it is hard to determine whether the benefit to the police would be proportionate to the impact or harms that might occur as a result of the change, particularly for the data subjects involved. That is not to say that the change would not be beneficial, but more detail on the proposal could usefully be explored.
As I believe that this is the final time that I will speak in this Committee, may I say a few words of thanks?
Okay, I will wait for the next Question. Thank you for your guidance, Mr Hollobone.
I thank my hon. Friend the Member for Loughborough, who has been assiduous in pursuing her point and has set out very clearly the purpose of her new clause. We share her wish to reduce unnecessary burdens on the police as much as possible. The new clause seeks to achieve that in relation to the preparation by police officers of pre-charge files, which is an issue that the National Police Chiefs’ Council has raised with the Home Office, as I think she knows.
This is a serious matter for our police forces, which estimate that about four hours is spent redacting a typical case file. They argue that reducing that burden would enable officers to spend more time on frontline policing. We completely understand the frustration that many officers feel about having to spend a huge amount of time on what they see as unnecessary redaction. I can assure my hon. Friend that the Home Office is working with partners in the criminal justice system to find ways of safely reducing the redaction burden while maintaining public trust. It is important that we give them the time to do so.
We need to resolve the issue through an evidence-based solution that will ensure that the right amount of redaction is done at the right point in the process, so as to reduce any delays while maintaining victim and witness confidence in the process. I assure my hon. Friend that her point is very well taken on board and the Government are looking at how we can achieve her objective as quickly as possible, but I hope she will accept that, at this point, it would be sensible to withdraw her new clause.
I thank the Minister greatly for what he has said, and for the time and effort that is being put in by several Departments to draw attention to the issue and bring it to a conclusion. I am happy that some progress has been made and, although I reserve my right to bring back the new clause at a later date, I beg to ask leave to withdraw the motion.
Clause, by leave, withdrawn.
Hon. Members will be disappointed to hear that we have reached the final Question that I must put to the Committee.
Question proposed, That the Chair do report the Bill, as amended, to the House.
It has been a real pleasure to represent His Majesty’s loyal Opposition in the scrutiny of the Bill. I thank the Minister for his courteous manner, all members of the Committee for their time, the Clerks for their work and the many stakeholders who have contributed their time, input and views. I conclude by thanking Anna Clingan, my senior researcher, who has done a remarkable amount of work to prepare for our scrutiny of this incredibly complex Bill. Finally, I thank you, Mr Hollobone, for the way in which you have chaired the Committee.
May I join the hon. Lady in expressing thanks to you, Mr Hollobone, and to Mr Paisley for chairing the Bill Committee so efficiently and getting us to this point ahead of schedule? I thank all members of the Committee for their participation: we have been involved in what will be seen to be a very important piece of legislation.
I am very grateful to the Opposition for their support in principle for many of the objectives of the Bill. It is absolutely right that the Opposition scrutinise the detail, and the hon. Member for Barnsley East and her colleagues have done so very effectively. I am pleased that we have reached this point with the Bill so far unamended, but obviously we will be considering it further on Report.
I thank all my hon. Friends for attending the Committee and for their contributions, particularly for saying “Aye” at the appropriate moments, which has allowed us to get to this point. I also thank the officials in the Department for Science, Innovation and Technology. I picked up this baton on day two of my new role covering the maternity leave of my hon. Friend the Member for Hornchurch and Upminster (Julia Lopez); I did so with some trepidation, but the officials have made my task considerably easier and I am hugely indebted to them.
I thank everybody for allowing us to get to this point. I look forward to further debate on Report in due course.
May I thank all hon. Members for their forbearance during the passage of the Bill and thank all the officers of the House for their diligence and attention to duty? My one remaining humble observation is that if the day ever comes when a facial recognition algorithm is attached to the cameras in the main Chamber to assess whether Members are bored or not paying attention, we will all be in very big trouble.
Question put and agreed to.
Bill, as amended, accordingly to be reported.
(1 year ago)
Commons ChamberMr Speaker has selected the recommittal motion in the name of Sir Chris Bryant. I call him to move the motion.
I beg to move,
That the Bill be re-committed to a Public Bill Committee.
First, I wish to briefly refer to the death yesterday morning of my predecessor as Member of Parliament for Rhondda, Allan Rogers. I know that many Members found him a good colleague to work with, and I believe that he spent many hours on the Channel Tunnel Act 1987. I sometimes think that the people who do such Bills on behalf of all of us deserve a medal. I am sure the whole House sends its best regards and deepest condolences to his family.
Our core job as Members of Parliament is the scrutiny of legislation, teasing out whether a proposal will do what it says, whether it is necessary and proportionate, and whether it has public support. The Government have had total control of the Order Paper since 1902, so we can do that job properly only if the Government get their act together and play ball. That is what enables the line-by-line consideration of the laws that bind us. It is what makes us a functioning democracy. We need to send the Bill back to Committee because we simply cannot do that job properly today.
Let us recall how we got here. A first version of the Data Protection and Digital Information Bill was introduced by the previous Member for Mid Bedfordshire on 18 July 2022. It was such a mess that it never even made it to Second Reading. Nadine Dorries was sacked in September last year, and six months later the Bill was sacked as well, to be replaced by a new and improved No. 2 Bill, which had its Second Reading on 17 April and completed its Committee stage on 24 May. That was 190 days ago.
I do not know what has prompted all the delay. Was it the general chaos in Government? Perhaps the Government do not fully understand the term “with immediate effect”. I like the Minister, and I have known and worked with him on many different issues for many years. I had a meeting with him and his officials on Thursday 16 November. He told me then that on Report the Government would table only a few minor and technical amendments to the Bill, which he hoped everyone would be able to agree fairly easily.
On the last available day, 182 days after Committee, the Government brought out 240 amendments. Some are indeed minor and technical, but many are very significant. They strike to the heart of the independence of the new Information Commission, they alter the rights of the public in making subject access requests, and they amend our system in a way that may or may not enhance our data adequacy with the US and the European Union and therefore British businesses’ ability to rely on UK legislation to trade overseas. In some instances, they give very extensive new powers to Ministers, and they introduce completely new topics that have never been previously mooted, debated or scrutinised by Parliament in relation to this Bill, which already has more baubles on it than the proverbial Christmas tree. The end result is that we have 156 pages of amendments to consider today in a single debate.
Yes, we could have tabled amendments to the Government amendments, but they would not have been selectable, and we would not have been able to vote on them. So the way the Government have acted, whether knowingly, recklessly or incompetently, means that the Commons cannot carry out line-by-line consideration of what will amount to more than 90 pages of new laws, 38 new clauses and two new schedules, one of which is 19 pages long. Some measures will barely get a minute’s consideration today. That is not scrutiny; it is a blank cheque.
Yesterday, I made a generous offer to the Minister for Disabled People, Health and Work, the hon. Member for Corby (Tom Pursglove), who is sitting on the Front Bench and whom I also like. We recognise that some of the issues need to be addressed, so we said: “Recommit the Bill so we can help you get this right in the Commons, and we will commit to have it out of Committee in a fortnight. It could go to the Lords with all parties’ support by Christmas.”
Let me repeat: this is no way to scrutinise a Bill, particularly one that gives the Government sweeping powers and limits the rights of our fellow citizens, the public. Sadly, it is part of a growing trend, but “legislate at speed, repent at leisure” should not be our motto. Some will say something that is commonly said these days: “Let it go through to the Lords so they can amend it.” But I am sick of abdicating responsibility for getting legislation right. It is our responsibility. We should not send Bills through that are, at best, half-considered. We are the elected representatives. We cannot just pass the parcel to the Lords. We need to do our job properly. We cannot do that today without recommitting the Bill.
I begin by joining the hon. Member for Rhondda (Sir Chris Bryant) in expressing the condolences of the House to his predecessor, Allan Rogers. He served as a Member of Parliament during my first nine years in this place. I remember him as an assiduous constituency Member of Parliament, and I am sure we all share the sentiments expressed by the hon. Gentleman.
It is a pleasure to return to the Dispatch Box to lead the House through Report stage of the Bill. We spent considerable time discussing it in Committee, but the hon. Gentleman was not in his post at that time. I welcome him to his position. He may regret that he missed out on Committee stage, which makes him keen to return to it today.
The Bill is an essential piece of legislation that will update the UK’s data laws, making them among the most effective in the world. We scrutinised it in depth in Committee. The hon. Gentleman is right that the Government have tabled a number of amendments for the House to consider today, and he has done the same. The vast majority are technical, and the number sounds large because a lot are consequential on original amendments. One or two address new aspects, and I will be happy to speak to those as we go through them during this afternoon’s debate. Nevertheless, they represent important additions to the Bill.
The Minister for Disabled People, Health and Work, my hon. Friend the Member for Corby (Tom Pursglove), who is sitting next to me, has drawn the House’s attention to the fact that amending the Bill to allow the Department for Work and Pensions access to financial data will make a significant contribution to identifying fraud. I would have thought that the Opposition would welcome that. It is not a new measure; it was contained in the fraud plan that the Government published back in May 2022. The Government have been examining that measure, and we have always made it clear that we would bring it forward at an appropriate parliamentary time when a vehicle was available. This is a data Bill, and the measure is specific to it. We estimate that it will result in a saving to the taxpayer of around £500 million by the end of 2028-29. I am surprised that the Opposition should question that.
As I said, the Bill has been considered at length in Committee. It is important that we now consider it on Report, so that it can achieve the next stage of its progress through Parliament. On that basis, I reject the motion.
Question put.
I beg to move, That the clause be read a Second time.
With this it will be convenient to discuss the following:
Government new clause 48—Processing of personal data revealing political opinions.
Government new clause 7—Searches in response to data subjects’ requests.
Government new clause 8—Notices from the Information Commissioner.
Government new clause 9—Court procedure in connection with subject access requests.
Government new clause 10—Approval of a supplementary code.
Government new clause 11—Designation of a supplementary code.
Government new clause 12—List of recognised supplementary codes.
Government new clause 13—Change to conditions for approval or designation.
Government new clause 14—Revision of a recognised supplementary code.
Government new clause 15—Applications for approval and re-approval.
Government new clause 16—Fees for approval, re-approval and continued approval.
Government new clause 17—Request for withdrawal of approval.
Government new clause 18—Removal of designation.
Government new clause 19—Registration of additional services.
Government new clause 20—Supplementary notes.
Government new clause 21—Addition of services to supplementary notes.
Government new clause 22—Duty to remove services from the DVS register.
Government new clause 23—Duty to remove supplementary notes from the DVS register.
Government new clause 24—Duty to remove services from supplementary notes.
Government new clause 25—Index of defined terms for Part 2.
Government new clause 26—Powers relating to verification of identity or status.
Government new clause 27—Interface bodies.
Government new clause 28—The FCA and financial services interfaces.
Government new clause 29—The FCA and financial services interfaces: supplementary.
Government new clause 30—The FCA and financial services interfaces: penalties and levies.
Government new clause 31—Liability and damages.
Government new clause 32—Other data provision.
Government new clause 33—Duty to notify the Commissioner of personal data breach: time periods.
Government new clause 34—Power to require information for social security purposes.
Government new clause 35—Retention of information by providers of internet services in connection with death of child.
Government new clause 36—Retention of biometric data and recordable offences.
Government new clause 37—Retention of pseudonymised biometric data.
Government new clause 38—Retention of biometric data from INTERPOL.
Government new clause 39—National Underground Asset Register.
Government new clause 40—Information in relation to apparatus.
Government new clause 41—Pre-commencement consultation.
Government new clause 42—Transfer of certain functions of Secretary of State.
New clause 1—Processing of data in relation to a case-file prepared by the police service for submission to the Crown Prosecution Service for a charging decision—
“(1) The 2018 Act is amended in accordance with subsection (2).
(2) In the 2018 Act, after section 40 insert—
“40A Processing of data in relation to a case-file prepared by the police service for submission to the Crown Prosecution Service for a charging decision
(1) This section applies to a set of processing operations consisting of the preparation of a case-file by the police service for submission to the Crown Prosecution Service for a charging decision, the making of a charging decision by the Crown Prosecution Service, and the return of the case-file by the Crown Prosecution Service to the police service after a charging decision has been made.
(2) The police service is not obliged to comply with the first data protection principle except insofar as that principle requires processing to be fair, or the third data protection principle, in preparing a case-file for submission to the Crown Prosecution Service for a charging decision.
(3) The Crown Prosecution Service is not obliged to comply with the first data protection principle except insofar as that principle requires processing to be fair, or the third data protection principle, in making a charging decision on a case-file submitted for that purpose by the police service.
(4) If the Crown Prosecution Service decides that a charge will not be pursued when it makes a charging decision on a case-file submitted for that purpose by the police service it must take all steps reasonably required to destroy and delete all copies of the case-file in its possession.
(5) If the Crown Prosecution Service decides that a charge will be pursued when it makes a charging decision on a case-file submitted for that purpose by the police service it must return the case-file to the police service and take all steps reasonably required to destroy and delete all copies of the case-file in its possession.
(6) Where the Crown Prosecution Service decides that a charge will be pursued when it makes a charging decision on a case-file submitted for that purpose by the police service and returns the case-file to the police service under subsection (5), the police service must comply with the first data protection principle and the third data protection principle in relation to any subsequent processing of the data contained in the case-file.
(7) For the purposes of this section—
(a) The police service means—
(i) constabulary maintained by virtue of an enactment, or
(ii) subject to section 126 of the Criminal Justice and Public Order Act 1994 (prison staff not to be regarded as in police service), any other service whose members have the powers or privileges of a constable.
(b) The preparation of, or preparing, a case-file by the police service for submission to the Crown Prosecution Service for a charging decision includes the submission of the file.
(c) A case-file includes all information obtained by the police service for the purpose of preparing a case-file for submission to the Crown Prosecution Service for a charging decision.””
This new clause adjusts Section 40 of the Data Protection Act 2018 to exempt the police service and the Crown Prosecution Service from the first and third data protection principles contained within the 2018 Act so that they can share unredacted data with one another when making a charging decision.
New clause 2—Common standards and timeline for implementation—
“(1) Within one month of the passage of this Act, the Secretary of State must by regulations require those appointed as decision-makers to create, publish and update as required open and common standards for access to customer data and business data.
(2) Standards created by virtue of subsection (1) must be interoperable with those created as a consequence of Part 2 of the Retail Banking Market Investigation Order 2017, made by the Competition and Markets Authority.
(3) Regulations under sections 66 and 68 must ensure interoperability of customer data and business data with standards created by virtue of subsection (1).
(4) Within one month of the passage of this Act, the Secretary of State must publish a list of the sectors to which regulations under section 66 and section 68 will apply within three years of the passage of the Act, and the date by which those regulations will take effect in each case.”
This new clause, which is intended to be placed in Part 3 (Customer data and business data) of the Bill, would require interoperability across all sectors of the economy in smart data standards, including the Open Banking standards already in effect, and the publication of a timeline for implementation.
New clause 3—Provision about representation of data subjects—
“(1) Section 190 of the Data Protection Act 2018 is amended as follows.
(2) In subsection (1), leave out “After the report under section 189(1) is laid before Parliament, the Secretary of State may” and insert “The Secretary of State must, within three months of the passage of the Data Protection and Digital Information Act 2024,”.”
This new clause would require the Secretary of State to exercise powers under s190 DPA2018 to allow organisations to raise data breach complaints on behalf of data subjects generally, in the absence of a particular subject who wishes to bring forward a claim about misuse of their own personal data.
New clause 4—Review of notification of changes of circumstances legislation—
“(1) The Secretary of State must commission a review of the operation of the Social Security (Notification of Changes of Circumstances) Regulations 2010.
(2) In conducting the review, the designated reviewer must—
(a) consider the current operation and effectiveness of the legislation;
(b) identify any gaps in its operation and provisions;
(c) consider and publish recommendations as to how the scope of the legislation could be expanded to include non-public sector, voluntary and private sector holders of personal data.
(3) In undertaking the review, the reviewer must consult—
(a) specialists in data sharing;
(b) people and organisations who campaign for the interests of people affected by the legislation;
(c) people and organisations who use the legislation;
(d) any other persons and organisations the reviewer considers appropriate.
(4) The Secretary of State must lay a report of the review before each House of Parliament within six months of this Act coming into force.”
This new clause requires a review of the operation of the “Tell Us Once” programme, which seeks to provide simpler mechanisms for citizens to pass information regarding births and deaths to government, and consideration of whether the process of “Tell Us Once” could be extended to non-public sector holders of data.
New clause 5—Definition of “biometric data”—
“Article 9 of the UK GDPR is amended by the omission, in paragraph 1, of the words “for the purpose of uniquely identifying a natural person”.”
This new clause would amend the UK General Data Protection Regulation to extend the protections currently in place for biometric data for identification to include biometric data for the purpose of classification.
New clause 43—Right to use non-digital verification services—
“(1) This section applies when an organisation—
(a) requires an individual to use a verification service, and
(b) uses a digital verification service for that purpose.
(2) The organisation—
(a) must make a non-digital alternative method of verification available to any individual required to use a verification service, and
(b) must provide information about digital and non-digital methods of verification to those individuals before verification is required.”
This new clause, which is intended for insertion into Part 2 of the Bill (Digital verification services), creates the right for data subjects to use non-digital identity verification services as an alternative to digital verification services, thereby preventing digital verification from becoming mandatory in certain settings.
New clause 44—Transfer of functions to the Investigatory Powers Commissioner’s Office—
“The functions of the Surveillance Camera Commissioner are transferred to the Investigatory Powers Commissioner.”
New clause 45—Interoperability of data and collection of comparable healthcare statistics across the UK—
“(1) The Health and Social Care Act 2012 is amended as follows.
(2) After section 250, insert the following section—
“250A Interoperability of data and collection of comparable healthcare statistics across the UK
(1) The Secretary of State must prepare and publish an information standard specifying binding data interoperability requirements which apply across the whole of the United Kingdom.
(2) An information standard prepared and published under this section—
(a) must include guidance about the implementation of the standard;
(b) may apply to any public body which exercises functions in connection with the provision of health services anywhere in the United Kingdom.
(3) A public body to which an information standard prepared and published under this section applies must have regard to the standard.
(4) The Secretary of State must report to Parliament each year on progress on the implementation of an information standard prepared in accordance with this section.
(5) For the purposes of this section—
“health services” has the same meaning as in section 250 of this Act, except that for “in England” there is substituted “anywhere in the United Kingdom”, and “the health service” in parts of the United Kingdom other than England has the meaning given by the relevant statute of that part of the United Kingdom;
“public body” has the same meaning as in section 250 of this Act.”
(3) In section 254 (Powers to direct NHS England to establish information systems), after subsection (2), insert—
“(2A) The Secretary of State must give a direction under subsection (1) directing NHS England to collect and publish information about healthcare performance and outcomes in all parts of the United Kingdom in a way which enables comparison between different parts of the United Kingdom.
(2B) Before giving a direction by virtue of subsection (2A), the Secretary of State must consult—
(a) the bodies responsible for the collection and publication of official statistics in each part of the United Kingdom,
(b) Scottish Ministers,
(c) Welsh Ministers, and
(d) Northern Ireland departments.
(2C) The Secretary of State may not give a direction by virtue of subsection (2A) unless a copy of the direction has been laid before, and approved by resolution of, both Houses of Parliament.
(2D) Scottish Ministers, Welsh Ministers and Northern Ireland departments must arrange for the information relating to the health services for which they have responsibility described in the direction given by virtue of subsection (2A) to be made available to NHS England in accordance with the direction.
(2E) For the purposes of a direction given by virtue of subsection (2A), the definition of “health and social care body” given in section 259(11) applies as if for “England” there were substituted “the United Kingdom”.””
New clause 46—Assessment of impact of Act on EU adequacy—
“(1) Within six months of the passage of this Act, the Secretary of State must carry out an assessment of the impact of the Act on EU adequacy, and lay a report of that assessment before both Houses of Parliament.
(2) The report must assess the impact on—
(a) data risk, and
(b) small and medium-sized businesses.
(3) The report must quantify the impact of the Act in financial terms.”
New clause 47—Review of the impact of the Act on anonymisation and the identifiability of data subjects—
“(1) Within six months of the passage of this Act, the Secretary of State must lay before Parliament the report of an assessment of the impact of the measures in the Act on anonymisation and the identifiability of data subjects.
(2) The report must include a comparison between the rights afforded to data subjects under this Act with those afforded to data subjects by the EU General Data Protection Regulation.”
Amendment 278, in clause 5, page 6, line 15, leave out paragraphs (b) and (c).
This amendment and Amendment 279 would remove the power for the Secretary of State to create pre-defined and pre-authorised “recognised legitimate interests” for data processing. Instead, the current test would continue to apply, under which personal data can be processed only in pursuit of a legitimate interest, balanced against individual rights and freedoms.
Amendment 279, page 6, line 23, leave out subsections (4), (5) and (6).
See explanatory statement to Amendment 278.
Amendment 230, page 7, leave out lines 1 and 2 and insert—
“8. The Secretary of State may not make regulations under paragraph 6 unless a draft of the regulations has been laid before both Houses of Parliament for the 60-day period.
8A. The Secretary of State must consider any representations made during the 60-day period in respect of anything in the draft regulations laid under paragraph 8.
8B. If, after the end of the 60-day period, the Secretary of State wishes to proceed to make the regulations, the Secretary of State must lay before Parliament a draft of the regulations (incorporating any changes the Secretary of State considers appropriate pursuant to paragraph 8A).
8C. Draft regulations laid under paragraph 8B must, before the end of the 40-day period, have been approved by a resolution of each House of Parliament.
8D. In this Article—
“the 40-day period” means the period of 40 days beginning on the day on which the draft regulations mentioned in paragraph 8B are laid before Parliament (or, if it is not laid before each House of Parliament on the same day, the later of the days on which it is laid);
“the 60-day period” means the period of 60 days beginning on the day on which the draft regulations mentioned in paragraph 8 are laid before Parliament (or, if it is not laid before each House of Parliament on the same day, the later of the days on which it is laid).
8E. When calculating the 40-day period or the 60-day period for the purposes of paragraph 8D, ignore any period during which Parliament is dissolved or prorogued or during which both Houses are adjourned for more than 4 days.”
This amendment would make regulations made in respect of recognised legitimate interest subject to a super-affirmative Parliamentary procedure.
Amendment 11, page 7, line 12, at end insert—
““internal administrative purposes”, in relation to special category data, means the conditions set out for lawful processing in paragraph 1 of Schedule 1 of the Data Protection Act 2018.”
This amendment clarifies that the processing of special category data in employment must follow established principles for reasonable processing, as defined by paragraph 1 of Schedule 1 of the Data Protection Act 2018.
Government amendment 252.
Amendment 222, page 10, line 8, leave out clause 8.
Amendment 3, in clause 8, page 10, leave out line 31.
This amendment would mean that the resources available to the controller could not be taken into account when determining whether a request is vexatious or excessive.
Amendment 2, page 11, line 34, at end insert—
“(6A) When informing the data subject of the reasons for not taking action on the request in accordance with subsection (6), the controller must provide evidence of why the request has been treated as vexatious or excessive.”
This amendment would require the data controller to provide evidence of why a request has been considered vexatious or excessive if the controller is refusing to take action on the request.
Government amendment 17.
Amendment 223, page 15, line 22, leave out clause 10.
Amendment 224, page 18, line 7, leave out clause 12.
Amendment 236, in clause 12, page 18, line 21, at end insert—
“(c) a data subject is an identified or identifiable individual who is affected by a significant decision, irrespective of the direct presence of their personal data in the decision-making process.”
This amendment would clarify that a “data subject” includes identifiable individuals who are subject to data-based and automated decision-making, whether or not their personal data is directly present in the decision-making process.
Amendment 232, page 19, line 12, leave out “solely” and insert “predominantly”.
This amendment would mean safeguards for data subjects’ rights, freedoms and legitimate interests would have to be in place in cases where a significant decision in relation to a data subject was taken based predominantly, rather than solely, on automated processing.
Amendment 5, page 19, line 12, after “solely” insert “or partly”.
This amendment would mean that the protections provided for by the new Article 22C would apply where a decision is based either solely or partly on automated processing, not only where it is based solely on such processing.
Amendment 233, page 19, line 18, at end insert
“including the reasons for the processing.”
This amendment would require data controllers to provide the data subject with the reasons for the processing of their data in cases where a significant decision in relation to a data subject was taken based on automated processing.
Amendment 225, page 19, line 18, at end insert—
“(aa) require the controller to inform the data subject when a decision described in paragraph 1 has been taken in relation to the data subject;”.
Amendment 221, page 20, line 3, at end insert—
“7. When exercising the power to make regulations under this Article, the Secretary of State must have regard to the following statement of principles:
Digital information principles at work
1. People should have access to a fair, inclusive and trustworthy digital environment at work.
2. Algorithmic systems should be designed and used to achieve better outcomes: to make work better, not worse, and not for surveillance. Workers and their representatives should be involved in this process.
3. People should be protected from unsafe, unaccountable and ineffective algorithmic systems at work. Impacts on individuals and groups must be assessed in advance and monitored, with reasonable and proportionate steps taken.
4. Algorithmic systems should not harm workers’ mental or physical health, or integrity.
5. Workers and their representatives should always know when an algorithmic system is being used, how and why it is being used, and what impacts it may have on them or their work.
6. Workers and their representatives should be involved in meaningful consultation before and during use of an algorithmic system that may significantly impact work or people.
7. Workers should have control over their own data and digital information collected about them at work.
8. Workers and their representatives should always have an opportunity for human contact, review and redress when an algorithmic system is used at work where it may significantly impact work or people. This includes a right to a written explanation when a decision is made.
9. Workers and their representatives should be able to use their data and digital technologies for contact and association to improve work quality and conditions.
10. Workers should be supported to build the information, literacy and skills needed to fulfil their capabilities through work transitions.”
This amendment would insert into new Article 22D of the UK GDPR a requirement for the Secretary of State to have regard to the statement of digital information principles at work when making regulations about automated decision-making.
Amendment 4, in clause 15, page 25, line 4, at end insert
“(including in the cases specified in sub-paragraphs (a) to (c) of paragraph 3 of Article 35)”.
This amendment, together with Amendment 1, would provide a definition of what constitutes “high risk processing” for the purposes of applying Articles 27A, 27B and 27C, which require data controllers to designate, and specify the duties of, a “senior responsible individual” with responsibility for such processing.
Government amendments 18 to 44.
Amendment 12, page 32, line 7, leave out clause 17.
This amendment keeps the current requirement on police in the Data Protection Act 2018 to justify why they have accessed an individual’s personal data.
Amendment 1, in clause 18, page 32, line 18, leave out paragraph (c) and insert—
“(c) omit paragraph 2,
(ca) in paragraph 3—
(i) for “data protection” substitute “high risk processing”,
(ii) in sub-paragraph (a), for “natural persons” substitute “individuals”,
(iii) in sub-paragraph (a) for “natural person” substitute “individual” in both places where it occurs,
(cb) omit paragraphs 4 and 5,”.
This amendment would leave paragraph 3 of Article 35 of the UK GDPR in place (with amendments reflecting amendments made by the Bill elsewhere in the Article), thereby ensuring that there is a definition of “high risk processing” on the face of the Regulation.
Amendment 226, page 39, line 38, leave out clause 26.
Amendment 227, page 43, line 2, leave out clause 27.
Amendment 228, page 46, line 32, leave out clause 28.
Government amendment 45.
Amendment 235, page 57, line 29, leave out clause 34.
This amendment would leave in place the existing regime, which refers to “manifestly unfounded” or excessive requests to the Information Commissioner, rather than the proposed change to “vexatious” or excessive requests.
Government amendments 46 and 47.
Amendment 237, in clause 48, page 77, line 4, leave out “individual” and insert “person”.
This amendment and Amendments 238 to 240 are intended to enable the digital verification services covered by the Bill to include verification of organisations as well as individuals.
Amendment 238, page 77, line 5, leave out “individual” and insert “person”.
See explanatory statement to Amendment 237.
Amendment 239, page 77, line 6, leave out “individual” and insert “person”.
See explanatory statement to Amendment 237.
Amendment 240, page 77, line 7, leave out “individual” and insert “person”.
See explanatory statement to Amendment 237.
Amendment 241, page 77, line 8, at end insert (on new line)—
“and the facts which may be so ascertained, verified or confirmed may include the fact that an individual has a claimed connection with a legal person.”
This amendment would ensure that the verification services covered by the Bill will include verification that an individual has a claimed connection with a legal person.
Government amendments 48 to 50.
Amendment 280, in clause 49, page 77, line 13, at end insert—
“(2A) The DVS trust framework must include a description of how the provision of digital verification services is expected to uphold the Identity Assurance Principles.
(2B) Schedule (Identity Assurance Principles) describes each Identity Assurance Principle and its effect.”
Amendment 281, page 77, line 13, at end insert—
“(2A) The DVS trust framework must allow valid attributes to be protected by zero-knowledge proof and other decentralised technologies, without restriction upon how and by whom those proofs may be held or processed.”
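Amendment 281 turns on zero-knowledge proofs: protocols by which a holder demonstrates that a claim about an attribute is true without disclosing the underlying data. As a purely illustrative sketch, assuming toy parameters rather than anything the trust framework would actually specify, a Schnorr-style proof of knowledge in Python runs as follows:

```python
# Toy Schnorr proof of knowledge: the prover shows they know a secret x with
# y = g^x (mod p) without revealing x. The tiny parameters are assumptions
# for readability; a real scheme would use a vetted library and large groups.
import secrets

p, q, g = 2039, 1019, 4            # p = 2q + 1; g generates the order-q subgroup

x = secrets.randbelow(q - 1) + 1   # prover's secret (e.g. a credential key)
y = pow(g, x, p)                   # public value known to the verifier

r = secrets.randbelow(q)           # prover's one-time nonce
t = pow(g, r, p)                   # commitment sent to the verifier
c = secrets.randbelow(q)           # verifier's random challenge
s = (r + c * x) % q                # response; on its own it reveals nothing about x

# Verifier accepts iff g^s == t * y^c (mod p), which holds only if the prover knows x.
assert pow(g, s, p) == (t * pow(y, c, p)) % p
```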
Government amendments 51 to 66.
Amendment 248, in clause 52, page 79, line 7, at end insert—
“(1A) A determination under subsection (1) may specify an amount which is tiered to the size of the person and its role as specified in the DVS trust framework.”
This amendment would enable fees for application for registration in the DVS register to be determined on the basis of the size and role of the organisation applying to be registered.
Amendment 243, page 79, line 8, after “may”, insert “not”.
This amendment would provide that the fee for application for registration in the DVS register could not exceed the administrative costs of determining the application.
Government amendment 67.
Amendment 244, page 79, line 13, after “may”, insert “not”.
This amendment would provide that the fee for continued registration in the DVS register could not exceed the administrative costs of that registration.
Government amendment 68.
Amendment 245, page 79, line 21, at end insert—
“(10) The fees payable under this section must be reviewed every two years by the National Audit Office.”
This amendment would provide that the fees payable for DVS registration must be reviewed every two years by the NAO.
Government amendments 69 to 77.
Amendment 247, in clause 54, page 80, line 38, after “person”, insert “or by other parties”.
This amendment would enable others, for example independent experts, to make representations about a decision to remove a person from the DVS register, as well as the person themselves.
Amendment 246, page 81, line 7, at end insert—
“(11) The Secretary of State may not exercise the power granted by subsection (1) until the Secretary of State has consulted on proposals for how a decision to remove a person from the DVS register will be reached, including—
(a) how information will be collected from persons impacted by a decision to remove the person from the register, and from others;
(b) how complaints will be managed;
(c) how evidence will be reviewed;
(d) what the burden of proof will be on which a decision will be based.”
This amendment would provide that the power to remove a person from the DVS register could not be exercised until the Secretary of State had consulted on the detail of how a decision to remove would be reached.
Government amendments 78 to 80.
Amendment 249, in clause 62, page 86, line 17, at end insert—
“(3A) A notice under this section must give the recipient of the notice an opportunity to consult the Secretary of State on the content of the notice before providing the information required by the notice.”
This amendment would provide an option for consultation between the Secretary of State and the recipient of an information notice before the information required by the notice has to be provided.
Government amendment 81.
Amendment 242, in clause 63, page 87, line 21, leave out “may” and insert “must”.
This amendment would require the Secretary of State to make arrangements for a person to exercise the Secretary of State’s functions under this Part of the Bill, so that an independent regulator would perform the relevant functions and not the Secretary of State.
Amendment 250, in clause 64, page 87, line 34, at end insert—
“(1A) A report under subsection (1) must include a report on any arrangements made under section 63 for a third party to exercise functions under this Part.”
This amendment would require information about arrangements for a third party to exercise functions under this Part of the Bill to be included in the annual reports on the operation of the Part.
Government amendments 82 to 196.
Amendment 6, in clause 83, page 107, leave out from line 26 to the end of line 34 on page 108.
This amendment would leave out the proposed new regulation 6B of the PEC Regulations, which would enable consent to be given, or an objection to be made, to cookies automatically.
Amendment 217, page 109, line 20, leave out clause 86.
This amendment would leave out the clause which would enable the sending of direct marketing electronic mail on a “soft opt-in” basis.
Amendment 218, page 110, line 1, leave out clause 87.
This amendment would remove the clause which would enable direct marketing for the purposes of democratic engagement. See also Amendment 220.
Government amendments 253 to 255.
Amendment 219, page 111, line 6, leave out clause 88.
This amendment is consequential on Amendment 218.
Government amendments 256 to 265.
Amendment 7, in clause 89, page 114, line 12, at end insert—
“(2A) A provider of a public electronic communications service or network is not required to intercept or examine the content of any communication in order to comply with their duty under this regulation.”
This amendment would clarify that a public electronic communications service or network is not required to intercept or examine the content of any communication in order to comply with their duty to notify the Commissioner of unlawful direct marketing.
Amendment 8, page 117, line 3, at end insert—
“(5) In regulation 1—
(a) at the start, insert “(1)”;
(b) after “shall”, insert “save for regulation 26A”;
(c) at end, insert—
“(2) Regulation 26A comes into force six months after the Commissioner has published guidance under regulation 26C (Guidance in relation to regulation 26A).””
This amendment would provide for the new regulation 26A, Duty to notify Commissioner of unlawful direct marketing, not to come into force until six months after the Commissioner has published guidance in relation to that duty.
Government amendment 197.
Amendment 251, in clause 101, page 127, line 3, leave out “and deaths” and insert “, deaths and deed polls”.
This amendment would require deed poll information to be kept to the same standard as records of births and deaths.
Amendment 9, page 127, line 24, at end insert—
“(2A) After section 25, insert—
“25A Review of form in which registers are to be kept
(1) The Secretary of State must commission a review of the provisions of this Act and of related legislation, with a view to the creation of a single digital register of births and deaths.
(2) The review must consider and make recommendations on the effect of the creation of a single digital register on—
(a) fraud,
(b) data collection, and
(c) ease of registration.
(3) The Secretary of State must lay a report of the review before each House of Parliament within six months of this section coming into force.””
This amendment would insert a new section into the Births and Deaths Registration Act 1953 requiring a review of relevant legislation, with consideration of creating a single digital register for registered births and registered deaths and recommendations on the effects of such a change on reducing fraud, improving data collection and streamlining digital registration.
Government amendment 198.
Amendment 229, in clause 112, page 135, line 8, leave out subsections (2) and (3).
Amendment 10, in clause 113, page 136, line 35, leave out
“which allows or confirms the unique identification of that individual”.
This amendment would amend the definition of “biometric data” for the purpose of the oversight of law enforcement biometrics databases so as to extend the protections currently in place for biometric data for identification to include biometric data for the purpose of classification.
Government amendments 199 to 207.
Government new schedule 1—Power to require information for social security purposes.
Government new schedule 2—National Underground Asset Register: monetary penalties.
New schedule 3—Identity Assurance Principles—
“Part 1
Definitions
1 These Principles are limited to the processing of Identity Assurance Data (IdA Data) in an Identity Assurance Service (e.g. establishing and verifying the identity of a Service User; conducting a transaction that uses a user identity; maintaining audit requirements in relation to a transaction associated with the use of a service that needs identity verification etc.). They do not cover, for example, any data used to deliver a service, or to measure its quality.
2 In the context of the application of the Identity Assurance Principles to an Identity Assurance Service, “Identity Assurance Data” (“IdA Data”) means any recorded information that is connected with a “Service User” including—
“Audit Data.” This includes any recorded information that is connected with any log or audit associated with an Identity Assurance Service.
“General Data.” This means any other recorded information which is not personal data, audit data or relationship data, but is still connected with a “Service User”.
“Personal Data.” This takes its meaning from the Data Protection Act 2018 or subsequent legislation (e.g. any recorded information that relates to a “Service User” who is also an identified or identifiable living individual).
“Relationship Data.” This means any recorded information that describes (or infers) a relationship between a “Service User”, “Identity Provider” or “Service Provider” with another “Service User”, “Identity Provider” or “Service Provider” and includes any cookie or program whose purpose is to supply a means through which relationship data are collected.
3 Other terms used in relation to the Principles are defined as follows—
“Identity Assurance Service.” This includes relevant applications of the technology (e.g. hardware, software, database, documentation) in the possession or control of any “Service User”, “Identity Provider” or “Service Provider” that is used to facilitate identity assurance activities; it also includes any IdA Data processed by that technology or by an Identity Provider or by a Service Provider in the context of the Service; and any IdA Data processed by the underlying infrastructure for the purpose of delivering the IdA service or associated billing, management, audit and fraud prevention.
“Identity Provider.” This means the certified individual or certified organisation that provides an Identity Assurance Service (e.g. establishing an identity, verification of identity); it includes any agent of a certified Identity Provider that processes IdA data in connection with that Identity Assurance Service.
“Participant.” This means any “Identity Provider”, “Service Provider” or “Service User” in an Identity Assurance Service. A “Participant” includes any agent by definition.
“Processing.” In the context of IdA data, this means “collecting, using, disclosing, retaining, transmitting, copying, comparing, corroborating, correlating, aggregating, accessing” the data and includes any other operation performed on IdA data.
“Provider.” This includes an “Identity Provider”, a “Service Provider”, or both.
“Service Provider.” This means the certified individual or certified organisation that provides a service that uses an Identity Provider in order to verify identity of the Service User; it includes any agent of the Service Provider that processes IdA data from an Identity Assurance Service.
“Service User.” This means the person (i.e. an organisation (incorporated or not)) or an individual (dead or alive) who has established (or is establishing) an identity with an Identity Provider; it includes an agent (e.g. a solicitor, family member) who acts on behalf of a Service User with proper authority (e.g. a public guardian, or a Director of a company, or someone who possesses power of attorney). The person may be living or deceased (the identity may still need to be used once its owner is dead, for example by an executor).
“Third Party.” This means any person (i.e. any organisation or individual) who is not a “Participant” (e.g. the police or a Regulator).
Part 2
The Nine Identity Assurance Principles
Any exemptions from these Principles must be specified via the “Exceptional Circumstances Principle”. (See Principle 9).
1 User Control Principle
Statement of Principle: “I can exercise control over identity assurance activities affecting me and these can only take place if I consent or approve them.”
1.1 An Identity Provider or Service Provider must ensure any collection, use or disclosure of IdA data in, or from, an Identity Assurance Service is approved by each particular Service User who is connected with the IdA data.
1.2 There should be no compulsion to use the Identity Assurance Service and Service Providers should offer alternative mechanisms to access their services. Failing to do so would undermine the consensual nature of the service.
2 Transparency Principle
Statement of Principle: “Identity assurance can only take place in ways I understand and when I am fully informed.”
2.1 Each Identity Provider or Service Provider must be able to justify to Service Users why their IdA data are processed. Ensuring transparency of activity and effective oversight through auditing and other activities inspires public trust and confidence in how their details are used.
2.2 Each Service User must be offered a clear description about the processing of IdA data in advance of any processing. Identity Providers must be transparent with users about their particular models for service provision.
2.3 The information provided includes a clear explanation of why any specific information has to be provided by the Service User (e.g. in order that a particular level of identity assurance can be obtained) and identifies any obligation on the part of the Service User (e.g. in relation to the User’s role in securing his/her own identity information).
2.4 The Service User will be able to identify which Service Provider they are using at any given time.
2.5 Any subsequent and significant change to the processing arrangements that have been previously described to a Service User requires the prior consent or approval of that Service User before it comes into effect.
2.6 All procedures, including those involved with security, should be made publicly available at the appropriate time, unless such transparency presents a security or privacy risk. For example, the standards of encryption can be identified without jeopardy to the encryption keys being used.
3 Multiplicity Principle
Statement of Principle: “I can use and choose as many different identifiers or identity providers as I want to.”
3.1 A Service User is free to use any number of identifiers that each uniquely identifies the individual or business concerned.
3.2 A Service User can use any of his identities established with an Identity Provider with any Service Provider.
3.3 A Service User shall not be obliged to use any Identity Provider or Service Provider not chosen by that Service User; however, a Service Provider can require the Service User to provide a specific level of Identity Assurance, appropriate to the Service User’s request to a Service Provider.
3.4 A Service User can choose any number of Identity Providers and where possible can choose between Service Providers in order to meet his or her diverse needs. Where a Service User chooses to register with more than one Identity Provider, Identity Providers and Service Providers must not link the Service User’s different accounts or gain information about their use of other Providers.
3.5 A Service User can terminate, suspend or change Identity Provider and where possible can choose between Service Providers at any time.
3.6 A Service Provider does not know the identity of the Identity Provider used by a Service User to verify an identity in relation to a specific service. The Service Provider knows that the Identity Provider can be trusted because the Identity Provider has been certified, as set out in GPG43 – Requirements for Secure Delivery of Online Public Services (RSDOPS).
4 Data Minimisation Principle
Statement of Principle: “My interactions only use the minimum data necessary to meet my needs.”
4.1 Identity Assurance should only be used where a need has been established and only to the appropriate minimum level of assurance.
4.2 Identity Assurance data processed by an Identity Provider or a Service Provider to facilitate a request of a Service User must be the minimum necessary in order to fulfil that request in a secure and auditable manner.
4.3 When a Service User stops using a particular Identity Provider, their data should be deleted. Data should be retained only where required for specific targeted fraud, security or other criminal investigation purposes.
5 Data Quality Principle
Statement of Principle: “I choose when to update my records.”
5.1 Service Providers should enable Service Users (or authorised persons, such as the holder of a Power of Attorney) to update their own personal data, at a time of their choosing, free of charge and in a simple and easy manner.
5.2 Identity Providers and Service Providers must take account of the appropriate level of identity assurance required before allowing any updating of personal data.
6 Service User Access and Portability Principle
Statement of Principle: “I have to be provided with copies of all of my data on request; I can move/remove my data whenever I want.”
6.1 Each Identity Provider or Service Provider must allow, promptly, on request and free of charge, each Service User access to any IdA data that relates to that Service User.
6.2 It shall be unlawful to make it a condition of doing anything in relation to a Service User to request or require that Service User to request IdA data.
6.3 The Service User must be able to require an Identity Provider to transfer his personal data, to a second Identity Provider in a standard electronic format, free of charge and without impediment or delay.
7 Certification Principle
Statement of Principle: “I can have confidence in the Identity Assurance Service because all the participants have to be certified against common governance requirements.”
7.1 As a baseline control, all Identity Providers and Service Providers will be certified against a shared standard. This is one important way of building trust and confidence in the service.
7.2 As part of the certification process, Identity Providers and Service Providers are obliged to co-operate with the independent Third Party and accept their impartial determination and to ensure that contractual arrangements—
• reinforce the application of the Identity Assurance Principles
• contain a reference to the independent Third Party as a mechanism for dispute resolution.
7.3 In the context of personal data, certification procedures include the use of Privacy Impact Assessments, Security Risk Assessments, Privacy by Design concepts and, in the context of information security, a commitment to using appropriate technical measures (e.g. encryption) and ever improving security management. Wherever possible, such certification processes and security procedures reliant on technical devices should be made publicly available at the appropriate time.
7.4 All Identity Providers and Service Providers will take all reasonable steps to ensure that a Third Party cannot capture IdA data that confirms (or infers) the existence of a relationship between any Participants. No relationships between parties or records should be established without the consent of the Service User.
7.5 Certification can be revoked if there is significant non-compliance with any Identity Assurance Principle.
8 Dispute Resolution Principle
Statement of Principle: “If I have a dispute, I can go to an independent Third Party for a resolution.”
8.1 A Service User who, after a reasonable time, is unable to resolve a complaint or problem directly with an Identity Provider or Service Provider can call upon an independent Third Party to seek resolution of the issue. This could happen, for example, where there is a disagreement between the Service User and the Identity Provider about the accuracy of data.
8.2 The independent Third Party can resolve the same or similar complaints affecting a group of Service Users.
8.3 The independent Third Party can co-operate with other regulators in order to resolve problems and can raise relevant issues of importance concerning the Identity Assurance Service.
8.4 An adjudication/recommendation of the independent Third Party should be published. The independent Third Party must operate transparently, but detailed case histories should only be published subject to appropriate review and consent.
8.5 There can be more than one independent Third Party.
8.6 The independent Third Party can recommend changes to standards or certification procedures or that an Identity Provider or Service Provider should lose their certification.
9 Exceptional Circumstances Principle
Statement of Principle: “Any exception has to be approved by Parliament and is subject to independent scrutiny.”
9.1 Any exemption from the application of any of the above Principles to IdA data shall only be lawful if it is linked to a statutory framework that legitimises all Identity Assurance Services, or an Identity Assurance Service in the context of a specific service. In the absence of such a legal framework, alternative measures must be taken to ensure transparency, scrutiny and accountability for any exceptions.
9.2 Any exemption from the application of any of the above Principles that relates to the processing of personal data must also be necessary and justifiable in terms of one of the criteria in Article 8(2) of the European Convention on Human Rights: namely, in the interests of national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others.
9.3 Any subsequent processing of personal data by any Third Party who has obtained such data in exceptional circumstances (as identified by Article 8(2) above) must be the minimum necessary to achieve that (or another) exceptional circumstance.
9.4 Any exceptional circumstance involving the processing of personal data must be subject to a Privacy Impact Assessment by all relevant “data controllers” (where “data controller” takes its meaning from the Data Protection Act).
9.5 Any exemption from the application of any of the above Principles in relation to IdA data shall remain subject to the Dispute Resolution Principle.”
Amendment 220, in schedule 1, page 141, leave out from line 21 to the end of line 36 on page 144.
This amendment would remove from the new Annex 1 of the UK GDPR provisions which would enable direct marketing for the purposes of democratic engagement. See also Amendment 218.
Government amendments 266 to 277.
Government amendments 208 to 211.
Amendment 15, in schedule 5, page 154, line 2, at end insert—
“(g) the views of the Information Commission on suitability of international transfer of data to the country or organisation.”
This amendment requires the Secretary of State to seek the views of the Information Commission on whether a country or organisation has met the data protection test for international data transfer.
Amendment 14, page 154, line 25, at end insert—
“5. In relation to special category data, the Information Commissioner must assess whether the data protection test is met for data transfer to a third country or international organisation.”
This amendment requires the Information Commission to assess suitability for international transfer of special category data to a third country or international organisation.
Amendment 13, page 154, line 30, leave out “ongoing” and insert “annual”.
This amendment mandates that a country’s suitability for international transfer of data is monitored on an annual basis.
Amendment 16, in schedule 6, page 162, line 36, at end insert—
“(g) the views of the Information Commission on suitability of international transfer of data to the country or organisation.”
This amendment requires the Secretary of State to seek the views of the Information Commission on whether a country or organisation has met the data protection test for international data transfer in relation to law enforcement processing.
Government amendment 212.
Amendment 231, in schedule 13, page 202, line 33, at end insert—
“(2A) A person may not be appointed under sub-paragraph (2) unless the Science, Innovation and Technology Committee of the House of Commons has endorsed the proposed appointment.”
This amendment would ensure that non-executive members of the Information Commission may not be appointed unless the Science, Innovation and Technology Committee has endorsed the Secretary of State’s proposed appointee.
Government amendments 213 to 216.
The current one-size-fits-all, top-down approach to data protection that we inherited from the European Union has led to public confusion, which has impeded the effective use of personal data to drive growth and competition, and to support key innovations. The Bill seizes on a post-Brexit opportunity to build on our existing foundations and create an innovative, flexible and risk-based data protection regime. This bespoke model will unlock the immense possibilities of data use to improve the lives of everyone in the UK, and help make the UK the most innovative society in the world through science and technology.
I want to make it absolutely clear that the Bill will continue to maintain the highest standards of data protection that the British people rightly expect, but it will also help those who use our data to make our lives healthier, safer and more prosperous. That is because we have convened industry leaders and experts to co-design the Bill every step of the way. We have held numerous roundtables with both industry experts in the field and campaigning groups. The outcome, I believe, is that the legislation will ensure that our regulation reflects the way real people live their lives and run their businesses.
I am grateful to the Minister for giving way so early. Oxford West and Abingdon has a huge number of spin-offs and scientific businesses that have expressed concern that any material deviation on standards, particularly European Union data adequacy, would entangle them in more red tape, rather than remove it. He says he has spoken to industry leaders. Have he and his Department assessed the risk of any deviation? Is there any associated cost to businesses from any potential deviation? Who is going to bear that cost?
I share the hon. Lady’s appreciation of the importance of data adequacy with the European Union. It is not the case that we have to replicate every aspect of GDPR to be assessed as adequate by the European Union for the purposes of data exchange. Indeed, a number of other countries have data adequacy, even though they do not have precisely the same framework of data protection legislation.
In drawing up the measures in the Bill, we have been very clear that we do not wish to put data adequacy at risk, and we are confident that nothing in the Bill does so. That is not only my view; it is the view of the expert witnesses who gave evidence in Committee. It is also the view of the Information Commissioner, who has been closely involved in all the measures before us today. I recognise the concern, but I do not believe it has any grounds.
The Minister says, “We do not wish”. Is that a guarantee from the Dispatch Box that there will be absolutely no deviation that causes a material difference for businesses on EU data adequacy? Can he give that guarantee?
I can guarantee that there is nothing in the Government’s proposals that we believe puts data adequacy at risk. That is not just our view; it is the view of all those we have consulted, including the Information Commissioner. He was previously the information commissioner in New Zealand, which has its own data protection laws but is, nevertheless, recognised as adequate by the EU. He is very familiar with the process required to achieve and keep data adequacy, and it is his view, as well as ours, that the Bill achieves that objective.
We believe the Government amendments will strengthen the fundamental elements of the Bill and reflect the Government’s commitment to unleashing the power of data across our economy and society. I have already thanked all the external stakeholders who have worked with us to ensure that the Bill functions at its best. Taken together, we believe these amendments will benefit the economy by £10.6 billion over the next 10 years. That is more than double the estimated impact of the Bill when it was introduced in the spring.
Will the Minister confirm that no services will rely on digital identity checks?
I will come on to that, because we have tabled a few amendments on digital verification and the accreditation of digital identity.
We are proposing a voluntary framework. We believe that using digital identity has many advantages, and those will become greater as the technology improves, but there is no compulsory or mandatory element to the use of digital identity. I understand why the hon. Lady raises that point, and I am happy to give her that assurance.
Before my right hon. Friend moves on to the specifics of the Government amendments, may I ask him about something they do not yet cover? The Bill does not address the availability of data to researchers so that they can assist in the process of, for example, identifying patterns in online safety. He will know that there was considerable discussion of this during the passage of the Online Safety Act 2023, when a succession of Ministers said that we might return to the subject in this Bill. Will he update the House on how that is going? When might we expect to see amendments to deal with this important area?
It is true that we do not have Government amendments to that effect, but it is a central part of the Bill that we have already debated in Committee. Making data more available to researchers is, indeed, an objective of the Bill, and I share my right hon. and learned Friend’s view that it will produce great value. If he thinks more needs to be done in specific areas, I would be very happy to talk to him further or to respond in writing.
Broadly speaking, we support this measure. What negotiations and discussions has the Minister had about red notices under Interpol and the abuse of them, for instance by the Russian state? We have concerns about decent people being maltreated by the Russian state through the use of red notices. Do those concerns conflict with the measure that the Government are introducing?
As the hon. Gentleman knows, I strongly share his view about the need to act against abuse of legal procedures by the Russian state. As he will appreciate, this aspect of the Bill emanated from the Home Office. However, I have no doubt that my colleagues in the Home Office will have heard the perfectly valid point he makes. I hope that they will be able to provide him with further information about it, and I will draw the matter to their attention.
I wish to say just a few more words about the biometric material received from our international partners, as a tool in protecting the public from harm. Sometimes, counter-terrorism police receive biometrics from international partners together with identifiable information. Under current laws, they are not allowed to retain these biometrics unless they were taken in the past three years. That can make it harder for our counter-terrorism police to carry out their job effectively. That is why we are making changes to allow the police to take proactive steps to pseudonymise biometric data received from international partners—that is, to hold the material without the information that identifies the person to whom it relates—and to hold it indefinitely under existing provisions in the Counter-Terrorism Act. Again, those changes have been requested by counter-terrorism police and will support them to better protect the British public.
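To make the pseudonymisation step concrete: the material is retained under an opaque token, while anything that identifies the person is split off and handled separately. The sketch below assumes a keyed-hash token and illustrative field names; it is not a description of the counter-terrorism systems themselves.

```python
# Minimal sketch of pseudonymisation: the biometric record is kept only under
# an opaque token; identifying details are separated out. Field names are
# illustrative assumptions.
import hashlib
import hmac
import secrets

PSEUDONYM_KEY = secrets.token_bytes(32)   # in practice, a managed secret key

def pseudonymise(record: dict) -> tuple[dict, dict]:
    """Split a record into a retainable part and a separable identifying part."""
    token = hmac.new(PSEUDONYM_KEY, record["subject_id"].encode(),
                     hashlib.sha256).hexdigest()
    retained = {"token": token, "biometric_template": record["template"]}
    identifying = {"token": token, "subject_id": record["subject_id"]}
    return retained, identifying
```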
The national underground asset register, or NUAR, is a digital map that will improve both the efficiency and safety of underground works, by providing secure access to privately and publicly owned location data about the pipes and cables beneath our feet. This will underpin the Government’s priority to get the economy growing by expediting projects such as new roads, new houses and broadband roll-out—the hon. Gentleman and I also share a considerable interest in that.
The NUAR will bring together valuable data from more than 700 public and private sector organisations about the location of underground utilities assets. This will deliver £490 million per year of economic growth, through increased efficiency, reduced asset strikes and reduced disruptions for citizens and businesses. Once operational, the running of the register will be funded by those who benefit most. The Government’s amendments include powers to levy charges on apparatus owners, and to request relevant information, through regulations. The introduction of reasonable charges payable by those who benefit from the service, rather than the taxpayer, will ensure that the NUAR is a sustainable service for the future. Other amendments will ensure that there is the ability to realise the full potential of this data for other high-value uses, while respecting the rights of asset owners.
Is any consideration given to the fact that that information could be used by bad actors? If people are able to find out where particular cables or pipes are, they also have the ability to find weakness in the system, which could have implications for us all.
I understand the hon. Lady’s point. There would need to be a legitimate purpose for accessing such information and I am happy to supply her with further detail about precisely how that works.
The hon. Lady intervenes at an appropriate point, because I was about to say that the provision will allow the National Underground Asset Register service to operate in England and Wales. We intend to bring forward equivalent provisions as the Bill progresses in the other House, subject to the usual agreements, to allow the service to operate in Northern Ireland, but the Scottish Road Works Commissioner currently maintains its own register. It has helped us in the development of the NUAR, so the hon. Lady may like to talk to the Scottish Road Works Commissioner on that point.
I turn to the use of data for the purposes of democratic engagement, which is an issue of considerable interest to Members of the House. The Bill includes provisions to facilitate the responsible use of personal data by elected representatives, registered political parties and others for the purposes of “democratic engagement”. We have tabled further related amendments for consideration today, including adding a fuller definition of what constitutes “democratic engagement activities” to help the reader understand that term wherever it appears in the legislation.
The amendments provide for former MPs to continue to process personal data following a successful recall petition, to enable them to complete urgent casework or hand over casework to a successor, as they do following the Dissolution of Parliament. For consistency, related amendments are made to the definitions used in provisions relating to direct marketing for the purposes of democratic engagement.
Finally, hon. Members may be aware that the Data Protection Act 2018 currently permits registered political parties to process sensitive political opinions data without consent for the purposes of their political activities. The exemption does not, however, currently apply to elected representatives, candidates, recall petitioners and permitted participants in referendums. The amendment addresses that anomaly and allows those individuals to benefit from the same exemption as registered political parties.
Is the Minister prepared to look at how the proposals in the Bill and the amendments align with relevant legislation passed in the Scottish Government? A number of framework Bills to govern the operation of potential future referendums on a variety of subjects have been passed, particularly the Referendums (Scotland) Act 2020. It is important that there is alignment with the definitions used in the Bill, such as that for “a permitted participant”. Will he commit to looking at that and, if necessary, make changes to the Bill at a later stage in its progress, in discussion with the Scottish Government?
I am happy to look at that, as the hon. Gentleman suggests. I hope the changes we are making to the Bill will provide greater legal certainty for MPs and others who undertake the processing of personal data for the purposes of democratic engagement.
The Bill starts and ends with reducing burdens on businesses and, above all, on small businesses, which account for over 99% of UK firms. In the future, organisations will need to keep records of their processing activities only when those activities are likely to result in a high risk to individuals. Some organisations have queried whether that means they will have to keep records in relation to all their activities if only some of their processing activities are high risk. That is not the Government’s intention. To maximise the benefits to business and other organisations, the amendments make it absolutely clear that organisations have to keep records only in relation to their high-risk processing activities.
The Online Safety Act 2023 took crucial steps to shield our children, and it is also important that we support grieving families who are seeking answers after tragic events where a child has taken their own life, by removing obstacles to accessing social media information that could be relevant to the coroner’s investigations.
We welcome such measures, but is the Minister aware of the case of Breck Bednar, who was groomed and then murdered? His family is campaigning not just for new clause 35 but for measures that go further. In that case, the coroner would have wanted access to Breck’s online life but, as it currently stands, new clause 35 does not provide what the family needs without a change to widen the scope of the amendment to the Online Safety Act. Will the Minister look at that? I think it will just require a tweak in some of the wording.
I understand the concerns of the hon. Lady. We want to do all that we can to support the bereaved parents of children who have lost their lives. As it stands, the amendment will require Ofcom, following notification from a coroner, to issue information notices to specified providers of online services, requiring them to hold data they may have relating to a deceased child’s use of online services, in circumstances where the coroner suspects the child has taken their own life, which could later be required by a coroner as relevant to an inquest.
We will continue to work with bereaved families and Members of the other place who have raised concerns. During the passage of the Online Safety Act, my noble colleague Lord Parkinson of Whitley Bay made it clear that we are aware of the importance of data preservation to bereaved parents, coroners and others involved in investigations. It is very important that we get this right. I hear what the hon. Lady says and give her an assurance that we will continue to work across Government, with the Ministry of Justice and others, in ensuring that we do so.
The hon. Member for Rhondda made reference to proposed new schedule 1, relating to improving our ability to identify and tackle fraud in the welfare system. I am grateful for the support of the Minister for Disabled People, Health and Work, my hon. Friend the Member for Corby (Tom Pursglove). In 2022-23, the Department for Work and Pensions overpaid £8.3 billion in fraud and error. A major area of loss is the under-declaration of financial assets, which we cannot currently tackle through existing powers. Given the need to address the scale of fraud and error in the welfare system, we need to modernise and strengthen the legal framework, to allow the Department for Work and Pensions to keep pace with change and stand up to future fraud challenges.
As I indicated earlier, the fraud plan, published in 2022, contains a provision outlining the DWP’s intention to bring forward new powers that would boost access to data held by third parties. The amendment will enable the DWP to access data held by third parties at scale where the information signals potential fraud or error. That will allow the DWP to detect fraud and error more proactively and protect taxpayers’ money from falling into the hands of fraudsters.
My reading of the proposed new schedule is that it gives the Department the power to look into the bank accounts of people claiming the state pension. Am I right about that?
The purpose of the proposed new schedule is narrowly focused. It will ensure that where benefit claimants may also have considerable financial assets, that is flagged with the DWP for further examination, but it does not allow the Department to go through the contents of people’s bank accounts. It is an alarm system: financial institutions that hold the accounts of benefit claimants can match those accounts against financial assets so that, where it appears fraud might be taking place, they can refer the case to the Department.
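In data terms, the alarm system described amounts to a flagging query run by the institution, not an inspection of account contents. A minimal sketch, assuming illustrative field names and using the universal credit capital limit as an example threshold:

```python
# Illustrative sketch of third-party matching for benefit fraud signals: the
# institution yields only the references of accounts whose holdings exceed an
# eligibility threshold; no transaction contents are shared. The threshold and
# field names are assumptions.
CAPITAL_LIMIT = 16_000   # e.g. the universal credit capital limit, in pounds

def flag_for_review(claimant_accounts):
    """Yield references of accounts that merit further DWP examination."""
    for account in claimant_accounts:
        if account["capital"] > CAPITAL_LIMIT:
            yield account["reference"]
```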
I am surprised that the Opposition regard this as something to question. Obviously, they are entitled to seek further information, but I would hope that they share the wish to identify where fraud is taking place and take action against it. This is about claimants of benefits, including universal credit—
The state pension will not currently be an area of focus for the use of these powers.
The House of Commons Library makes it absolutely clear that the Bill, if taken forward in the way that the Government are proposing at the moment, does allow the Government to look at people in receipt of state pensions. That is the case, is it not?
I can tell the hon. Gentleman that it is not the case that the DWP intends to focus on the state pension—and that is confirmed by my hon. Friend the Member for Corby. This is specifically about ensuring that means-related benefit claimants are eligible for the benefits for which they are currently claiming. In doing that, the identification and the avoidance of fraud will save the taxpayer a considerable amount of money.
I think everybody in the House understands the importance of getting this right. We all want to stop fraud in the state system. That being said, this is the only time that I am aware of where the state seeks the right to put people under surveillance without prior suspicion, and therefore such a power has to be restricted very carefully indeed. As we are not going to have time to debate this properly today, is my right hon. Friend open to having further discussion on this issue when the Bill goes to the Lords, so that we can seek further restrictions? I do not mean to undermine the effectiveness of the action; I just want to make it more targeted.
I am very grateful to my right hon. Friend for his contribution, and I share his principled concern that the powers of the state should be limited to those that are absolutely necessary. Those who are in receipt of benefits funded by the taxpayer have an obligation to meet the terms of those benefits, and this provision is one way of ensuring that they do so. My hon. Friend the Member for Corby has already said that he would be very happy to discuss this matter with my right hon. Friend further, and I am happy to do the same if that is helpful to him.
Can the Minister give us an example of the circumstances in which the Department would need to look into the bank accounts of people claiming state pensions in order to tackle the fraud problem? Why is the state pension within the scope of this amendment?
All I can say to the right hon. Gentleman is that the Government have made it clear that there is no intention to focus on claimants of the state pension. That is an undertaking that has been given. I am sure that Ministers from the DWP would be happy to give further evidence to the right hon. Gentleman, who may well wish to look at this further in his Committee.
Finally, I wish to touch on the framework around smart data, which is contained in part 3 of the Bill. The smart data powers will extend the Government’s ability to introduce smart data schemes, building on the success of open banking, which is the UK’s most developed data sharing scheme, with more than 7 million active users. The amendments will support the Government’s ability to meet their commitment, first, to provide open banking with a long-term regulatory framework, and, secondly, to establish an open data scheme for road fuel prices. It will also more generally strengthen the toolkit available to Government to deliver future smart data schemes.
The amendments ensure that the range of data and activities essential to smart data schemes are better captured and more accurately defined. That includes types of financial data and payment activities that are integral to open banking. The amendments, as I say, are complicated and technical and therefore I will not go into further detail.
I will give way to my hon. Friend, as I know that he has taken a particular interest in, and is very knowledgeable about, this area.
The Minister is very kind. I just wanted to pick up on his last point about smart data. He is right to say that the provisions are incredibly important and potentially extremely valuable to the economy. Can he just clarify a couple of points? I want to be clear on Government new clause 27 about interface bodies. Does that apply to the kinds of new data standards that will be required under smart data? If it does, can he please clarify how he will make sure that we do not end up with multiple different standards for each sector of our economy? It is absolutely in everybody’s interests that the standards are interoperable and, to the greatest possible extent, common between sectors so that they can talk to each other.
I do have a note on interface bodies, which I am happy to include for the benefit of my hon. Friend. However, he will be aware that this is a technical and complicated area. If he wants to pursue a further discussion, I would of course be happy to oblige. I can tell him that the amendments will ensure that smart data schemes can replicate and build on the open banking model by allowing the Government to require interface bodies to be set up by members of the scheme. Interface bodies will play a similar role to that of the open banking implementation entity, developing common standards on arrangements for data sharing. Learning from the lessons and successes of the open banking regime, regulations will be able to specify the responsibilities and requirements for interface bodies and ensure appropriate accountability to regulators. I hope that that goes some way to addressing the point that he makes, but I would be happy to discuss it further with him in due course.
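To gloss what an interface body’s output might look like: in the spirit of open banking’s shared API schemas, the core artefact is a common contract that every scheme member implements, whatever its sector. The shape below is an assumption for illustration, not the actual standard.

```python
# Minimal sketch of a cross-sector contract an "interface body" might
# standardise, modelled loosely on open banking's shared schemas. All names
# are assumptions; the real standards would be set under the regulations.
from dataclasses import dataclass
from typing import Protocol

@dataclass
class DataRequest:
    consent_id: str   # token evidencing the customer's consent
    dataset: str      # e.g. "accounts", "road_fuel_prices"

class SmartDataProvider(Protocol):
    """Interface every scheme member implements, regardless of sector."""
    def fetch(self, request: DataRequest) -> dict: ...
```

On that sketch, a road fuel prices scheme and a banking scheme would differ only in their dataset definitions, which is precisely the cross-sector interoperability being pressed for.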
I believe these amendments will generally improve the functioning of the Bill and address some specific concerns that I have identified. On that basis, I commend them to the House.
As I am feeling generous, I shall start with the nice bits where we agree with the Government. First, we completely agree with the changes to the Information Commissioner’s Office, strengthening the ICO’s enforcement powers, restructuring the ICO and providing a clearer framework of objectives. As the Minister knows, we have always been keen to strengthen the independence of the ICO and we were concerned that the Government were taking new interventionist powers—that is quite a theme in this Bill—in clause 33, so we welcome Government amendment 45, which achieves a much better balance between democratic oversight and ICO independence, so we thank the Minister for that.
Labour also welcomes part 2 of the Bill, as amended in Committee, establishing a digital verification framework. My concern, however, is that the Government have underestimated the sheer technicality of such an endeavour, hence the last-minute tabling of tens of Government amendments to this part of the Bill, which I note the Minister keeps referring to as very technical and therefore best debated in another place, at another time, with officials present. Under Government amendment 52, for example, different rules will be established for different digital verification services, and I am not quite sure whether that will stand the test of the House of Lords.
We warmly welcome and support part 3 of the Bill, which has just been referred to by the hon. Member for Weston-super-Mare (John Penrose) and the Minister, and its provisions on smart data. Indeed, we and many industry specialists have been urging the Government to go much faster in this particular area. The potential for introducing smart data schemes is vast, empowering consumers to make financial decisions that better suit them, enabling innovation and delivering better products and services. Most notably, that has already happened in relation to financial services. Many people will not know it, but that is what they are using when they use software that accesses several different bank accounts.
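In sketch form, that aggregation is no more exotic than fetching balances from several providers under one consent and combining them; the provider objects and field names below are assumptions.

```python
# Hedged sketch of open-banking-style account aggregation: a consumer app
# sums balances fetched from several providers under a single consent.
def total_balance(providers, consent_id):
    """Combine balances across every connected account."""
    return sum(account["balance"]
               for provider in providers
               for account in provider.fetch_accounts(consent_id))
```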
In the autumn statement, the Government pledged to kickstart a smart data big bang. One area where smart data has been most effective is in open finance—it is right that we expand these provisions into new areas to have a greater social impact—but, to quote the Financial Conduct Authority, it should be implemented there
“in a proportionate phased manner, ideally driven by consideration of credible consumer propositions and use-cases.”
Furthermore, the FCA does not think that a big bang approach to open finance is feasible or desirable. Nevertheless, many of the Government amendments to the suite of smart data provisions are technical, and indicate a move in the right direction. In particular, we hope that, with smart data enabling greater access by consumers to information about green options and net zero, we will be able to help the whole of the UK to move towards net zero.
I want to say a few words on part 4, on cookies and nuisance calls. We share a lot of the Government’s intentions on tackling those issues and the births and deaths register. As a former registrar, I would like to see tombstoning—the process of fraudulently adopting for oneself the name of a child who has died—brought to an end. That practice is enabled partly because the deaths register does not cross-reference the death of an individual against that person’s entry on the births register; I hope that such a link will at some point be possible.
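The gap described is, at bottom, a record-linkage problem: a death entry is never tied back to the corresponding birth entry. A minimal sketch of the missing cross-reference, with assumed record shapes:

```python
# Sketch of the missing cross-reference: when a death is registered, the
# matching births-register entry is flagged, frustrating "tombstoning".
# Record fields and the matching key are illustrative assumptions.
def link_death_to_birth(death: dict, births_index: dict):
    """Flag the birth entry of a deceased person, if one can be matched."""
    key = (death["name"], death["date_of_birth"])
    entry = births_index.get(key)
    if entry is not None:
        entry["deceased"] = True
    return entry
```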
Despite the Government’s having sat on the Bill for almost 18 months, with extensive consultations, drafts, amendments and carry-over motions, there are still big practical holes in these measures that need to be addressed. Labour supports the Government’s ambitions to tackle nuisance calls, which are a blight on people’s lives—we all know that. However, I fear that clause 89, which establishes a duty to notify the ICO of unlawful direct marketing, will make little or no difference without the addition of Labour amendments 7 and 8, which would ensure that those obligations fall on electronic communications companies only once the ICO’s guidance on their practical application has been clearly established. As the Bill stands, that is little more than wishful thinking.
Unfortunately, the story is the same on tackling cookies. We have a bunch of half-baked measures that simply do not deliver as the public will expect them to and the Government would like them to. We all support reducing cookie fatigue; I am sure that every hon. Member happily clicks “Accept all” whenever a cookie banner comes up—[Interruption.] Well, some Members are much more assiduous than I am in that regard. But the wise Members of the House know perfectly well that such reflexive clicking undermines the whole purpose of cookie consent. We all support tackling the problem, because clicking a new cookie banner every time we load a web page is a waste of everybody’s time and is deeply annoying.
However, the Government’s proposed regulation 6B gives the Secretary of State a blank cheque to make provisions as they see fit, without proper parliamentary scrutiny. That is why we are unhappy with it and have tabled amendment 6, which would remove those powers from the Bill as they are simply not ready to enter the statute book. Yet again I make the point that the Bill repeatedly and regularly gives new powers to the Secretary of State. Sure, they would be implemented by secondary legislation—but as we all know, secondary legislation is unamendable and therefore subject to much less scrutiny. These are areas in which the state is taking significant powers over the public and private individuals.
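For what automatic consent might look like in practice, one plausible mechanism is a browser-level signal that sites honour instead of presenting a banner. The sketch pairs the real Sec-GPC opt-out header with a hypothetical opt-in header; regulation 6B itself prescribes no particular mechanism.

```python
# Sketch of banner-free cookie consent via browser-level signals. Sec-GPC is
# a real header sent when Global Privacy Control is enabled; treating it as a
# general cookie opt-out, and the X-Consent-Cookies opt-in header, are
# assumptions for illustration.
def cookies_permitted(headers: dict) -> bool:
    if headers.get("Sec-GPC") == "1":                       # global opt-out signal
        return False
    return headers.get("X-Consent-Cookies") == "granted"    # hypothetical opt-in
```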
Let me deal with some of the Labour party’s amendments. First, let me take subject access requests. The Government have repeatedly been in the wrong place on those, I am afraid, ever since the introduction of the first iteration of the DPDI Bill under Nadine Dorries, when they tried to charge people for access to their own data. Fortunately, that has now gone the way of Nadine Dorries. [Interruption.] I note that the Minister smiled at that point. We still have concerns about the Government’s plans to change the thresholds for refusing subject access requests from “manifestly unfounded or excessive” to “vexatious or excessive”. The Equality and Human Rights Commission, Reset, the TUC and Which? have all outlined their opposition to the change, which threatens to hollow out what the Government themselves admit is a “critical transparency mechanism”.
We have tabled two simple amendments. Amendment 2 would establish an obligation on any data controller refusing a subject access request to provide evidence of why a request has been considered vexatious or excessive. Organisations should not be allowed to just declare that a request is vexatious or excessive and so ascribe a motive to the data subject in order to refuse to provide their data, perhaps simply because of the inconvenience to the organisation.
The Government will try to tell me that safeguards are in place and that the data subject can make appropriate complaints to the organisation and the ICO if they believe that their request has been wrongly refused. But if we take the provisions set out in clause 9 to extend the time limits on subject access requests, add the advantage for companies of dither and delay when considering procedural complaints, and then add the additional burden on a data subject of having to seek out the ICO and produce evidence and an explanation of their request, as well as of the alleged misapplication of the vexatious or excessive standard, we see that people could easily be waiting years before they can exercise the right to access their own data. I cannot believe that, in the end, that is in the interests of good government or that it is really what the Government want.
Despite public opposition to the measures, the Government are also now going further by introducing at this stage amendments that further water down subject access request protections. Government new clauses 7 and 9, which the Minister did not refer to—in fact, he only mentioned, I think, a bare tenth of the amendments he wants us to agree to this afternoon—limit a data subject’s entitlement to their own data to the controller’s ability to conduct a “reasonable and proportionate” search. But what is reasonable and proportionate? Who determines what has been a reasonable and proportionate search? The new clauses drive a coach and horses through the rights of people to access their own data and to know who is doing what with their information. That is why Labour does not support the changes.
I come to one of the most important issues for us: high-risk processing, which, as the term suggests, is of most concern when it comes to the rights of individuals. I was pleased but perplexed to see that the Government tabled amendments to new clause 30 that added further clarity about the changed provisions on record keeping for the purposes of high-risk processing. I was pleased because it is right that safeguards should be in place when data processing is deemed to be high risk, but I was perplexed because the Government do not define high-risk processing in the Bill—in fact, they have removed the existing standard for high-risk processing from the existing GDPR, thereby leaving a legislative lacuna for the ICO to fill. That should not be up to the ICO. I know that the ICO himself thinks that it should not be up to him but should instead be a matter for primary legislation.
Our amendment 1 retains a statutory definition of high-risk processing as recommended by the ICO in his response to the Bill, published in May. He said:
“the detail in Article 35 (3) was a helpful and clear legislative backstop.”
That is why he supports what we are suggesting. Our amendment 4 would also clarify those individual rights even further, by again providing the necessary definition of what constitutes high risk, within the new provisions concerning the responsibilities of senior responsible individuals for data processing set out in clause 15.
I turn to automated decision making, which has the potential to deliver increasingly personalised and efficient services, to increase productivity, and to reduce administrative hurdles. While most of the world is making it harder to make decisions exclusively using ADM, clause 12 in the Bill extends the potential for automated decision making in the UK. Yet countless research projects have shown that automated decision making and machine decision making are not as impartial or blind as they sound. Algorithms can harbour and enhance inbuilt prejudices and injustices. Of course we cannot bury our heads in the sand and pretend that the technology will not be implemented or that we can legislate it out of use; we should be smart about ADM and try to unlock its potential while mitigating its potential dangers. Where people’s livelihoods are at risk or where decisions are going to have a significant impact, it is essential that extra protections are in place allowing individuals to contest decisions and secure human review as a fundamental backstop.
Our amendment 5 strikes a better balance by extending the safeguarding provisions to include significant decisions that are based both partly and solely on automated processing; I am very hopeful that the Government will accept our amendment. That means greater safeguards for anybody subject to an automated decision-making process, however that decision is made. It cannot just be a matter of “the computer says no.”
I think the Minister is slightly surprised that we are concerned about democratic engagement, but I will explain. The Bill introduces several changes to electoral practices under the guise of what the Government call “democratic engagement”, most notably through clauses 86 and 87. The former means that any political party or elected representative could engage in direct marketing by relying on a soft opt-in procedure, while clause 87 allows the Secretary of State to make any future exemptions and changes to direct marketing rules for the wholly unspecified purposes of “democratic engagement”.
The Ada Lovelace Institute and the Internet Advertising Bureau have raised concerns about that, and in Committee Labour asked the Minister what the Government had in mind. He rather gave the game away when he wrote to my hon. Friend the Member for Barnsley East (Stephanie Peacock), to whom I pay tribute for the way she took the Bill through the Committee:
“A future government may want to encourage democratic engagement in the run up to an election by temporarily ‘switching off’ some of the direct marketing rules.”
Switching off the rules ahead of an election—does anyone else smell a rat?
He does not—great.
Finally, new schedule 1 would grant the Secretary of State the power to require banks or other financial institutions to provide the bank account data—unspecified—of any recipient of benefits to identify
“cases which merit further consideration to establish whether relevant benefits are being paid or have been paid in accordance with the enactments and rules of law relating to those benefits.”
It is a very broad and, I would argue, poorly delineated power. My understanding from the Commons Library, although I note that the Minister was unable to answer the question properly, is that it includes the bank accounts of anyone in the UK in receipt, or having been in receipt, of state pension, universal credit, working tax credit, child tax credit, child benefit, pension credit, jobseeker’s allowance or personal independence payment.
The Minister says that the Government do not intend to go down some of those routes at the moment, but why, in that case, are they seeking that power? They could have come to us with a much more tightly written piece of legislation, and we would have been able to help them draft it properly. The proposed new schedule would mean that millions of bank accounts could be trawled without the Department for Work and Pensions, as the right hon. Member for Haltemprice and Howden (Mr Davis) mentioned, even suspecting anything untoward before it asked for the information. The 19-page new schedule, which was tabled on the last day for consideration, would grant powers to the Government without our having any opportunity to scrutinise it line by line, assess its implications or hear evidence from expert witnesses.
We should of course be tackling fraud. The Government have completely lost control of fraud in recent years, with benefit fraud and error skyrocketing to £8.3 billion in the last financial year. The Minister seemed to think that it was a good thing that he could cite that figure. The year before, it was even higher—a record £8.7 billion. On the Conservative party’s watch, the percentage of benefit expenditure lost to fraud has more than trebled since Labour was last in power.
Let me be absolutely clear: Labour will pursue the fraudsters, the conmen and the claimants who try to take money from the public purse fraudulently or illegally. That includes those who have defrauded the taxpayer over personal protective equipment contracts, or have not declared their full income to His Majesty’s Revenue and Customs. My constituents in the Rhondda know that defrauding the taxpayer is one of the worst forms of theft. It is theft from all of us. It undermines confidence in the system that so many rely on. It angers people when they abide by the rules and they see others swinging the lead and getting away with it.
I back 100% any attempt to tackle fraud in the system, and we will work with the Government to get the legislation right, but this is not the way to do it, because it is not proper scrutiny. The Minister with responsibility for this matter, the Minister for Disabled People, Health and Work, who is present in the Chamber, is not even speaking in the debate. The Government are asking us to take a lot on trust, as we saw from the questions put earlier to the Minister for Data and Digital Infrastructure, so I have some more questions for him that I hope he will be able to answer.
As I understand it, the Government did a test project on this in 2017—all of six years ago—so what on earth have they been doing all this while? When was the new schedule first drafted, and why did the Minister not mention it in the discussions that he and I had two weeks ago? How many bank accounts does it potentially apply to? The Government already have powers to seek bank details where they suspect fraud, so precisely how will the new power be used? I have been told that the Government will not use the power until 2027. Is that right? If so, how on earth did they come to the figure of a £600 million saving—that was the figure that they gave yesterday, but I note that the Minister said £500 million earlier—in the first five years?
What will the cost be to the banks and financial institutions? What kind of information will the Government seek? Will it include details of where people have shopped, banked or travelled, or what they have spent their money on? The Government say that they will introduce a set of criteria specifying the power. When will that be introduced, how wide in scope will it be, what assessments will accompany it, and will it be subject to parliamentary scrutiny?
There is clearly significant potential to use data to identify fraud and error. That is something that Labour is determined to do, but it is vital that new measures are used fairly and proportionately. The Department for Work and Pensions says that its ability to test for unfair impacts across protected characteristics is limited, and the National Audit Office has also warned that machine learning risks bias towards certain vulnerable people or groups with protected characteristics. Without proper safeguards in place, the changes could have significant adverse effects on the most vulnerable people in society.
On behalf of the whole Labour party, I reiterate the offer that I made to the Government yesterday. We need to get this right. We will work with Ministers to get it right, and I very much hope that we can organise meetings after today, if the Bill passes, to ensure that the debates in the Lords are well informed and that we get to a much better understanding of what the Government intend and how we can get this right. If we get it wrong, we will undermine trust in the whole data system and in Government.
Broadly speaking, Labour supports the changes in the Bill that give greater clarity and flexibility to researchers, tech platforms and public service providers, with common-sense changes to data protection where it is overly rigid, but the Government do not need to water down essential protections for data subjects to do that. Our amendments set out clearly where we diverge from the Government and how Labour would do things differently.
By maintaining subject access request protections, establishing a definition of high-risk processing on the face of the Bill, and defending the public from automated decision making that encroaches too significantly on people’s lives, a Bill with Labour’s amendments would unlock the new potential for data that improves public services, protects workers from data power imbalances and delivers cutting-edge scientific research, while also building trust for consumers and citizens. That is the data protection regime the UK needs and that is the protection a Labour Government would have delivered.
Before I speak to my new clause, I want to address one or two of the things that the Opposition spokesman, the hon. Member for Rhondda (Sir Chris Bryant), just raised. By not accepting his motion to recommit the Bill to a Committee, we have in effect delegated large parts of the work on this important Bill to the House of Lords. I say directly to the Whip on the Treasury Bench that, when the Bill comes back to the Commons in ping-pong, I recommend that the Whips Office allows considerable time for us to debate the changes that the Lords makes. At the end of the day, this House is responsible to our constituents and these issues will have a direct impact on them, so we ought to have a strong say over what is done with respect to this Bill.
New clause 43 in my name is entitled “Right to use non-digital verification services”. Digitisation has had tremendous benefits for society. Administrative tasks that once took weeks or even years can now be done in seconds, thanks to technology, but that technology has come with considerable risks as well as problems of access. The internet is an equaliser in many ways; I can access websites and services in East Yorkshire in the same way that we do here. I can send and receive money, contact friends and family, organise families, do work, and do all sorts of other things that we could not once do.
However, the reality is more nuanced. Some people lack the technological literacy or simply the hardware to get online and make the most of what is out there—think of elderly people, the homeless and those living on the breadline. As with many things, those groups risk being left behind by the onward march of technology through no fault of their own. Indeed, some people do not want to go fully online. Many people who are perfectly au fait with the latest gadgets are none the less deeply concerned about the security of their data, and who can blame them?
My bank account has been accessed from Israel in the past. My online emails have been broken into during political battles of one sort or another. These things are risky. I hope nobody in the Chamber has forgotten the Edward Snowden revelations about the National Security Agency and GCHQ, which revealed a vast network of covert surveillance and data gathering by Government agencies from ordinary online activity, and the sharing of private information without consent. More recently, we have heard how Government agencies monitored people’s social media posts during the pandemic, and data trading by private companies is an enormous and lucrative industry.
What is more, as time passes and the rise of artificial intelligence takes hold, the ability to make use of central databases is becoming formidable. It is beyond imagination, so people are properly cautious about what data they share and how they share it. For some people—this is where the issue is directly relevant to this Bill—that caution will mean avoiding the use of digital identity verification, and for others that digital verification is simply inaccessible. The Bill therefore creates two serious problems by its underlying assumptions.
Already it is becoming extremely difficult for people to live anything approaching a normal life if they are not fully wired into the online network. If they cannot even verify who they are without that access, what are they supposed to do? That is why I want to create a right to offline verification and, in effect, offline identification. We saw earlier this year what can happen when someone is excluded from basic services, with the planned closure of Nigel Farage’s bank account. That case was not related to identification, but it made clear how much of an impact such exclusion can have on someone’s life. Those who cannot or do not wish to verify their identity digitally could end up in the same position as Farage and many others who have seen their access to banking restricted for unfair reasons.
The rise of online banking, although a great convenience for many, must not mean certain others being left out. We are talking about fairly fundamental rights here. Those people who, by inclination or otherwise, find it preferable or easier to stick to old-fashioned ways must not be excluded from society. My amendment would require that all services requiring identity verification offer a non-digital alternative, ensuring that everyone, regardless of who they are, will have the same access.
It is difficult to know where to start. The Minister described this as a Brexit opportunities Bill. Of course, Brexit was supposed to be about this place taking back control. It was to be the triumph of parliamentary sovereignty over faceless Brussels bureaucrats, the end of red tape and regulations, and the beginning of a glorious new era of freedom unencumbered by all those complicated European Union rules and requirements that did silly things like keeping people safe and protecting their human rights.
Yet here we are with 200 pages of new rules and regulations and a further 160 pages of amendments. This time last week, the amendment paper was 10 pages long; today it is 15 times that and there is barely any time for any kind of proper scrutiny. Is this what Brexit was for: to hand the Government yet more sweeping powers to regulate and legislate without any meaningful oversight in this place? To create additional burdens on businesses and public services, just for the sake of being different from the European Union? The answer to those questions is probably yes.
I will speak briefly to the SNP amendments, but I will also consider some of the most concerning Government propositions being shoehorned in at the last minute in the hope that no one will notice. How else are we supposed to treat Government new schedule 1? The Minister is trying to present it as benign, or even helpful, as if it had been the Government’s intention all along to grant the DWP powers to go snooping around in people’s bank accounts, but if it has been so long in coming, as he said, why is it being added to the Bill only now? Why was it not in the original draft, or even brought to Committee, where there could at least have been detailed scrutiny or the opportunity to table further amendments?
Of course there should be action to tackle benefit fraud—we all agree on that—but the DWP already has powers, under section 109B of the Social Security Administration Act 1992, to issue a notice to banks to share bank account information provided that they have reasonable grounds to believe that an identified, particular person has committed, or intends to commit, a benefit offence. In other words, where there is suspicion of fraud, the DWP can undertake checks on a claimant’s account. Incidentally, there should also be action to tackle tax evasion and tax fraud. The Government evidently do not require from the Bill any new powers in that area, so we can only assume that they are satisfied that they have all the powers they need and that everything possible is being done to ensure that everybody pays the tax that they owe.
The powers in new schedule 1 go much further than the powers that the DWP already has. By their own admission, the Government will allow the DWP to carry out—proactively, regularly, at scale and on a speculative basis—checks on the bank accounts and finances of claimants. The new schedule provides little in the way of safeguards or reassurances for people who may be subject to such checks. The Secretary of State said that
“only a minimum amount of data will be accessed and only in instances which show a potential risk of fraud and error”.
In that case, why is the power needed at all, given that the Government already have the power to investigate where there is suspicion of fraud? And how can only “a minimum amount” of data be accessed when the Government say in the same breath that they want to be able to carry out those checks proactively and at scale?
My hon. Friend probably shares my concern that we are moving into a new era in which the bank account details of people claiming with the DWP must be shared as a matter of course. That is the only reason I can see for such sweeping amendments, which will impact on so many people.
There is a huge risk. It is clear that the Government’s starting point is very often to avoid giving people the social security and welfare support that they might need to live a dignified life. We know that the approach in Scotland is incredibly different.
That is the thing: as with so much of this Bill, there is a good chance that minority groups or people with protected characteristics will find themselves most at risk of those checks and of coming under the proactive suspicion of the DWP. As we said when moving the committal motion, we have not had time properly to interrogate that point. In his attempts to answer interventions, the Minister rather demonstrated why scrutiny has been so inadequate. At the same time, the Government’s own Back Benchers, including the right hon. Member for Haltemprice and Howden (Mr Davis), the hon. Member for Yeovil (Mr Fysh) and others, are tabling quite thoughtful amendments—that is never a great sign for a Government. The Government should not be afraid of the kinds of safeguards and protections that those Members are proposing.
The SNP amendments look to remove the most dangerous and damaging aspects of the Bill—or, at the very least, to amend them slightly. Our new clause 44 and amendment 229 would have the effect of transferring the powers of the Surveillance Camera Commissioner to the Investigatory Powers Commissioner. That should not be all that controversial. Professor William Webster, a director of the Centre for Research into Information, Surveillance and Privacy, has warned that the Bill, as it stands, does not provide adequate mechanisms for the governance and oversight of surveillance cameras. The amendment would ensure that oversight is retained, the use of CCTV continues to be regulated, and public confidence in such technologies is strengthened, not eroded. CCTV is becoming more pervasive in the modern world—not least with the rise of video doorbells and similar devices that people can use in their own personal circumstances—so it is concerning that the Government are seeking to weaken rather than strengthen protections in that area.
The SNP’s amendment 222 would leave out clause 8, and our amendment 223 would leave out clause 10, removing the Government’s attempts to crack down on subject access requests. The effect of those clauses might, in the Government’s mind, remove red tape from businesses and other data-controlling organisations, but it would do so at the cost of individuals’ access to their own personal data. That is typified by the creation of a new and worryingly vague criterion of “vexatious or excessive” as grounds to refuse a subject access request. Although that might make life easier for data controllers, it will ultimately place restrictions on data subjects’ ability to access what is, we must remember, their data. There have been attempts—not just throughout the Committee stage, but even today from the Opposition—to clarify exactly the thresholds for “vexatious or excessive” requests. The Government have been unable to answer, so those clauses should not be allowed to stand.
Amendment 224 also seeks to leave out clause 12, expressing the concerns of many stakeholders about the expansion in scope of automated decision making, alongside an erosion of existing protections against automated decision making. The Ada Lovelace Institute states:
“Against an already-poor landscape of redress and accountability in cases of AI harms, the Bill’s changes will further erode the safeguards provided by underlying regulation.”
There is already significant and public concern about AI and its increasingly pervasive impact.
Clause 12 fails to offer adequate protections against automated decision making. An individual may grant consent for the processing of their data—indeed, they might have no choice but to do so—but that does not mean that they will fully understand or appreciate how that data will be processed or, importantly, how decisions will be made. At the very least, the Government should accept our amendment 225, which would require the controller to inform the data subject when an automated decision has been taken in relation to the data subject. I suspect, however, that that is unlikely—just as it is unlikely that the Government will accept Labour amendments 2 and 5, which we are happy to support—so I hope the House will have a chance to express its view on clause 12 as a whole later on.
The SNP’s amendments 226, 227 and 228 would have the effect of removing clauses 26, 27 and 28 respectively. Those clauses give the Home Secretary significant new powers to authorise the police to access personal data, and a power to issue a “national security” certificate telling the police that they do not need to comply with many important data protection laws and rules that they would otherwise have to obey, which would essentially give police immunity should they use personal data in a way that would otherwise be illegal—and they would no longer need to respond to requests under the Freedom of Information Act 2000. We have heard no explanation from the Government for why they think that the police should be allowed to break the law and operate under a cover of darkness.
The Bill will also expand what counts as an “intelligence service” for the purposes of data protection law. Again, that would be at the Home Secretary’s discretion, with a power to issue a designation notice allowing law enforcement bodies to take advantage of the more relaxed rules in the Data Protection Act 2018—otherwise designed for the intelligence agencies—whenever they are collaborating with the security services. The Government might argue that that creates a simplified legal framework, but in reality it will hand massive amounts of people’s personal information to the police, including the private communications of people in the UK and information about their health histories, political beliefs, religious beliefs and private lives.
Neither the amended approach to national security certificates nor the new designation notice regime would be reviewable by the courts, and given that there is no duty to report to Parliament, Parliament might never find out how and when the powers have been used. If the Home Secretary said that the police needed to use those increased powers in relation to national security, his word would be final. That includes the power to handle sensitive data in ways that would otherwise, under current legislation, be criminal.
The Home Secretary is responsible for both approving and reviewing designation notices. Only a person who is directly affected by such a notice will be able to challenge it, yet the Home Secretary would have the power to keep the notice secret, meaning that those affected would not even know about it and could not possibly challenge it. Those are expansive broadenings not just of the powers of the Secretary of State, but of the police and security services. The Government have not offered any meaningful reassurance about how those powers will be applied or what oversight will exist, which is why our amendments propose scrapping those clauses entirely.
There remain other concerns about many aspects of the Bill. The British Medical Association and the National AIDS Trust have both raised questions about patients’ and workers’ right to privacy. The BMA calls the Bill
“a departure from the existing high standards of data protection for health data”.
We welcome the amendments to that area, particularly amendment 11, tabled by the hon. Member for Jarrow (Kate Osborne), which we will be happy to support should it be selected for a vote.
I am afraid that I have to echo the concerns expressed by the Labour Front-Bench spokesman, the hon. Member for Rhondda (Sir Chris Bryant), about new clause 45, which was tabled by the hon. Member for Aberconwy (Robin Millar). That clause perhaps has laudable aims, but it is the view of the Scottish National party that it is not for this place to legislate in that way, certainly not without consultation and ideally not without consent from the devolved authorities. We look forward to hearing the hon. Member for Aberconwy make his case, but I do not think we are in a position to support his new clause at this time.
It is a pleasure to follow the hon. Members who have spoken in this very important debate. I declare an interest: I am the chair of the all-party parliamentary group on digital identity, so I have a particular interest in the ramifications of data as it relates to identity, but also in wider concepts—some of which we have heard about today—such as artificial intelligence and how our data might be used in the future.
I share quite a lot of the concerns that we have heard from both sides of the House. There is an awful lot more work to be done on the detail of the Bill, thinking about its implications for individuals and businesses; how our systems work and how our public services interact with them; and how our security and police forces interact with our data. I hope that noble Members of the other place will think very hard about those things, and I hope my right hon. Friend the Minister will meet me to discuss some of the detail of the Bill and any useful new clauses or amendments that the Government might introduce in the other place. I completely agree that we do not have much time today to go through all the detail, with a substantial number of new clauses having been added in just the past few days.
I will speak specifically to some of the amendments that stand in my name. Essentially, they are in two groupings: one group deals with the operations of the trust framework for the digital verification service, which I will come back to, and the other general area is the Henry VIII-style powers that the Bill gives to Ministers. Those powers fundamentally alter the balance that has been in place since I was elected as a Member of Parliament in terms of how individuals and their data relate to the state.
On artificial intelligence, we are at a moment in human evolution where the decisions that we make—that scientists, researchers and companies make about how they use data—are absolutely fundamental to the operation of so many areas of our lives. We need to be incredibly careful about what we do to regulate AI and think about how it operates. I am concerned that we have large tech companies whose business model for decades has been nothing other than to use people’s data to create products for their own benefit and that of their shareholders. During the passage of the Online Safety Act 2023, we debated very fully in this House what the implications of the algorithms they develop might be for our children’s health, for example.
I completely agree with the Government that we should be looking for ways to stamp out fraud, and should think about how harms of various kinds are addressed. However, we need to be mindful of the big risk that fears and beliefs that are not necessarily true about different potential harms might lead us to regulate, or to guide the operations of companies and others, in such a way that we create real problems. We are talking about very capable artificial intelligence systems, and also about artificial intelligence systems that claim to be very capable but are inherently flawed. The big tech companies are almost all championing and sponsoring large language models for artificial intelligence systems that are trained on data. Those companies will lobby Ministers all the time, saying, “We want you to enable us to get more and more of people’s data,” because that data is of business value to them.
Given the Henry VIII powers that exist in the Bill, there is a clear and present danger that future Ministers—I would not cast aspersions on the current, eminent occupant of the Front Bench, who is a Wykehamist to boot—may be tempted or persuaded in the wrong direction by the very powerful data-generated interests of those big tech firms. As such, my amendments 278 and 279 are designed to remove from the Bill what the Government are proposing: effectively, that Ministers will have the power to totally recategorise what kinds of data can legitimately be shared with third parties of one kind or another. As I mentioned, that fundamentally changes the balance between individuals and the state.
Through amendment 280 and new schedule 3, I propose that when Ministers implement the trust framework within the digital verification service, that framework should be based on principles that have been accepted for the eight years since I was elected—in particular, those used by the Government in establishing the framework around its Verify online identity service for public services. That framework should be used in the context of the Bill to think about what decision-makers should be taking into account. It is a system of principles that has been through consultation and has been broadly accepted. It is something that the ICO accepts and champions, and it would be entirely right and not at all a divergence from our current system to put those principles in place.
What I would say about the legitimate interest recognition extension—the Henry VIII power—is that there are already indications in the Bill about what will be recategorised. It gives an idea of just how broad the categorisations could be, and therefore how potentially dangerous it will be if that process is not followed or is not correctly framed—for example, in relation to direct marketing. Direct marketing can mean all sorts of things, but it is essentially any type of direct advertising in any mode using personal data to target advertising, and I think it is really dangerous to take such a broad approach to it.
Before companies share data or use data, they should have to think about what the balance is between a legitimate interest and the data rights, privacy rights and all the other rights that people may have in relation to their data. We do not want to give them a loophole or a way out of having to think about that. I am very pro-innovation and pro-efficiency, but I do not believe it is inefficient for companies and users or holders of data to have to make those basic balancing judgments. It is no skin off their nose at all. This should be something we uphold because these interests are vital to our human condition. The last thing we want is an artificial intelligence model—a large language model—making decisions about us, serving us with things based on our personal data and even leaking that personal data.
I highlight that only yesterday or the day before, a new academic report was produced showing that some of the large language models were leaking personal data on which they had been trained, even though the companies say that that is impossible. The researchers had managed to get around the alignment guardrails that these AI companies said they had in place, so we cannot necessarily believe what the big tech companies say the behaviour of these things is going to be. At the end of the day, large language models, which are just about statistics and correlations, cannot tell us why they have done something or anything about the chain of causality behind such a situation, and they inherently get things wrong. Anyone making claims that they are reliable or can be relied on to handle personal data is, I think, completely wrong. I hope that noble Lords and Ladies will think carefully about that matter and re-table amendments similar to mine.
New clause 27 and the following new clauses that the Government have tabled on interface bodies show the extent to which these new systems—and decisions about how they interface with different public services and other bodies—are totally extensible within the framework of the Bill, without further regard to minorities or to law, except in so far as there may be a case for judicial review by an individual or a company. That really is the only safeguard that there will be under these Henry VIII clauses. The interface body provisions talk about authorised parties being able to share data. We have heard how bad the cookie system currently is at reflecting what individuals’ true preferences about their personal data might or might not be. It is worth highlighting the thoughtful comments we heard earlier about ways in which people can make more of a real-time decision about particular issues that may be relevant to them, but about which they may not have thought at all when they authorised such a decision in a dark or non-thinking moment, often some time before.
I have now to announce the result of today’s deferred Division on the Draft Strikes (Minimum Service Levels: NHS Ambulance Services and the NHS Patient Transport Service) Regulations 2023. The Ayes were 297 and the Noes were 166, so the Ayes have it.
[The Division list is published at the end of today’s debates.]
I rise to speak specifically to Government new clause 34 and connected Government amendments which, as we have been reminded, give Ministers power to inspect the bank accounts of anyone claiming a social security benefit. I think it has been confirmed that that includes child benefit and the state pension, as well as universal credit and all the others. Extremely wide powers are being given to Ministers.
The Minister told us that the measure is expected to save some half a billion pounds over the next five years. I was pleased that the Minister for Disabled People, Health and Work was present at the start of the debate, although he is not now in his place and the Department for Work and Pensions is not hearing the concerns expressed about this measure. The Minister for Data and Digital Infrastructure told us that the Minister for Disabled People, Health and Work will not be speaking in the debate, so we will not hear what the DWP thinks about these concerns.
We have also been told—I had not seen this assurance—that these powers will not be used for a few years. If that is correct, I am completely mystified by why this is being done in such a way. If we had a few years to get these powers in place, why did the Government not wait until there was some appropriate draft legislation that could be properly scrutinised, rather than bringing such measures forward now with zero Commons scrutiny and no opportunity for that to occur? There will no doubt be scrutiny in the other place, but surely a measure of this kind ought to undergo scrutiny in this House.
I chair the Work and Pensions Committee and we have received substantial concerns about this measure, including from Citizens Advice. The Child Poverty Action Group said that
“it shouldn’t be that people have fewer rights, including to privacy, than everyone else in the UK simply because they are on benefits.”
I think that sums up what a lot of people feel, although it appears to be the position that the Government are now taking. It is surprising that the Conservative party is bringing forward such a major expansion of state powers to pry into the affairs of private citizens, and particularly doing so in such a way that we are not able to scrutinise what it is planning. As we have been reminded, the state has long had powers where there were grounds for suspecting that benefit fraud had been committed. The proposal in the Bill is for surveillance where there is absolutely no suspicion at all, which is a substantial expansion of the state’s powers to intrude.
Annabel Denham, deputy comment editor at The Daily Telegraph, warned in The Spectator of such a measure handing
“authorities the power to snoop on people’s bank accounts.”
I suspect that the views expressed there are more likely to find support on the Conservative Benches than on the Labour Benches, so I am increasingly puzzled by why the Government think this is an appropriate way to act. I wonder whether the fact that there have been such warnings prompted Ministers into rushing through the measure in this deeply unsatisfactory way, without an opportunity for proper scrutiny, because they thought that if there had been parliamentary scrutiny there would be substantial opposition from the Conservative Benches as well as from the Labour Benches. It is difficult to understand otherwise why it is being done in this way.
As we have been reminded, new clause 34 will give the Government the right to inspect the bank account of anyone who claims a state pension, which is all of us. It will give the Government the right to look into the bank account of every single one of us at some point during our lives, without suspecting that we have ever done anything wrong, and without telling us that they are doing it. The Minister said earlier that the powers of the state should be limited to those absolutely necessary, and I have always understood that to be a principle of the Conservative party. Yet on the power in the new clause to look into the bank account of everybody claiming a state pension, he was unable to give us any reason why the Government should do such a thing, or why they would ever need to look into the bank accounts of people—everybody—claiming a state pension. What on earth would the Government need to do that for? The entitlement to the state pension is not based on income, savings or anything like that, so why would the Government ever wish to do that?
If we cannot think of a reason why the Government would want to do that, why are they now taking the power to enable them to do so? I think that all of us would agree, whatever party we are in, that the powers of the state should be limited to those absolutely necessary. The power in the new clause is definitely not absolutely necessary. Indeed, no one has been able to come up with any reason for why it would ever be used.
There is something called a production order. If somebody was under investigation for benefit fraud, an application could be made before a court for the production of bank accounts. If it was a matter of suspected fraud, there is already a mechanism available.
Yes, there is a clear and long-established right in law for the DWP to look into people’s bank accounts if there is a suspicion of fraud. This power is giving the Department the ability to look into the bank accounts of people where there is no suspicion at all. All of us at some point in our lives claim a social security benefit, and we are giving the Government the power to look into our bank accounts with this measure.
I rise to speak to new clause 1 in my name and that of other colleagues. Earlier this year, I met with members of Leicestershire Police Federation, who raised concerns about elements of the Data Protection Act 2018 that were imposing unnecessary and burdensome redaction obligations on police forces. I thank the national Police Federation for its tireless campaigning on this issue, particularly Ben Hudson of Suffolk police, and I thank my hon. Friend the Member for Waveney (Peter Aldous) for all he has done in this area. I thank them for much of the information I will share today.
As I explained in Committee, part 3 of the 2018 Act implemented the law enforcement directive and made provision for data processing by competent authorities, including police forces and the Crown Prosecution Service, for law enforcement purposes. Paragraph (4) of the law enforcement directive emphasised that the
“free flow of personal data between competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences…should be facilitated while ensuring a high level of protection of personal data.”
However, part 3 of the 2018 Act contains no provision at all to facilitate the free flow of personal data between the police and the CPS. Instead, it imposes burdensome obligations on the police, requiring them to redact personal data from information transferred to the CPS. Those obligations are only delaying and obstructing the expeditious progress of the criminal justice system and were not even mandated by the law enforcement directive.
The problem has arisen due to chapter 2 of part 3 of the 2018 Act, which sets out six data protection principles that apply to data processing by competent authorities for law enforcement purposes. Section 35(1) states:
“The first data protection principle is that the processing of personal data for any of the law enforcement purposes must be lawful and fair.”
Section 35(2) states:
“The processing of personal data for any of the law enforcement purposes is lawful only if and to the extent that it is based on law and either—
(a) the data subject has given consent to the processing for that purpose, or
(b) the processing is necessary for the performance of a task carried out for that purpose by a competent authority.”
The Police Federation has said that it is unlikely that section 35(2)(a) will apply in this context. It has also said that in the case of 35(2)(b), the test of whether the processing is “necessary” is exacting, requiring a competent authority to apply its mind to the proportionality of processing specific items of personal data for the particular law enforcement purpose in question.
Under sections 35(3) to 35(5), where the processing is “sensitive processing”, an even more rigorous test applies, requiring among other things that the processing is
“strictly necessary for the law enforcement purpose”
in question. Section 37 states:
“The third data protection principle is that personal data processed for any of the law enforcement purposes must be adequate, relevant and not excessive in relation to the purpose for which it is processed.”
For the purposes of the 2018 Act, the Crown Prosecution Service and each police force are separate competent authorities and separate data controllers. Therefore, as set out in section 34(3), the CPS and each police force must comply with the data protection principles. A transfer of information by a police force to the CPS amounts to the processing of personal data.
The tests of “necessary” and “strictly necessary” under the first and third data protection principles require a competent authority to identify and consider each and every item of personal data contained within the information that it is intended to process and to consider whether it is necessary for that item of personal data to be processed in the manner intended. The impact of this is that when preparing a case file for a charging decision from the CPS, the police must spend huge amounts of time and resources analysing information that has been gathered by investigating officers in order to identify every item of personal data. They then have to decide whether it is necessary or, in many cases, strictly necessary, for the CPS to consider each item of personal data when making its charging decision, and to redact every item of personal data that does not meet that test.
The National Police Chiefs’ Council and the CPS have produced detailed guidance on this redaction process. It emphasises that the 2018 Act is a legal requirement and that the police and the CPS do not have any special relationship that negates the need to redact and protect personal information. The combination of the requirements of the guidance and of the Act represents a huge amount of administrative work for police officers, resulting in hours of preparing appropriate redactions. Furthermore, such work is inevitably carried out by relatively junior officers who have no particular expertise in data protection, and much of it may never be used by the CPS if the matter is not charged or if the defendant pleads guilty before trial. Nationally, about 25% of cases that are submitted to the CPS are not charged. A significant proportion of that time and money could be saved if the redaction of personal data by the police occurred after, rather than before, the CPS had made a charging decision.
The burden that this is placing on police forces was highlighted in the 2022 “Annual Review of Disclosure” by the Attorney General’s Office, which heard evidence from police that
“redaction of material for disclosure is placing a significant pressure on resources”.
It also found that one police force had invested £1 million in a disclosure specialist team solely to deal with redaction. In its report on policing priorities, the Home Affairs Committee stated:
“The National Police Chiefs’ Council and the College of Policing said this ‘labour-intensive’ process ‘ties up police resources for a protracted period of time’, meaning investigations take longer, and possibly adds to the likelihood of victims withdrawing their support for a case. The College noted that the problem has become worse as digital devices such as phones and laptops have developed ever greater storage capacity, meaning there is more data for the police to process and redact. Disparities in digital capabilities across the 43 local forces also exacerbate the problem.”
The report went on to say:
“Lengthy and inefficient redaction processes and protracted investigations are neither effective nor fair on either victims or suspects. The handling of case files needs to comply with data protection laws. However, ensuring that the requirements are proportionate and that forces have the digital capacity to meet such requirements efficiently is an urgent issue that needs addressing. More needs to be done to pilot solutions and get the balance right.”
Furthermore, the Police Federation and the National Police Chiefs’ Council estimate that the cost nationally of the redaction exercise is over £5.6 million per annum. There is no disputing that there is a clear issue here, and I welcome that this has been acknowledged by Ministers I have been engaging with, including the Minister for Crime, Policing and Fire, my right hon. Friend the Member for Croydon South (Chris Philp); the former Home Secretary, my right hon. and learned Friend the Member for Fareham (Suella Braverman); and the Minister for Data and Digital Infrastructure, the right hon. Member for Maldon (Sir John Whittingdale). Only last week, the latter emphasised to me the Government’s support for reform.
Indeed, the autumn statement last week highlighted the Government’s commitment to boosting public sector productivity by running an ambitious public sector productivity programme with all Departments to reimagine the way public services are delivered. The focus of that will be on
“reducing the amount of time our key frontline workers, including police, doctors, and nurses, spend on administrative tasks”.
That is to ensure that they can spend more time delivering for the public. Arguably, the current process of data redaction is the biggest unnecessary administrative task keeping police officers away from the frontline, so reform needs to be implemented urgently.
My new clause lays out a blueprint for that reform and would insert a proposed new section into the 2018 Act to exempt the police service and the CPS from complying with the first data protection principle—except in so far as that principle requires processing to be fair—or with the third data protection principle when preparing a case file for submission to the CPS for a charging decision, thereby facilitating the free flow of personal data between the police and the CPS. If the CPS decided to charge, the case file would be returned to the police to carry out the redaction exercise before there was any risk of the file being disclosed to any person or body other than the CPS. In the 25% of cases in which the CPS decides not to charge, the unredacted file would simply be deleted by the CPS.
My new clause would have no obvious disadvantages, as the security of the personal data would not be compromised and the necessary redactions would still be undertaken once a charging decision had been made. Furthermore, providing material unredacted to the CPS pre-charge would not impact the timeliness of the process in any way, as the police would still be providing the same material to the CPS as they would have done previously, just unredacted.
I know from my conversations with Ministers that there are a few questions from a number of sources about whether legislative change is the best way to tackle the issues surrounding redaction. To that, the Police Federation has said that
“the hope is that the CPS will set out, within their charging advice, what material they intend to rely upon and, therefore, only the required material will have to be redacted by the police. This would be done in line with the maximum time of service set out within the ‘Better case management handbook Jan 23’, which states that service is required no less than five days before the hearing. So we must accept that there may be a slight delay in the CPS being able to serve their case on the defence at the point of charge. But the time it will take police forces to apply to the CPS for a charging decision will be far shorter without the need to redact, stopping defendants being on bail or under ‘released under investigation’ status for as long as they currently are, and meaning victims of crime wait less time for charging decisions.”
In addition, the Police Federation has highlighted that while auto-redaction software will help to mitigate the current issues, it will not recover all policing capacity in respect of redaction. Officers will still need to review each item to consider what auto-redaction parameters need applying; otherwise, police could risk ending up with mass over-redaction and having to check to ensure nothing has been missed. The real benefit of auto-redaction software will come post-charge, especially if the CPS states exactly what material it intends to use or disclose.
I also appreciate that the Government feel they cannot support my amendment because of three technical legal points, and I would like to summarise the Police Federation’s response to this, based on advice from its leading counsel who are experienced in the field of data protection and privacy.
The Government’s first objection is that there are provisions in the 2018 Act, other than the first and third data protection principles, that
“in effect require the material concerned to be reviewed and redacted”.
The two examples given by the Home Office were the sixth data protection principle and section 44. The sixth data protection principle—data security—does not require case files to be redacted. The same standard of
“appropriate technical or organisational measures”
is required whether case files are redacted before or after the CPS has made a charging decision. The Police Federation’s leading counsel has pointed out that section 44(4) of the Act already contains potentially relevant restrictions on a data subject’s rights. Those restrictions during an investigation would be consistent with an amendment providing for the police to redact any given case file only after the CPS has decided to charge.
I rise to speak to the six amendments that I have tabled to the Bill. I am grateful to Mr Speaker for selecting amendment 11, which I will press to a vote. It is an extremely important amendment that I hope will unite Members across the House, and I thank the hon. Member for Glasgow North (Patrick Grady) for confirming his party’s support for it.
I thank my hon. Friend for that.
I have been contacted by many people and organisations about issues with the Bill. The British Medical Association and the National AIDS Trust have serious concerns, which I share, about the sharing of healthcare data and the failure to consider the negative impact of losing public trust in how the healthcare system manages data.
The Bill is an opportunity to adapt the UK’s data laws to strengthen accountability and data processing, but it currently fails to do so. It provides multiple Henry VIII powers that will enable future Secretaries of State to avoid parliamentary scrutiny and write their own rules. It undermines the independence of the Information Commissioner’s Office in a way that provides less protection to individuals and gives more power to the Government to restrict and interfere with the role of the commissioner.
The Government’s last-minute amendments to their own Bill, to change the rules on direct marketing in elections and give themselves extensive access to the bank accounts of benefit claimants, risk alienating people even further. I hope the House tells Ministers that it is entirely improper—in fact, it is completely unacceptable—for the Government to make those amendments, which require full parliamentary scrutiny, at this late stage.
We know people already do not trust the Government with NHS health data. The Bill must not erode public trust even more. We have seen concerns about data with GP surgeries and the recent decision to award Palantir the contract for the NHS’s federated data platform. A 2019 YouGov survey showed that only 30% of people trust the Government to use data about them ethically. I imagine that figure is much lower now. How do the Government plan to establish trust with the millions of people on pension credit, state pension, universal credit, child benefit and others whose bank accounts—millions of bank accounts—they will be able to access under the Bill? As my hon. Friend the Member for Rhondda (Sir Chris Bryant) and others have asked, legislative powers already exist where benefit fraud is suspected, so why is the amendment necessary?
My amendment 11 seeks to ensure that special category data, such as that relating to a person’s health, is adequately protected in workplace settings. As the Bill is currently worded, it could allow employers to share an employee’s personal data within their organisation without a justifiable reason. The health data of all workers will be at risk if the amendment falls. We must ensure that employees’ personal data, including health data, is adequately protected in workplace settings and not shared with individuals who do not need to process it.
The National AIDS Trust is concerned that the Bill’s current wording could mean that people’s HIV status can be shared without their consent in the workplace, using the justification that it is “necessary for administrative purposes”. That could put people living with HIV at risk of harassment and discrimination in the workplace. The sharing of individuals’ HIV status can lead to people living with HIV experiencing further discrimination and increase their risk of harassment or even violence.
I am concerned about the removal of checks on the police processing of an individual’s personal data. We must have such checks. The House has heard of previous incidents involving people living with HIV whose HIV status was shared without their consent by police officers, both internally within their police station and in the wider communities they serve. Ensuring that police officers must justify why they have accessed an individual’s personal data is vital for evidence in cases of police misconduct, including where a person’s HIV status is shared inappropriately by the police or when not relevant to an investigation into criminal activity.
The Bill is not robust enough on the transfer of data internationally. We need to ensure that there is a mandated annual review of the data protection test for each country so that the data protection regime is secure, and that people’s personal data, such as their LGBTQ+ identity or HIV status, will not be shared inappropriately. LGBTQ+ identities are criminalised in many countries, and the transfer of personal data to those countries could put an individual, their partner or their family members at real risk of harm.
I have tabled six amendments, which would clarify what an “administrative purpose” is when organisations process employees’ personal data; retain the duty on police forces to justify why they have accessed an individual’s personal data; ensure that third countries’ data protection tests are reviewed annually; and ensure that the Secretary of State seeks the views of the Information Commissioner when assessing other countries’ suitability for the international transfer of data. I urge all Members to vote for amendment 11, and I urge the Government and the other place to take on board all the points raised in today’s debate and in amendments 12 to 16 in my name.
I rise to speak to new clause 2, which, given its low number, everyone will realise I tabled pretty early in the Bill’s passage. It addresses the smart data clauses that sit as a block in the middle of the Bill.
It is wonderful to see the degree of cross-party support for the smart data measures. The shadow Minister’s remarks show that the Labour Front Bench have drunk deeply from the Kool-Aid, in the same way as the rest of us. It is vital that the measures move forward as fast and as safely as possible, because they have huge potential for our economy and our GDP growth. As the Minister rightly said, they seek to build on the undoubted world-leading success of our existing position in open banking.
My new clause is fairly straightforward, and I hope that the Minister will elaborate in his closing remarks on the two further measures that it seeks, which I and a number of other people urged the Secretary of State to take in a letter back in July. To underline the breadth of support for the measures, the letter was signed by the chief data and analytics officer of the NatWest Group, leading figures in the Financial Data and Technology Association, the co-founder and chief executive officer of Ozone API, the director general of the Payments Association, the founder and chief executive of Icebreaker One—who is, incidentally, now also chair of the Smart Data Council—the founder of Open Banking Excellence, and the CEO of the Investing and Saving Alliance. I am making not only a cross-party point, but a point that has widespread support among the very organisations involved in smart data, and particularly the open banking success that we all seek to replicate.
If we are to replicate our success in open banking across other parts of our economy, we need two things to be true. First, we must make sure that all data standards applied in other sectors are interoperable with the data standards that already exist in open banking. The point is that data standards will be different in each sector, because each sector’s data is held in different ways, in different places and by different people, under different foundational legal powers, but they must all converge on a set of standards that means that health data can safely and securely talk to, say, energy data or banking data.
Following on from my earlier intervention, when the Minister was talking about Government new clause 27: if we are to have data standards that allow different bits of data to be exchanged safely and securely, it is essential that we do not end up with siloed standards in different sectors that do not interoperate and cannot talk to each other. Otherwise, we will completely fail to leverage our existing lead in open banking, and we will effectively have to reinvent the wheel from scratch every time we open up a new sector.
I hope that, by the time the Minister responds to the various points raised in this debate, inspiration will have struck and he will be able to confirm that, although we might have different data standards, it is the Government’s intention that those standards will all be interoperable so that we avoid the problem of balkanisation, if I can put it that way. I hope he will be able to provide us with a strong reassurance in that direction.
I agree with the hon. Gentleman on this, but quite a lot of steps need to be taken here. For instance, we might need to mandate standards on smart meters in order to be able to take advantage of these measures. We have not been given any kind of plans so far—unless he has seen something.
I wish I had seen something, because then I would be able to pull my amendment or inform the House. I have not seen anything, and I think such a plan is essential, not just for Members in the Chamber this afternoon, but for all those investors, business leaders and app developers. It would allow them to work out the critical path, the minimum viable products and everything else that will be necessary, and by what date, for the sectors they are aiming at. So the hon. Gentleman is absolutely right in what he says, and it is vital that, if the Minister cannot come up with the timetable this afternoon, he at least comes up with a timetable for the timetable, so that we all know when the thing will be available and the rest of the open banking industry can work out how it is to become an “open everything” industry, in what order and by what time.
So this is fairly straightforward. There are promising signs, both in the autumn statement and in the Government’s new clause 27, but further details need to be tied down before they can be genuinely useful. I am assuming, hoping and praying that the Minister will be able to provide some of those reassurances and details when he makes his closing remarks, and I will therefore be able to count this as a probing amendment and push it no further. I am devoutly hoping that he will be able to make that an easier moment for me when he gets to his feet.
I apologise to right hon. and hon. Members for any confusion that my movements around the Chamber may have created earlier, Mr Deputy Speaker.
New clause 45 is about the comparability and interoperability of health data across the UK. I say to the hon. Member for Rhondda (Sir Chris Bryant), the Opposition spokesman, that I have never been called pregnant before—that is a new description—but I will return to his point shortly in these brief remarks. Three reasons why data comparability matters are worth stating. First, it empowers patients: the publication of standardised outcomes gives patients the ability to make informed choices about their treatment and where they may choose to live. Secondly, it strengthens care through better professional decision making, allowing administrators to manage resources and scientists to interpret the data they receive. Thirdly, comparable data strengthens devolution, administration and policy making in the health sector; transparent and comparable data ensures that we, as politicians, are accountable to voters for the quality of services in our area.
We could have an academic and philosophical discussion about this, but what brought me to table new clause 45 is the state of healthcare in north Wales. We have a health board that has been in special measures for the best part of eight years, and I have to wonder whether that would be the case if the scrutiny of it were greater. One of the intentions of devolution was to foster best practice, but for that to happen we need comparability, which we have not had in the health sector.
For example, NHS Scotland does not routinely publish standard referral to treatment (RTT) times; where it does publish them, it provides not averages and percentiles but the proportion of cases meeting Scotland-only targets. In Wales, RTTs are broadly defined as the time spent waiting between a referral for a procedure and getting that procedure. In England, only consultant-led pathways are reported, but in Wales some non-consultant-led pathways are included, such as direct access diagnostics and allied health professional therapies such as physiotherapy and osteopathy, which inevitably impact waiting times.
On cancer waiting times, England and Scotland have a target of a test within six weeks. However, there are different numbers of tests—eight north and 15 south of the border—and different measures for when the period ends—until the last test is completed in England or until the report is written up in Scotland. Those who understand health matters will make better sense of what those differences mean, but I simply make the observation that there are differences.
In Wales, the way we deal with cancer waiting times is different. Wales starts its 62-day treatment target from the date the first suspicion is raised by any health provider, whereas in England the 62-day target is from the date a specialist receives an urgent GP referral. Furthermore, in Wales routine referrals reprioritised as “urgent, with suspicion of cancer” are considered to be starting a new clock.
What can be done about this and why does it require legislation? New clause 45 may seem familiar to hon. Members because it was first brought forward as an amendment to the Health and Care Bill in 2022. It was withdrawn with the specific intention of giving the Government the time to develop a collaborative framework for sharing data with the devolved Administrations. I pay tribute to all four Governments, the Office for National Statistics and officials for their work since then.
Notwithstanding that work, on 5 September 2023 Professor Ian Diamond, the UK national statistician, made the following remarks to the Public Administration and Constitutional Affairs Committee about gathering comparative health data across the devolved Administrations:
“You are entirely right that statistics is a devolved responsibility and therefore the data that are collected for administrative purposes in different parts of the United Kingdom differ. We have found it very difficult recently to collect comparable data for different administrations across the UK on the health service, for example.”
On working more closely with the devolved Administrations’ own statistical authorities, he said:
“We have been working very hard to try to get comparable data. Comparable data are possible in some areas but not in others. Trying to get cancer outcomes—”
as I have just referred to—
“is very difficult because they are collected in different ways… While statistics is devolved, I do not have the ability to ensure that all data are collected in a way that is comparable. We work really hard to make comparable data as best as possible, but at the moment I have to be honest that not all data can be compared.”
Mr Deputy Speaker, new clause 45 was brought forward as a constructive proposal. I believe that it is good for the patients, good for the professionals who work on their healthcare, and good for our own accountability. I do not think that this House would be divided on grounds of compassion or common sense. I thank all those Members who have supported my new clause and urge the Government to legislate on this matter. Today was an opportunity for me to discuss the issues involved, but I shall not be moving my new clause.
With the leave of the House, I call the Minister to wind up the debate.
I thank all hon. Members who have contributed to the debate. I believe that these matters are important, if sometimes very complicated and technical. My hon. Friend the Member for Yeovil (Mr Fysh) was absolutely right to stress how fundamentally important they are, and they will become more so.
I also thank the shadow Minister for identifying the areas where we are in agreement. We had a good Committee stage with his colleague, the hon. Member for Barnsley East (Stephanie Peacock), where we agreed on the overall objectives of the Bill. It is welcome that the shadow Minister has supported us, particularly on the amendment that we moved this afternoon on the powers of the Information Commissioner’s Office, the provisions relating to digital verification services, and smart data. There were, however, some areas on which we will not agree.
Let me begin by addressing the main amendments that the hon. Gentleman has moved. Amendment 1 relates to high-risk processing. One of the main aims of the Bill is to remove some of the UK GDPR’s unnecessary compliance burdens. That is why organisations will be required to designate senior responsible individuals, carry out risk assessments and keep records of processing only when their activities pose high risks to individuals. The amendments that the hon. Gentleman proposes would reintroduce a prescriptive list of high-risk processing activities drawn from article 35 of the UK GDPR. We find some of the language in article 35 unclear and confusing, which is partly why we removed it in the first place. We think organisations should be able to make a judgment of risk based on the specific nature, scale and context of their own processing activities. We do not need to provide prescriptive examples of high-risk processing in the legislation, because any list could quickly become out of date. Instead, to help data controllers, clause 18 of the Bill requires the ICO to produce a document with examples of what the commissioner considers to be high-risk processing.
But the Minister has already indicated that, basically, he will come forward with exactly the same list as is in the legislation that the Government are amending. All that is happening is that, in the Bill, the Information Commissioner will be doing what the Government or the House could be doing, and this is the one area where the Government disagree with the Information Commissioner.
As I say, the Government do not believe that it is necessary to have a prescriptive list in the Bill. We feel that it is better that individuals make a judgment based on their assessment of the risk, with the guidance of the Information Commissioner.
Moving to the shadow Minister’s second amendment, the Government agree that controllers should not be able to refuse a request without proper thought or consideration. That is why the existing responsibility of controllers to facilitate requests from data subjects by default has not changed, and why the new article 12A also ensures that the burden of proof for a request meeting the vexatious or excessive threshold remains with the controller. The Government believe that is sufficient; stipulating that evidence must be provided each time a request is refused may not be appropriate in all circumstances and would be likely to bring further burdens for controllers. On that basis, we oppose the amendment.
On amendment 5, the safeguards set out in reformed article 22 of the UK GDPR ensure that individuals are able to seek human intervention when significant decisions about them are taken solely through automated means with no meaningful human involvement.
Partly automated decisions already involve meaningful human involvement, so there is no need to extend the safeguards in article 22 to all forms of automated decision making. In such instances, other data protection requirements continue to apply and offer relevant protections to data subjects, as set out in the broader UK data protection regime. Those protections include lawfulness, fairness, transparency and accountability.
My understanding was that the level of fraud among state pension claims was indeed extremely small. The Minister said earlier that the Government should take powers only where they are absolutely necessary; I think he is now saying that they are not necessary in the case of people claiming a state pension. Is he confident that that bit of this power—to look into the bank account of anybody claiming a state pension—is absolutely necessary?
What I am saying is that the Government’s intention is to use the power only when there is clear evidence or suggestion that fraud is taking place on a significant scale. The Government simply want to retain the option to amend that should future evidence emerge; that is why the issue has been left open.
The trouble is that this is not about amending. The Government describe the relevant benefits in part 5 of proposed new schedule 3B, within new schedule 1, which is clear that pensions are included. The Minister has effectively said at the Dispatch Box that the Government do not need to tackle fraud in relation to pensions; perhaps it would be a good idea for us to all sit down and have a meeting to work out a more sensible set of measures to tackle fraud where it is necessary, rather than giving unending powers to the Government.
I agree, to the extent that, with levels of fraud in state pensions currently close to zero, the power is not needed in that case. However, the Government wish to retain the option should the position change in the future. I am nevertheless happy to take the hon. Gentleman up on his request on behalf of my hon. Friend the Minister for Disabled People, Health and Work, with whom he has already engaged. I am sure that the right hon. Member for East Ham will want to examine the issue further in the Work and Pensions Committee, which he chairs, and it will undoubtedly also be subject to further discussion in the other place. We are certainly open to further discussion.
The right hon. Member for East Ham also raised the question of commencement. I can tell him that the test and learn phase will begin in 2025, with a steady roll-out to full-scale delivery by 2030. I am sure that he will want to examine these matters further.
The amendment tabled by my right hon. Friend the Member for Haltemprice and Howden (Mr Davis) focuses on digital exclusion. The Bill provides for the use of secure and inclusive digital identities across the economy. It does not force businesses or individuals to use them. Individual choice is integral to our approach. As the Bill makes clear, digital verification services can be provided only at the request of the individual. Where people want to use a digital verification service, the Government are committed to ensuring that available products and services are secure and privacy-focused. That is to be achieved through the high standards set out in the trust framework.
The trust framework also outlines how services can improve inclusion, and requires services to publish an annual inclusion monitoring report. There are businesses that operate only in the digital sphere, such as some online banks and energy companies, as I think has been acknowledged. We feel that to oblige them to offer manual document checking would place obligations on businesses that go beyond the Government’s commitment to do only what is necessary to enable the digital market to grow.
On amendment 224 from the Scottish National party, solely automated decision making that produces legal or similarly significant effects on individuals was not entirely prohibited previously under the UK’s data protection legal framework. The rules governing article 22 are confusing and complex, so clause 12 clarifies and simplifies the rules related to solely automated decision making, and will reduce barriers to responsible data use, help to drive innovation, and maintain high standards of data protection. The reforms do not water down any of the protections to data subjects offered under the broader UK data protection regime—that is, UK GDPR and the Data Protection Act 2018.
On the other amendment tabled by the SNP, amendment 229, effective independent oversight of surveillance camera systems is crucial to public trust. The oversight framework is complex and confusing for the police and the public because of the substantial duplication between the surveillance camera commissioner’s functions and code, which cover police and local authorities in England and Wales only, and the ICO and data protection legislation. The Bill addresses that, following public consultation, by abolishing the surveillance camera commissioner and code.
The amendment tabled by the hon. Member for Glasgow North would negate that by retaining the code and transferring the surveillance camera commissioner functions to the investigatory powers commissioner. It would also blur the lines between overt and covert surveillance, which the investigatory powers commissioner oversees. Those two types of surveillance have distinct legislation and oversight, mainly because covert surveillance is generally considered to be significantly more intrusive.
On amendment 222, it is important to be clear that the ability to refuse or charge a reasonable fee for a request already exists, and clause 8 does not place new restrictions on reasonable requests from data subjects. The Government believe that it is proportionate to allow controllers to refuse or charge a reasonable fee for vexatious or excessive requests, and a clearer provision enables controllers to focus time and resources on responding to reasonable requests instead.
Amendments 278 and 279, tabled by my hon. Friend the Member for Yeovil, would remove the new lawful ground of recognised legitimate interests, which the Bill will add to article 6 of the UK GDPR. Amendment 230 accepts that there is merit in retaining the recognised legitimate interests list, but would make any additions to it subject to a super-affirmative parliamentary procedure. It is true that the Bill removes the need for non-public-sector organisations to do a detailed legitimate interests assessment in relation to a small number of processing activities, including, for example, activities relating to the safeguarding of children, crime prevention and responding to emergencies. We heard from stakeholders that the need to do an assessment, and the fear of getting it wrong, could sometimes delay or deter those important processing activities. Future Governments would not be able to add new activities to the list lightly: clause 5 of the Bill already makes it clear that the Secretary of State must carefully consider the rights and interests of people, and in particular the special protection needed for children, before adding anything new to the list. Any new regulations would also need to be approved via the affirmative resolution procedure.
My hon. Friend the Member for Yeovil has tabled a large number of other amendments, which are complicated in nature. I have written to him in some detail setting out the Government’s response to each of those, but if he wishes to pursue further any of the points contained therein I would be very happy to have further discussions with him.
I would like to comment on the amendments from several of my colleagues that I wish I were in a position to support. In particular, my hon. Friend the Member for Loughborough (Jane Hunt) has been assiduous in pursuing her point, both in the Bill Committee and in this debate. The problem she identifies is without question a very real one, and she set out in some detail how it is massively increasing the burden on the police, which clearly we would wish to reduce wherever possible.
I have had meetings with Home Office Ministers, as my hon. Friend has, and they absolutely recognise the problem and share her wish. While we welcome her intent, the problem is that we do not think her amendment as drafted would achieve her aim of removing the burden of redaction. To do so would require amending, and providing exceptions to, more principles than those identified in the amendment; indeed, it would require the amendment of more laws than just the Data Protection Act 2018.
The Government are absolutely committed to reducing the burden on the police, but it is obviously important that, if we do so, we do it right, and that the solution works comprehensively. We are therefore actively working on ways to better address the issue, including through improved process, new technology, guidance and legislation. I am very happy to continue to work with her on achieving the aim that we all share and so too, I know, are colleagues in the Home Office.
With respect to the amendments tabled by my hon. Friend the Member for Weston-super-Mare (John Penrose), as I indicated, we absolutely share his enthusiasm for smart data and ensuring that the powers within the Bill are implemented in a timely manner, with interoperability at their core. While I agree that we can only fully realise the benefits of smart data schemes if they enable interoperability, different sectors will have different levels of existing digital infrastructure and capability. Thus, we could inadvertently hinder the success of future schemes if we mandated the use of one universal set of standards based, for instance, on those used in open banking.
The Government will ensure that interoperability is central to the development of smart data schemes. To support our thinking, we are working with industry and regulators in the Smart Data Council to identify the technical infrastructure that needs to be replicated. With regard to the timeline—or even the timeline for a timeline—that my hon. Friend asked for, I recognise that it is important to build investor, industry and consumer confidence by outlining the Government’s planned timeline.
My hon. Friend is right to highlight the Chancellor’s comments in the autumn statement, where we set out plans to kick-start the smart data big bang and our ambition for using those powers across seven sectors. At this stage I am afraid I am not able to accept his amendment, but it is our intention to set out those plans in more detail in the coming months. I know that the Under-Secretary of State for Business and Trade, my hon. Friend the Member for Thirsk and Malton (Kevin Hollinrake), and I will be happy to work with him to do so.
The aim of the amendment tabled by the hon. Member for Jarrow (Kate Osborne) was to clarify that, when special category data of employees, such as health data, is transferred between members of a group of undertakings for internal administrative purposes on grounds of legitimate interests, the conditions and safeguards outlined in Schedule 1 to the Data Protection Act 2018 should apply to that processing. The Government agree with the sentiment of her amendment. The current legal framework already requires controllers to identify an exemption under article 9 of the UK GDPR if they are processing special category data, and those exemptions are supplemented by the conditions and safeguards outlined in Schedule 1. Under those provisions, employers can process special category data where the processing is necessary to comply with obligations under employment law. We do not therefore consider the amendment necessary.
Finally, I turn to new clause 45, tabled by my hon. Friend the Member for Aberconwy (Robin Millar). The Government are absolutely committed to improving the availability of comparable UK-wide data. He, too, has been assiduous in promoting that cause, and we are very happy to work with him. We are extremely supportive of the principle underlying his amendment. He is right that people have the right to know the extent of Labour’s failings with the NHS in Wales, and his new clause sends an important message about our commitment to better data. I can commit to working at pace with him and the UK Statistics Authority to look at ways in which we may be able to implement the intentions of his amendment, and to bringing forward legislative changes following those discussions.
On that basis, I commend the Government amendments to the House.
Question put and agreed to.
New clause 6 accordingly read a Second time, and added to the Bill.
For the benefit of all Members, we are before the knife, so we will have to go through a sequence of procedures. It would help me, the Clerk and the Minister if we had a degree of silence. This will take a little time, and we need to be able to concentrate.

Term | Provision
accredited conformity assessment body | section 50(7)
approved supplementary code | section (Approval of a supplementary code)(6)
designated supplementary code | section (Designation of a supplementary code)(3)
digital verification services | section 48(2)
the DVS register | section 50(2)
the DVS trust framework | section 49(2)(a)
the main code | section 49(2)(b)
recognised supplementary code | section (List of recognised supplementary codes)(2)
supplementary code | section 49(2)(c)
supplementary note | section (Supplementary notes)(6)
the data protection legislation | section 236
New Clause 48
Processing of personal data revealing political opinions
“(1) Schedule 1 to the Data Protection Act 2018 (special categories of personal data) is amended in accordance with subsections (2) to (5).
(2) After paragraph 21 insert—
‘Democratic engagement
21A (1) This condition is met where—
(a) the personal data processed is personal data revealing political opinions,
(b) the data subject is aged 14 or over, and
(c) the processing falls within sub-paragraph (2),
subject to the exceptions in sub-paragraphs (3) and (4).
(2) Processing falls within this sub-paragraph if—
(a) the processing—
(i) is carried out by an elected representative or a person acting with the authority of such a representative, and
(ii) is necessary for the purposes of discharging the elected representative’s functions or for the purposes of the elected representative’s democratic engagement activities,
(b) the processing—
(i) is carried out by a registered political party, and
(ii) is necessary for the purposes of the party’s election activities or democratic engagement activities,
(c) the processing—
(i) is carried out by a candidate for election as an elected representative or a person acting with the authority of such a candidate, and
(ii) is necessary for the purposes of the candidate’s campaign for election,
(d) the processing—
(i) is carried out by a permitted participant in relation to a referendum or a person acting with the authority of such a person, and
(ii) is necessary for the purposes of the permitted participant’s campaigning in connection with the referendum, or
(e) the processing—
(i) is carried out by an accredited campaigner in relation to a recall petition or a person acting with the authority of such a person, and
(ii) is necessary for the purposes of the accredited campaigner’s campaigning in connection with the recall petition.
(3) Processing does not meet the condition in sub-paragraph (1) if it is likely to cause substantial damage or substantial distress to an individual.
(4) Processing does not meet the condition in sub-paragraph (1) if—
(a) an individual who is the data subject (or one of the data subjects) has given notice in writing to the controller requiring the controller not to process personal data in respect of which the individual is the data subject (and has not given notice in writing withdrawing that requirement),
(b) the notice gave the controller a reasonable period in which to stop processing such data, and
(c) that period has ended.
(5) For the purposes of sub-paragraph (2)(a) and (b)—
(a) “democratic engagement activities” means activities whose purpose is to support or promote democratic engagement;
(b) “democratic engagement” means engagement by the public, a section of the public or a particular person with, or with an aspect of, an electoral system or other democratic process in the United Kingdom, either generally or in connection with a particular matter, whether by participating in the system or process or engaging with it in another way;
(c) examples of democratic engagement activities include activities whose purpose is—
(i) to promote the registration of individuals as electors;
(ii) to increase the number of electors participating in elections for elected representatives, referendums or processes for recall petitions in which they are entitled to participate;
(iii) to support an elected representative or registered political party in discharging functions, or carrying on other activities, described in sub-paragraph (2)(a) or (b);
(iv) to support a person to become a candidate for election as an elected representative;
(v) to support a campaign or campaigning referred to in sub-paragraph (2)(c), (d) or (e);
(vi) to raise funds to support activities whose purpose is described in sub-paragraphs (i) to (v);
(d) examples of activities that may be democratic engagement activities include—
(i) gathering opinions, whether by carrying out a survey or by other means;
(ii) communicating with electors.
(6) In this paragraph—
“accredited campaigner” has the meaning given in Part 5 of Schedule 3 to the Recall of MPs Act 2015;
“candidate” , in relation to election as an elected representative, has the meaning given by the provision listed in the relevant entry in the second column of the table in sub-paragraph (7);
“elected representative” means a person listed in the first column of the table in sub-paragraph (7) and see also sub-paragraphs (8) to (10);
“election activities” , in relation to a registered political party, means—
(a) campaigning in connection with an election for an elected representative, and
(b) activities whose purpose is to enhance the standing of the party, or of a candidate standing for election in its name, with electors;
“elector” means a person who is entitled to vote in an election for an elected representative or in a referendum;
“permitted participant” has the same meaning as in Part 7 of the Political Parties, Elections and Referendums Act 2000 (referendums) (see section 105 of that Act);
“recall petition” has the same meaning as in the Recall of MPs Act 2015 (see section 1(2) of that Act);
“referendum” means a referendum or other poll held on one or more questions specified in, or in accordance with, an enactment;
“registered political party” means a person or organisation included in a register maintained under section 23 of the Political Parties, Elections and Referendums Act 2000;
“successful” , in relation to a recall petition, has the same meaning as in the Recall of MPs Act 2015 (see section 14 of that Act).
(7) This is the table referred to in the definitions of “candidate” and “elected representative” in sub-paragraph (6)—

Elected representative | Candidate for election as an elected representative
(a) a member of the House of Commons | section 118A of the Representation of the People Act 1983
(b) a member of the Senedd | article 84(2) of the National Assembly for Wales (Representation of the People) Order 2007 (S.I. 2007/236)
(c) a member of the Scottish Parliament | article 80(1) of the Scottish Parliament (Elections etc) Order 2015 (S.S.I. 2015/425)
(d) a member of the Northern Ireland Assembly | section 118A of the Representation of the People Act 1983, as applied by the Northern Ireland Assembly (Elections) Order 2001 (S.I. 2001/2599)
(e) an elected member of a local authority within the meaning of section 270(1) of the Local Government Act 1972, namely— (i) in England, a county council, a district council, a London borough council or a parish council; (ii) in Wales, a county council, a county borough council or a community council | section 118A of the Representation of the People Act 1983
(f) an elected mayor of a local authority within the meaning of Part 1A or 2 of the Local Government Act 2000 | section 118A of the Representation of the People Act 1983, as applied by the Local Authorities (Mayoral Elections) (England and Wales) Regulations 2007 (S.I. 2007/1024)
(g) a mayor for the area of a combined authority established under section 103 of the Local Democracy, Economic Development and Construction Act 2009 | section 118A of the Representation of the People Act 1983, as applied by the Combined Authorities (Mayoral Elections) Order 2017 (S.I. 2017/67)
(h) a mayor for the area of a combined county authority established under section 9 of the Levelling-up and Regeneration Act 2023 | section 118A of the Representation of the People Act 1983, as applied by the Combined Authorities (Mayoral Elections) Order 2017 (S.I. 2017/67)
(i) the Mayor of London or an elected member of the London Assembly | section 118A of the Representation of the People Act 1983
(j) an elected member of the Common Council of the City of London | section 118A of the Representation of the People Act 1983
(k) an elected member of the Council of the Isles of Scilly | section 118A of the Representation of the People Act 1983
(l) an elected member of a council constituted under section 2 of the Local Government etc (Scotland) Act 1994 | section 118A of the Representation of the People Act 1983
(m) an elected member of a district council within the meaning of the Local Government Act (Northern Ireland) 1972 (c. 9 (N.I.)) | section 130(3A) of the Electoral Law Act (Northern Ireland) 1962 (c. 14 (N.I.))
(n) a police and crime commissioner | article 3 of the Police and Crime Commissioner Elections Order 2012 (S.I. 2012/1917)
(8) For the purposes of the definition of “elected representative” in sub-paragraph (6), a person who is—
(a) a member of the House of Commons immediately before Parliament is dissolved,
(b) a member of the Senedd immediately before Senedd Cymru is dissolved,
(c) a member of the Scottish Parliament immediately before that Parliament is dissolved, or
(d) a member of the Northern Ireland Assembly immediately before that Assembly is dissolved,
is to be treated as if the person were such a member until the end of the period of 30 days beginning with the day after the day on which the subsequent general election in relation to that Parliament or Assembly is held.
(9) For the purposes of the definition of “elected representative” in sub-paragraph (6), where a member of the House of Commons’s seat becomes vacant as a result of a successful recall petition, that person is to be treated as if they were a member of the House of Commons until the end of the period of 30 days beginning with the day after—
(a) the day on which the resulting by-election is held, or
(b) if earlier, the day on which the next general election in relation to Parliament is held.
(10) For the purposes of the definition of “elected representative” in sub-paragraph (6), a person who is an elected member of the Common Council of the City of London and whose term of office comes to an end at the end of the day preceding the annual Wardmotes is to be treated as if the person were such a member until the end of the fourth day after the day on which those Wardmotes are held.’
(3) Omit paragraph 22 and the italic heading before it.
(4) In paragraph 23 (elected representatives responding to requests)—
(a) leave out sub-paragraphs (3) to (5), and
(b) at the end insert—
‘(6) In this paragraph, “elected representative” has the same meaning as in paragraph 21A.’
(5) In paragraph 24(3) (definition of ‘elected representative’), for ‘23’ substitute ‘21A’.
(6) In section 205(2) of the 2018 Act (general interpretation: periods of time), in paragraph (i), for ‘paragraph 23(4) and (5)’ substitute ‘paragraph 21A(8) to (10)’.”—(Sir John Whittingdale.)
This new Clause inserts into Schedule 1 to the Data Protection Act 2018 (conditions for processing of special categories of personal data) a condition relating to processing by elected representatives, registered political parties and others of information about an individual’s political opinions for the purposes of democratic engagement activities and campaigning.
Brought up, read the First and Second time, and added to the Bill.
New Clause 7
Searches in response to data subjects’ requests
“(1) In Article 15 of the UK GDPR (right of access by the data subject)—
(a) after paragraph 1 insert—
‘1A. Under paragraph 1, the data subject is only entitled to such confirmation, personal data and other information as the controller is able to provide based on a reasonable and proportionate search for the personal data and other information described in that paragraph.’, and
(b) in paragraph 3, after ‘processing’ insert ‘to which the data subject is entitled under paragraph 1’.
(2) The 2018 Act is amended in accordance with subsections (3) and (4).
(3) In section 45 (law enforcement processing: right of access by the data subject), after subsection (2) insert—
‘(2A) Under subsection (1), the data subject is only entitled to such confirmation, personal data and other information as the controller is able to provide based on a reasonable and proportionate search for the personal data and other information described in that subsection.’
(4) In section 94 (intelligence services processing: right of access by the data subject), after subsection (2) insert—
‘(2ZA) Under subsection (1), the data subject is only entitled to such confirmation, personal data and other information as the controller is able to provide based on a reasonable and proportionate search for the personal data and other information described in that subsection.’
(5) The amendments made by this section are to be treated as having come into force on 1 January 2024.”—(Sir John Whittingdale.)
This new clause confirms that, in responding to subject access requests, controllers are only required to undertake reasonable and proportionate searches for personal data and other information.
Brought up, read the First and Second time, and added to the Bill.
New Clause 8
Notices from the Information Commissioner
“(1) The 2018 Act is amended in accordance with subsections (2) and (3).
(2) Omit section 141 (notices from the Commissioner).
(3) After that section insert—
‘141A Notices from the Commissioner
(1) This section applies in relation to a notice authorised or required by this Act to be given to a person by the Commissioner.
(2) The notice may be given to the person by—
(a) delivering it by hand to a relevant individual,
(b) leaving it at the person’s proper address,
(c) sending it by post to the person at that address, or
(d) sending it by email to the person’s email address.
(3) A “relevant individual” means—
(a) in the case of a notice to an individual, that individual;
(b) in the case of a notice to a body corporate (other than a partnership), an officer of that body;
(c) in the case of a notice to a partnership, a partner in the partnership or a person who has the control or management of the partnership business;
(d) in the case of a notice to an unincorporated body (other than a partnership), a member of its governing body.
(4) For the purposes of subsection (2)(b) and (c), and section 7 of the Interpretation Act 1978 (service of documents by post) in its application to those provisions, a person’s proper address is—
(a) in a case where the person has specified an address as one at which the person, or someone acting on the person’s behalf, will accept service of notices or other documents, that address;
(b) in any other case, the address determined in accordance with subsection (5).
(5) The address is—
(a) in a case where the person is a body corporate with a registered office in the United Kingdom, that office;
(b) in a case where paragraph (a) does not apply and the person is a body corporate, partnership or unincorporated body with a principal office in the United Kingdom, that office;
(c) in any other case, an address in the United Kingdom at which the Commissioner believes, on reasonable grounds, that the notice will come to the attention of the person.
(6) A person’s email address is—
(a) an email address published for the time being by that person as an address for contacting that person, or
(b) if there is no such published address, an email address by means of which the Commissioner believes, on reasonable grounds, that the notice will come to the attention of that person.
(7) A notice sent by email is treated as given 48 hours after it was sent, unless the contrary is proved.
(8) In this section “officer”, in relation to a body corporate, means a director, manager, secretary or other similar officer of the body.
(9) This section does not limit other lawful means of giving a notice.’
(4) In Schedule 2 to the Electronic Identification and Trust Services for Electronic Transactions Regulations 2016 (S.I. 2016/696) (Commissioner’s enforcement powers), in paragraph 1(b), for ‘141’ substitute ‘141A’.”—(Sir John Whittingdale.)
This amendment adjusts the procedure by which notices can be given by the Information Commissioner under the Data Protection Act 2018. In particular, it enables the Information Commissioner to give notices by email without obtaining the consent of the recipient to use that mode of delivery.
Brought up, read the First and Second time, and added to the Bill.
New Clause 9
Court procedure in connection with subject access requests
“(1) The Data Protection Act 2018 is amended as follows.
(2) For the italic heading before section 180 substitute—
‘Jurisdiction and court procedure’.
(3) After section 180 insert—
‘180A Procedure in connection with subject access requests
(1) This section applies where a court is required to determine whether a data subject is entitled to information by virtue of a right under—
(a) Article 15 of the UK GDPR (right of access by the data subject);
(b) Article 20 of the UK GDPR (right to data portability);
(c) section 45 of this Act (law enforcement processing: right of access by the data subject);
(d) section 94 of this Act (intelligence services processing: right of access by the data subject).
(2) The court may require the controller to make available for inspection by the court so much of the information as is available to the controller.
(3) But, unless and until the question in subsection (1) has been determined in the data subject’s favour, the court may not require the information to be disclosed to the data subject or the data subject’s representatives, whether by discovery (or, in Scotland, recovery) or otherwise.
(4) Where the question in subsection (1) relates to a right under a provision listed in subsection (1)(a), (c) or (d), this section does not confer power on the court to require the controller to carry out a search for information that is more extensive than the reasonable and proportionate search required by that provision.’”—(Sir John Whittingdale.)
This new clause makes provision about courts’ powers to require information to be provided to them, and to a data subject, when determining whether a data subject is entitled to information under certain provisions of the data protection legislation.
Brought up, read the First and Second time, and added to the Bill.
New Clause 10
Approval of a supplementary code
“(1) This section applies to a supplementary code whose content is for the time being determined by a person other than the Secretary of State.
(2) The Secretary of State must approve the supplementary code if—
(a) the code meets the conditions set out in the DVS trust framework (so far as relevant),
(b) an application for approval of the code is made which complies with any requirements imposed by a determination under section (Applications for approval and re-approval), and
(c) the applicant pays any fee required to be paid by a determination under section (Fees for approval, re-approval and continued approval)(1).
(3) The Secretary of State must notify an applicant in writing of the outcome of an application for approval.
(4) The Secretary of State may not otherwise approve a supplementary code.
(5) In this Part, an “approved supplementary code” means a supplementary code for the time being approved under this section.
(6) For when a code ceases (or may cease) to be approved under this section, see sections (Change to conditions for approval or designation), (Revision of a recognised supplementary code) and (Request for withdrawal of approval).”—(Sir John Whittingdale.)
This amendment sets out when a supplementary code of someone other than the Secretary of State must be approved by the Secretary of State.
Brought up, read the First and Second time, and added to the Bill.
New Clause 11
Designation of a supplementary code
“(1) This section applies to a supplementary code whose content is for the time being determined by the Secretary of State.
(2) If the Secretary of State determines that the supplementary code meets the conditions set out in the DVS trust framework (so far as relevant), the Secretary of State may designate the code as one which complies with the conditions.
(3) In this Part, a ‘designated supplementary code’ means a supplementary code for the time being designated under this section.
(4) For when a code ceases (or may cease) to be designated under this section, see sections (Change to conditions for approval or designation), (Revision of a recognised supplementary code) and (Removal of designation).”—(Sir John Whittingdale.)
This enables the Secretary of State to designate a supplementary code of the Secretary of State as one which complies with the conditions set out in the DVS trust framework.
Brought up, read the First and Second time, and added to the Bill.
New Clause 12
List of recognised supplementary codes
“(1) The Secretary of State must—
(a) maintain a list of recognised supplementary codes, and
(b) make the list publicly available.
(2) For the purposes of this Part, each of the following is a ‘recognised supplementary code’—
(a) an approved supplementary code, and
(b) a designated supplementary code.”—(Sir John Whittingdale.)
This amendment places the Secretary of State under a duty to publish, and keep up to date, a list of supplementary codes that are designated or approved.
Brought up, read the First and Second time, and added to the Bill.
New Clause 13
Change to conditions for approval or designation
“(1) This section applies if the Secretary of State revises the DVS trust framework so as to change the conditions which must be met for the approval or designation of a supplementary code.
(2) An approved supplementary code which is affected by the change ceases to be an approved supplementary code at the end of the relevant period unless an application for re-approval of the code is made within that period.
(3) Pending determination of an application for re-approval the supplementary code remains an approved supplementary code.
(4) Before the end of the relevant period the Secretary of State must—
(a) review each designated supplementary code which is affected by the change (if any), and
(b) determine whether it meets the conditions as changed.
(5) If, on a review under subsection (4), the Secretary of State determines that a designated supplementary code does not meet the conditions as changed, the code ceases to be a designated supplementary code at the end of the relevant period.
(6) A supplementary code is affected by a change if the change alters, or adds, a condition which is or would be relevant to the supplementary code when deciding whether to approve it under section (Approval of a supplementary code) or designate it under section (Designation of a supplementary code).
(7) In this section “the relevant period” means the period of 21 days beginning with the day on which the DVS trust framework containing the change referred to in subsection (1) comes into force.
(8) Section (Approval of a supplementary code) applies to re-approval of a supplementary code as it applies to approval of such a code.”—(Sir John Whittingdale.)
This amendment provides that when conditions for approval or designation are changed this requires re-approval of an approved supplementary code and, in the case of a designated supplementary code, a re-assessment of whether the code meets the revised conditions.
Brought up, read the First and Second time, and added to the Bill.
New Clause 14
Revision of a recognised supplementary code
“(1) If an approved supplementary code is revised—
(a) the code before and after the revision are treated as the same code for the purposes of this Part, and
(b) the code ceases to be an approved supplementary code unless subsection (2) or (4) applies.
(2) This subsection applies if the supplementary code, in its revised form, has been approved under section (Approval of a supplementary code).
(3) If subsection (2) applies the approved supplementary code, in its revised form, remains an approved supplementary code.
(4) This subsection applies for so long as—
(a) a decision is pending under section (Approval of a supplementary code) on an application for approval of the supplementary code in its revised form, and
(b) the revisions to the code have not taken effect.
(5) If subsection (4) applies the supplementary code, in its unrevised form, remains an approved supplementary code.
(6) The Secretary of State may revise a designated supplementary code only if the Secretary of State is satisfied that the code, in its revised form, meets the conditions set out in the DVS trust framework (so far as relevant).
(7) If a designated supplementary code is revised, the code before and after the revision are treated as the same code for the purposes of this Part.”—(Sir John Whittingdale.)
This amendment sets out the consequences where there are changes to a recognised supplementary code and, in particular, what needs to be done for the code to remain a recognised supplementary code.
Brought up, read the First and Second time, and added to the Bill.
New Clause 15
Applications for approval and re-approval
“(1) The Secretary of State may determine—
(a) the form of an application for approval or re-approval under section (Approval of a supplementary code),
(b) the information to be contained in or provided with the application,
(c) the documents to be provided with the application,
(d) the manner in which the application is to be submitted, and
(e) who may make the application.
(2) A determination may make different provision for different purposes.
(3) The Secretary of State must publish a determination.
(4) The Secretary of State may revise a determination.
(5) If the Secretary of State revises a determination the Secretary of State must publish the determination as revised.”—(Sir John Whittingdale.)
This amendment enables the Secretary of State to determine the process for making a valid application for approval of a supplementary code.
Brought up, read the First and Second time, and added to the Bill.
New Clause 16
Fees for approval, re-approval and continued approval
“(1) The Secretary of State may determine that a person who applies for approval or re-approval of a supplementary code under section (Approval of a supplementary code) must pay a fee to the Secretary of State of an amount specified in the determination.
(2) A determination under subsection (1) may specify an amount which exceeds the administrative costs of determining the application for approval or re-approval.
(3) The Secretary of State may determine that a fee is payable to the Secretary of State, of an amount and at times specified in the determination, in connection with the continued approval of a supplementary code.
(4) A determination under subsection (3)—
(a) may specify an amount which exceeds the administrative costs associated with the continued approval of a supplementary code, and
(b) must specify, or describe, who must pay the fee.
(5) A fee payable under subsection (3) is recoverable summarily (or, in Scotland, recoverable) as a civil debt.
(6) A determination may make different provision for different purposes.
(7) The Secretary of State must publish a determination.
(8) The Secretary of State may revise a determination.
(9) If the Secretary of State revises a determination the Secretary of State must publish the determination as revised.”—(Sir John Whittingdale.)
This amendment enables the Secretary of State to determine that a fee is payable for approval/re-approval/continued approval of a supplementary code and the amount of such a fee.
Brought up, read the First and Second time, and added to the Bill.
New Clause 17
Request for withdrawal of approval
“(1) The Secretary of State must withdraw approval of a supplementary code if—
(a) the Secretary of State receives a notice requesting the withdrawal of approval of the supplementary code, and
(b) the notice complies with any requirements imposed by a determination under subsection (3).
(2) Before the day on which the approval is withdrawn, the Secretary of State must inform the person who gave the notice of when it will be withdrawn.
(3) The Secretary of State may determine—
(a) the form of a notice,
(b) the information to be contained in or provided with the notice,
(c) the documents to be provided with the notice,
(d) the manner in which the notice is to be submitted,
(e) who may give the notice.
(4) A determination may make different provision for different purposes.
(5) The Secretary of State must publish a determination.
(6) The Secretary of State may revise a determination.
(7) If the Secretary of State revises a determination the Secretary of State must publish the determination as revised.”—(Sir John Whittingdale.)
This amendment enables a supplementary code to be “de-approved”, on request.
Brought up, read the First and Second time, and added to the Bill.
New Clause 18
Removal of designation
“(1) The Secretary of State may determine to remove the designation of a supplementary code.
(2) A determination must—
(a) be published, and
(b) specify when the designation is to be removed, which must be a time after the end of the period of 21 days beginning with the day on which the determination is published.”—(Sir John Whittingdale.)
This amendment enables the Secretary of State to determine that a designated supplementary code should cease to be designated.
Brought up, read the First and Second time, and added to the Bill.
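The timing rule in new clause 18(2)(b) turns on the drafting convention that a period "beginning with" a day includes that day, so a determination published on 1 March cannot remove a designation before 22 March. A minimal sketch of that arithmetic follows; the function name is ours, chosen purely for illustration:

```python
from datetime import date, timedelta

def earliest_removal_day(published: date) -> date:
    """Earliest day a designation may be removed under new clause 18(2)(b):
    removal must fall after the end of the period of 21 days beginning with
    (and so including) the day the determination is published."""
    period_end = published + timedelta(days=20)  # day 21 of the period
    return period_end + timedelta(days=1)

# A determination published on 1 March 2024 may specify a removal time
# no earlier than 22 March 2024.
assert earliest_removal_day(date(2024, 3, 1)) == date(2024, 3, 22)
```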
New Clause 19
Registration of additional services
“(1) Subsection (2) applies if—
(a) a person is registered in the DVS register,
(b) the person applies for their entry in the register to be amended to record additional digital verification services that the person provides in accordance with the main code,
(c) the person holds a certificate from an accredited conformity assessment body certifying that the person provides the additional services in accordance with the main code,
(d) the application complies with any requirements imposed by a determination under section 51, and
(e) the person pays any fee required to be paid by a determination under section 52(1).
(2) The Secretary of State must amend the DVS register to record that the person is also registered in respect of the additional services referred to in subsection (1).
(3) For the purposes of subsection (1)(c), a certificate is to be ignored if—
(a) it has expired in accordance with its terms,
(b) it has been withdrawn by the body that issued it, or
(c) it is required to be ignored by reason of provision included in the DVS trust framework under section 49(10).”—(Sir John Whittingdale.)
This amendment provides for a person to apply to add services to their entry in the DVS register and requires the Secretary of State to amend the register to record that a person is registered in respect of the additional services.
Brought up, read the First and Second time, and added to the Bill.
New Clause 20
Supplementary notes
“(1) Subsection (2) applies if—
(a) a person holds a certificate from an accredited conformity assessment body certifying that digital verification services provided by the person are provided in accordance with a recognised supplementary code,
(b) the person applies for a note about one or more of the services to which the certificate relates to be included in the entry relating to that person in the DVS register,
(c) the application complies with any requirements imposed by a determination under section 51, and
(d) the person pays any fee required to be paid by a determination under section 52(1).
(2) The Secretary of State must include a note in the entry relating to the person in the DVS register recording that the person provides, in accordance with the recognised supplementary code referred to in subsection (1), the services in respect of which the person made the application referred to in that subsection.
(3) The Secretary of State may not otherwise include a note described in subsection (2) in the DVS register.
(4) For the purposes of subsection (1)(a), a certificate is to be ignored if—
(a) it has expired in accordance with its terms,
(b) it has been withdrawn by the body that issued it, or
(c) subsection (5) applies.
(5) This subsection applies if—
(a) the recognised supplementary code to which the certificate relates has been revised since the certificate was issued,
(b) the certificate was issued before the revision to the supplementary code took effect, and
(c) the supplementary code (as revised) provides—
(i) that certificates issued before the time the revision takes effect are required to be ignored, or
(ii) that such certificates are to be ignored from a date, or from the end of a period, specified in the code and that date has passed or that period has elapsed.
(6) In this Part, a note included in the DVS register in accordance with subsection (2) is referred to as a supplementary note.”—(Sir John Whittingdale.)
This amendment provides for a person to apply for a note to be included in the DVS register that they provide digital verification services in accordance with a recognised supplementary code.
Brought up, read the First and Second time, and added to the Bill.
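Subsections (4) and (5) of new clause 20 (and their counterparts in new clauses 21, 23 and 24) amount to a small decision procedure for when a certificate is disregarded. A minimal sketch of that procedure, assuming hypothetical field names that are ours rather than the Bill's, and modelling the "ignored from a date" limb of subsection (5)(c)(ii):

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Certificate:
    expired: bool          # has expired in accordance with its terms
    withdrawn: bool        # has been withdrawn by the issuing body
    issued_on: date

@dataclass
class RecognisedCode:
    revised_on: Optional[date]      # None if the code has not been revised
    ignores_pre_revision: bool      # revised code requires old certificates to be ignored
    ignores_from: Optional[date]    # or ignores them from a specified date

def certificate_ignored(cert: Certificate, code: RecognisedCode, today: date) -> bool:
    """Mirrors new clause 20(4)-(5): a certificate is disregarded if it has
    expired, has been withdrawn, or pre-dates a revision of the code and the
    revised code says such certificates are (or have become) ignored."""
    if cert.expired or cert.withdrawn:
        return True
    if code.revised_on is not None and cert.issued_on < code.revised_on:
        if code.ignores_pre_revision:
            return True
        if code.ignores_from is not None and today >= code.ignores_from:
            return True
    return False

# A certificate issued before a revision that disapplies old certificates
# is ignored, even though it has neither expired nor been withdrawn.
cert = Certificate(expired=False, withdrawn=False, issued_on=date(2024, 1, 10))
code = RecognisedCode(revised_on=date(2024, 6, 1), ignores_pre_revision=True,
                      ignores_from=None)
assert certificate_ignored(cert, code, today=date(2024, 7, 1))
```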
New Clause 21
Addition of services to supplementary notes
“(1) Subsection (2) applies if—
(a) a person has a supplementary note included in the DVS register,
(b) the person applies for the note to be amended to record additional digital verification services that the person provides in accordance with a recognised supplementary code,
(c) the person holds a certificate from an accredited conformity assessment body certifying that the person provides the additional services in accordance with the recognised supplementary code referred to in paragraph (b),
(d) the application complies with any requirements imposed by a determination under section 51, and
(e) the person pays any fee required to be paid by a determination under section 52(1).
(2) The Secretary of State must amend the note to record that the person also provides the additional services referred to in subsection (1) in accordance with the recognised supplementary code referred to in that subsection.
(3) For the purposes of subsection (1)(c), a certificate is to be ignored if—
(a) it has expired in accordance with its terms,
(b) it has been withdrawn by the body that issued it, or
(c) subsection (4) applies.
(4) This subsection applies if—
(a) the recognised supplementary code to which the certificate relates has been revised since the certificate was issued,
(b) the certificate was issued before the revision to the supplementary code took effect, and
(c) the supplementary code (as revised) provides—
(i) that certificates issued before the time the revision takes effect are required to be ignored, or
(ii) that such certificates are to be ignored from a date, or from the end of a period, specified in the code and that date has passed or that period has elapsed.”—(Sir John Whittingdale.)
This amendment provides for a person to add services to their supplementary note in the DVS register and requires the Secretary of State to amend the note to record that a person is registered in respect of the additional services.
Brought up, read the First and Second time, and added to the Bill.
New Clause 22
Duty to remove services from the DVS register
“(1) Where a person is registered in the DVS register in respect of digital verification services, subsection (2) applies if the person—
(a) asks for the register to be amended so that the person is no longer registered in respect of one or more of those services,
(b) ceases to provide one or more of those services, or
(c) no longer holds a certificate from an accredited conformity assessment body certifying that all of those services are provided in accordance with the main code.
(2) The Secretary of State must amend the register to record that the person is no longer registered in respect of (as the case may be)—
(a) the service or services mentioned in a request described in subsection (1)(a),
(b) the service or services which the person has ceased to provide, or
(c) the service or services for which there is no longer a certificate as described in subsection (1)(c).
(3) For the purposes of subsection (1)(c), a certificate is to be ignored if—
(a) it has expired in accordance with its terms,
(b) it has been withdrawn by the body that issued it, or
(c) it is required to be ignored by reason of provision included in the DVS trust framework under section 49(10).”—(Sir John Whittingdale.)
This amendment places the Secretary of State under a duty to amend the DVS register, in certain circumstances, to record that a person is no longer registered in respect of certain services.
Brought up, read the First and Second time, and added to the Bill.
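New clause 22 reads as a register-maintenance rule: three triggers (a request, cessation of a service, loss of certification) each oblige the Secretary of State to strike the affected services from the person's entry. A minimal sketch of that bookkeeping; the function and set names are illustrative, not drawn from the Bill, and certification is modelled per service:

```python
def remaining_services(entry: set[str], requested: set[str],
                       ceased: set[str], certified: set[str]) -> set[str]:
    """Sketch of the duty in new clause 22(2): the person's DVS register
    entry keeps only those services that were not asked to be removed,
    are still provided, and remain covered by a valid certificate."""
    return {
        service for service in entry
        if service not in requested   # subsection (1)(a)
        and service not in ceased     # subsection (1)(b)
        and service in certified      # subsection (1)(c)
    }

# A person registered for three services who ceases one and loses
# certification for another remains registered for the third only.
entry = {"identity checks", "age checks", "address checks"}
assert remaining_services(entry, requested=set(), ceased={"age checks"},
                          certified={"identity checks"}) == {"identity checks"}
```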
New Clause 23
Duty to remove supplementary notes from the DVS register
“(1) The Secretary of State must remove a supplementary note included in the entry in the DVS register relating to a person if—
(a) the person asks for the note to be removed,
(b) the person ceases to provide all of the digital verification services to which the note relates,
(c) the person no longer holds a certificate from an accredited conformity assessment body certifying that at least one of those digital verification services is provided in accordance with the supplementary code, or
(d) the person continues to hold a certificate described in paragraph (c) but the supplementary code is not a recognised supplementary code.
(2) For the purposes of subsection (1)(c) and (d), a certificate is to be ignored if—
(a) it has expired in accordance with its terms,
(b) it has been withdrawn by the body that issued it, or
(c) subsection (3) applies.
(3) This subsection applies if—
(a) the supplementary code to which the certificate relates has been revised since the certificate was issued,
(b) the certificate was issued before the revision to the supplementary code took effect, and
(c) the supplementary code (as revised) provides—
(i) that certificates issued before the time the revision takes effect are required to be ignored, or
(ii) that such certificates are to be ignored from a date, or from the end of a period, specified in the code and that date has passed or that period has elapsed.”—(Sir John Whittingdale.)
This amendment sets out the circumstances in which the Secretary of State must remove a supplementary note from the DVS register.
Brought up, read the First and Second time, and added to the Bill.
New Clause 24
Duty to remove services from supplementary notes
“(1) Where a person has a supplementary note included in their entry in the DVS register in respect of digital verification services, subsection (2) applies if the person—
(a) asks for the register to be amended so that the note no longer records one or more of those services,
(b) ceases to provide one or more of the services recorded in the note, or
(c) no longer holds a certificate from an accredited conformity assessment body certifying that all of the services included in the note are provided in accordance with a supplementary code.
(2) The Secretary of State must amend the supplementary note so it no longer records (as the case may be)—
(a) the service or services mentioned in a request described in subsection (1)(a),
(b) the service or services which the person has ceased to provide, or
(c) the service or services for which there is no longer a certificate as described in subsection (1)(c).
(3) For the purposes of subsection (1)(c), a certificate is to be ignored if—
(a) it has expired in accordance with its terms,
(b) it has been withdrawn by the body that issued it, or
(c) subsection (4) applies.
(4) This subsection applies if—
(a) the supplementary code to which the certificate relates has been revised since the certificate was issued,
(b) the certificate was issued before the revision to the supplementary code took effect, and
(c) the supplementary code (as revised) provides—
(i) that certificates issued before the time the revision takes effect are required to be ignored, or
(ii) that such certificates are to be ignored from a date, or from the end of a period, specified in the code and that date has passed or that period has elapsed.”—(Sir John Whittingdale.)
This amendment places the Secretary of State under a duty to amend a supplementary note on the DVS register relating to a person, in certain circumstances, to remove reference to certain services from the note.
Brought up, read the First and Second time, and added to the Bill.
New Clause 25
Index of defined terms for Part 2
“The Table below lists provisions that define or otherwise explain terms defined for the purposes of this Part of this Act.
—(Sir John Whittingdale.)
This amendment provides an index of terms which are defined in Part 2.
Brought up, read the First and Second time, and added to the Bill.
New Clause 26
Powers relating to verification of identity or status
“(1) In section 15 of the Immigration, Asylum and Nationality Act 2006 (penalty for employing a person subject to immigration control), after subsection (7) insert—
“(8) An order under subsection (3) containing provision described in subsection (7)(a), (b) or (c) may, in particular—
(a) specify a document generated by a DVS-registered person or a DVS-registered person of a specified description;
(b) specify a document which was provided to such a person in order to generate such a document;
(c) specify steps involving the use of services provided by such a person.
(9) In subsection (8), “DVS-registered person” means a person who is registered in the DVS register maintained under Part 2 of the Data Protection and Digital Information Act 2024 (“the DVS register”).
(10) An order under subsection (3) which specifies a description of DVS-registered person may do so by, for example, describing a DVS-registered person whose entry in the DVS register includes a note relating to specified services (see section (Supplementary notes) of the Data Protection and Digital Information Act 2024).”
(2) In section 34 of the Immigration Act 2014 (requirements which may be prescribed for the purposes of provisions about occupying premises under a residential tenancy agreement)—
(a) in subsection (1)—
(i) in paragraph (a), after “occupiers” insert “, a DVS-registered person or a DVS-registered person of a prescribed description”,
(ii) in paragraph (b), after “occupiers” insert “, a DVS-registered person or a DVS-registered person of a prescribed description”, and
(iii) in paragraph (c), at the end insert “, including steps involving the use of services provided by a DVS-registered person or a DVS-registered person of a prescribed description”, and
(b) after that subsection insert—
“(1A) An order prescribing requirements for the purposes of this Chapter which contains provision described in subsection (1)(a) or (b) may, in particular—
(a) prescribe a document generated by a DVS-registered person or a DVS-registered person of a prescribed description;
(b) prescribe a document which was provided to such a person in order to generate such a document.
(1B) In subsections (1) and (1A), “DVS-registered person” means a person who is registered in the DVS register maintained under Part 2 of the Data Protection and Digital Information Act 2024 (“the DVS register”).
(1C) An order prescribing requirements for the purposes of this Chapter which prescribes a description of DVS-registered person may do so by, for example, describing a DVS-registered person whose entry in the DVS register includes a note relating to prescribed services (see section (Supplementary notes) of the Data Protection and Digital Information Act 2024).”
(3) In Schedule 6 to the Immigration Act 2016 (illegal working compliance orders etc), after paragraph 5 insert—
“Prescribed checks and documents
5A (1) Regulations under paragraph 5(6)(b) or (c) may, in particular—
(a) prescribe checks carried out using services provided by a DVS-registered person or a DVS-registered person of a prescribed description;
(b) prescribe documents generated by such a person;
(c) prescribe documents which were provided to such a person in order to generate such documents.
(2) In sub-paragraph (1), “DVS-registered person” means a person who is registered in the DVS register maintained under Part 2 of the Data Protection and Digital Information Act 2024 (“the DVS register”).
(3) Regulations under paragraph 5(6)(b) or (c) which prescribe a description of DVS-registered person may do so by, for example, describing a DVS-registered person whose entry in the DVS register includes a note relating to prescribed services (see section (Supplementary notes) of the Data Protection and Digital Information Act 2024).””—(Sir John Whittingdale.)
This amendment contains amendments of powers to make subordinate legislation so they can be exercised so as to make provision by reference to persons registered in the DVS register established under Part 2 of the Bill.
Brought up, read the First and Second time, and added to the Bill.
New Clause 27
Interface bodies
“(1) This section is about the provision that regulations under section 66 or 68 may (among other things) contain about bodies with one or more of the following tasks—
(a) establishing a facility or service used, or capable of being used, for providing, publishing or otherwise processing customer data or business data or for taking action described in section 66(3) (an “interface”);
(b) setting standards (“interface standards”), or making other arrangements (“interface arrangements”), for use by other persons when establishing, maintaining or managing an interface;
(c) maintaining or managing an interface, interface standards or interface arrangements.
(2) Such bodies are referred to in this Part as “interface bodies”.
(3) The regulations may—
(a) require a data holder, an authorised person or a third party recipient to set up an interface body;
(b) make provision about the type of body to be set up.
(4) In relation to an interface body (whether or not it is required to be set up by regulations under section 66 or 68), the regulations may—
(a) make provision about the body’s composition and governance;
(b) make provision requiring a data holder, an authorised person or a third party recipient to provide, or arrange for, assistance for the body;
(c) impose other requirements relating to the body on a person required to set it up or to provide, or arrange for, assistance for the body;
(d) make provision requiring the body to carry on all or part of a task described in subsection (1);
(e) make provision requiring the body to do other things in connection with its interface, interface standards or interface arrangements;
(f) make provision about how the body carries out its functions (such as, for example, provision about the body’s objectives or matters to be taken into account by the body);
(g) confer powers on the body for the purpose of monitoring use of its interface, interface standards or interface arrangements (“monitoring powers”) (and see section 71 for provision about enforcement of requirements imposed in exercise of those powers);
(h) make provision for the body to arrange for its monitoring powers to be exercised by another person;
(i) make provision about the rights of persons affected by the exercise of the body’s functions under the regulations, including (among other things)—
(i) provision about the review of decisions made in exercise of those functions;
(ii) provision about appeals to a court or tribunal;
(j) make provision about complaints, including provision requiring the body to implement procedures for the handling of complaints;
(k) make provision enabling or requiring the body to publish, or provide to a specified person, specified documents or information relating to its interface, interface standards or interface arrangements;
(l) make provision enabling or requiring the body to produce guidance about how it proposes to exercise its functions under the regulations, to publish the guidance and to provide copies to specified persons.
(5) The monitoring powers that may be conferred on an interface body include power to require the provision of documents or information (but such powers are subject to the restrictions in section 72 as well as any restrictions included in the regulations).
(6) Examples of facilities or services referred to in subsection (1) include dashboard services, other electronic communications services and application programming interfaces.
(7) In subsection (4)(b) and (c), the references to assistance include actual or contingent financial assistance (such as, for example, a grant, loan, guarantee or indemnity or buying a company’s share capital).”—(Sir John Whittingdale.)
This new clause enables regulations under Part 3 to make provision about bodies providing facilities or services used for providing, publishing or processing customer data or business data, or setting standards or making other arrangements in connection with such facilities or services.
Brought up, read the First and Second time, and added to the Bill.
New Clause 28
The FCA and financial services interfaces
“(1) The Treasury may by regulations make provision enabling or requiring the Financial Conduct Authority (“the FCA”) to make rules—
(a) requiring financial services providers described in the regulations to use a prescribed interface, or prescribed interface standards or interface arrangements, when providing or receiving customer data or business data which is required to be provided by or to the financial services provider by data regulations;
(b) requiring persons described in the regulations to use a prescribed interface, or prescribed interface standards or interface arrangements, when the person, in the course of a business, receives, from a financial services provider, customer data or business data which is required to be provided to the person by data regulations;
(c) imposing interface-related requirements on a description of person falling within subsection (2),
and such rules are referred to in this Part as “FCA interface rules”.
(2) The following persons fall within this subsection—
(a) an interface body linked to the financial services sector on which requirements are imposed by regulations made in reliance on section (Interface bodies);
(b) a person required by regulations made in reliance on section (Interface bodies) to set up an interface body linked to the financial services sector;
(c) a person who uses an interface, interface standards or interface arrangements linked to the financial services sector or who is required to do so by data regulations or rules made by virtue of regulations under subsection (1)(a) or (b).
(3) For the purposes of this section, requirements are interface-related if they relate to—
(a) the composition, governance or activities of an interface body linked to the financial services sector,
(b) an interface, interface standards or interface arrangements linked to the financial services sector, or
(c) the use of such an interface, such interface standards or such interface arrangements.
(4) For the purposes of this section—
(a) an interface body is linked to the financial services sector to the extent that its interface, interface standards or interface arrangements are linked to the financial services sector;
(b) interfaces, interface standards and interface arrangements are linked to the financial services sector to the extent that they are used, or intended to be used, by financial services providers (whether or not they are used, or intended to be used, by other persons).
(5) The Treasury may by regulations make provision enabling or requiring the FCA to impose requirements on a person to whom FCA interface rules apply (referred to in this Part as “FCA additional requirements”) where the FCA considers it appropriate to impose the requirement—
(a) in response to a failure, or likely failure, by the person to comply with an FCA interface rule or FCA additional requirement, or
(b) in order to advance a purpose which the FCA is required to advance when exercising functions conferred by regulations under this section (see section (The FCA and financial services interfaces: supplementary)(3)(a)).
(6) Regulations under subsection (5) may, for example, provide for the FCA to impose requirements by giving a notice or direction.
(7) The restrictions in section 72 apply in connection with FCA interface rules and FCA additional requirements as they apply in connection with regulations under this Part.
(8) In section 72 as so applied—
(a) the references in subsections (1)(b) and (8) to an enforcer include the FCA, and
(b) the references in subsections (3) and (4) to data regulations include FCA interface rules and FCA additional requirements.
(9) In this section—
“financial services provider” means a person providing financial services;
“prescribed” means prescribed in FCA interface rules.”—(Sir John Whittingdale.)
This new clause and new clause NC29 enable the Treasury, by regulations, to confer powers on the Financial Conduct Authority to impose requirements (by means of rules or otherwise) on interface bodies used by the financial services sector and on persons participating in, or using facilities and services provided by, such bodies.
Brought up, read the First and Second time, and added to the Bill.
New Clause 29
The FCA and financial services interfaces: supplementary
“(1) This section is about provision that regulations under section (The FCA and financial services interfaces) may or must (among other things) contain.
(2) The regulations—
(a) may enable or require the FCA to impose interface-related requirements that could be imposed by regulations made in reliance on section (Interface bodies)(4) or (5), but
(b) may not enable or require the FCA to require a person to set up an interface body.
(3) The regulations must—
(a) require the FCA, so far as is reasonably possible, to exercise functions conferred by the regulations in a manner which is compatible with, or which advances, one or more specified purposes;
(b) specify one or more matters to which the FCA must have regard when exercising functions conferred by the regulations;
(c) if they enable or require the FCA to make rules, make provision about the procedure for making rules, including provision requiring such consultation with persons likely to be affected by the rules or representatives of such persons as the FCA considers appropriate.
(4) The regulations may—
(a) require the FCA to carry out an analysis of the costs and benefits that will arise if proposed rules are made or proposed changes are made to rules and make provision about what the analysis must include;
(b) require the FCA to publish rules or changes to rules and to provide copies to specified persons;
(c) make provision about the effect of rules, including provision about circumstances in which rules are void and circumstances in which a person is not to be taken to have contravened a rule;
(d) make provision enabling or requiring the FCA to modify or waive rules as they apply to a particular case;
(e) make provision about the procedure for imposing FCA additional requirements;
(f) make provision enabling or requiring the FCA to produce guidance about how it proposes to exercise its functions under the regulations, to publish the guidance and to provide copies to specified persons.
(5) The regulations may enable or require the FCA to impose the following types of requirement on a person as FCA additional requirements—
(a) a requirement to review the person’s conduct;
(b) a requirement to take remedial action;
(c) a requirement to make redress for loss or damage suffered by others as a result of the person’s conduct.
(6) The regulations may enable or require the FCA to make rules requiring a person falling within section (The FCA and financial services interfaces)(2)(b) or (c) to pay fees to an interface body for the purpose of meeting expenses incurred, or to be incurred, by such a body in performing duties, or exercising powers, imposed or conferred by regulations under this Part or by rules made by virtue of regulations under section (The FCA and financial services interfaces).
(7) Regulations made in reliance on subsection (6)—
(a) may enable rules to provide for the amount of a fee to be an amount which is intended to exceed the cost of the things in respect of which the fee is charged;
(b) must require rules to provide for the amount of a fee to be—
(i) a prescribed amount or an amount determined in accordance with the rules, or
(ii) an amount not exceeding such an amount;
(c) may enable or require rules to provide for the amount, or maximum amount, of a fee to increase at specified times and by—
(i) a prescribed amount or an amount determined in accordance with the rules, or
(ii) an amount not exceeding such an amount;
(d) if they enable rules to enable a person to determine an amount, must require rules to require the person to publish information about the amount and how it is determined;
(e) may enable or require rules to make provision about—
(i) interest on any unpaid amounts;
(ii) the recovery of unpaid amounts.
(8) In this section—
“interface-related” has the meaning given in section (The FCA and financial services interfaces);
“prescribed” means prescribed in FCA interface rules.
(9) The reference in subsection (5)(c) to making redress includes—
(a) paying interest, and
(b) providing redress in the form of a remedy or relief which could not be awarded in legal proceedings.”—(Sir John Whittingdale.)
See the explanatory statement for new clause NC28.
Brought up, read the First and Second time, and added to the Bill.
New Clause 30
The FCA and financial services interfaces: penalties and levies
“(1) Subsections (2) and (3) are about the provision that regulations made by the Treasury under this Part providing for the FCA to enforce requirements under FCA interface rules may (among other things) contain in relation to financial penalties.
(2) The regulations may require or enable the FCA—
(a) to set the amount or maximum amount of, or of an increase in, a penalty imposed in respect of failure to comply with a requirement imposed by the FCA in exercise of a power conferred by regulations under section (The FCA and financial services interfaces) (whether imposed by means of FCA interface rules or an FCA additional requirement), or
(b) to set the method for determining such an amount.
(3) Regulations made in reliance on subsection (2)—
(a) must require the FCA to produce and publish a statement of its policy with respect to the amount of the penalties;
(b) may require the policy to include specified matters;
(c) may make provision about the procedure for producing the statement;
(d) may require copies of the statement to be provided to specified persons;
(e) may require the FCA to have regard to a statement published in accordance with the regulations.
(4) The Treasury may by regulations—
(a) impose, or provide for the FCA to impose, a levy on data holders, authorised persons or third party recipients for the purpose of meeting all or part of the expenses incurred, or to be incurred, during a period by the FCA, or by a person acting on the FCA’s behalf, in performing duties, or exercising powers, imposed or conferred on the FCA by regulations under section (The FCA and financial services interfaces), and
(b) make provision about how funds raised by means of the levy must or may be used.
(5) Regulations under subsection (4) may only provide for a levy in respect of expenses of the FCA to be imposed on persons that appear to the Treasury to be capable of being directly affected by the exercise of some or all of the functions conferred on the FCA by regulations under section (The FCA and financial services interfaces).
(6) Section 75(3) and (4) apply in relation to regulations under subsection (4) of this section as they apply in relation to regulations under section 75(1).”—(Sir John Whittingdale.)
This new clause enables the Treasury, by regulations, to confer power on the Financial Conduct Authority to set the amount of certain penalties. It also enables the Treasury to impose a levy in respect of expenses incurred by that Authority.
Brought up, read the First and Second time, and added to the Bill.
New Clause 31
Liability in damages
“(1) The Secretary of State or the Treasury may by regulations provide that a person listed in subsection (2) is not liable in damages for anything done or omitted to be done in the exercise of functions conferred by regulations under this Part.
(2) Those persons are—
(a) a public authority;
(b) a member, officer or member of staff of a public authority;
(c) a person who could be held vicariously liable for things done or omitted by a public authority.
(3) Regulations under this section may not—
(a) make provision removing liability for an act or omission which is shown to have been in bad faith, or
(b) make provision so as to prevent an award of damages made in respect of an act or omission on the ground that the act or omission was unlawful as a result of section 6(1) of the Human Rights Act 1998.”—(Sir John Whittingdale.)
This new clause enables regulations under Part 3 to provide that certain persons are not liable in damages when exercising functions under such regulations.
Brought up, read the First and Second time, and added to the Bill.
New Clause 32
Other data provision
“(1) This section is about cases in which subordinate legislation other than regulations under this Part contains provision described in section 66(1) to (3) or 68(1) to (2A) (“other data provision”).
(2) The regulation-making powers under this Part may be exercised so as to make, in connection with the other data provision, any provision that they could be exercised to make as part of, or in connection with, provision made under section 66(1) to (3) or 68(1) to (2A) that is equivalent to the other data provision.
(3) In this Part, references to “data regulations” include regulations made in reliance on subsection (2) to the extent that they make provision described in sections 66 to 70 or (Interface bodies).
(4) In this section, “subordinate legislation” has the same meaning as in the Interpretation Act 1978 (see section 21 of that Act).”—(Sir John Whittingdale.)
This new clause enables the regulation-making powers under Part 3 to be used to supplement existing subordinate legislation which requires customer data or business data to be provided to customers and others.
Brought up, read the First and Second time, and added to the Bill.
New Clause 33
Duty to notify the Commissioner of personal data breach: time periods
“(1) In regulation 5A of the PEC Regulations (personal data breach)—
(a) in paragraph (2), after “delay” insert “and, where feasible, not later than 72 hours after having become aware of it”, and
(b) after paragraph (3) insert—
“(3A) Where notification under paragraph (2) is not made within 72 hours, it must be accompanied by reasons for the delay.”
(2) In Article 2 of Commission Regulation (EU) No 611/2013 of 24 June 2013 on the measures applicable to the notification of personal data breaches under Directive 2002/58/EC of the European Parliament and of the Council on privacy and electronic communications (notification to the Information Commissioner)—
(a) in paragraph 2—
(i) in the first subparagraph, for the words from “no” to “feasible” substitute “without undue delay and, where feasible, not later than 72 hours after having become aware of it”, and
(ii) in the second subparagraph, after “shall” insert “, subject to paragraph 3,”, and
(b) for paragraph 3 substitute—
“3. To the extent that the information set out in Annex 1 is not available to be included in the notification, it may be provided in phases without undue further delay.””—(Sir John Whittingdale.)
This adjusts the period within which the Information Commissioner must be notified of a personal data breach. It also inserts a duty (into the PEC Regulations) to give reasons for not notifying within 72 hours and adjusts the duty (in Commission Regulation (EU) No 611/2013) to provide accompanying information.
Brought up, read the First and Second time, and added to the Bill.
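The practical effect of new clause 33 is a deadline calculation: notification without undue delay and, where feasible, within 72 hours of becoming aware of the breach, with reasons required where the window is missed. A minimal sketch of that check, using names of our own:

```python
from datetime import datetime, timedelta

WINDOW = timedelta(hours=72)

def notification_status(aware_at: datetime, notified_at: datetime) -> dict:
    """Sketch of the timing rule inserted into regulation 5A of the PEC
    Regulations: a notification made more than 72 hours after awareness
    must be accompanied by reasons for the delay."""
    deadline = aware_at + WINDOW
    return {
        "deadline": deadline,
        "within_72_hours": notified_at <= deadline,
        "reasons_for_delay_required": notified_at > deadline,
    }

# Awareness at 09:00 on 1 May gives a deadline of 09:00 on 4 May; a
# notification at 10:00 on 4 May must explain the delay.
status = notification_status(datetime(2024, 5, 1, 9, 0),
                             datetime(2024, 5, 4, 10, 0))
assert status["reasons_for_delay_required"]
```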
New Clause 34
Power to require information for social security purposes
“In Schedule (Power to require information for social security purposes)—
(a) Part 1 amends the Social Security Administration Act 1992 to make provision about a power for the Secretary of State to obtain information for social security purposes;
(b) Part 2 amends the Social Security Administration (Northern Ireland) Act 1992 to make provision about a power for the Department for Communities to obtain information for such purposes;
(c) Part 3 makes related amendments of the Proceeds of Crime Act 2002.”—(Sir John Whittingdale.)
This new clause introduces a new Schedule NS1 which amends social security legislation to make provision about a new power for the Secretary of State or, in Northern Ireland, the Department for Communities, to obtain information for social security purposes.
Brought up, read the First and Second time, and added to the Bill.
New Clause 35
Retention of information by providers of internet services in connection with death of child
“(1) The Online Safety Act 2023 is amended as follows.
(2) In section 100 (power to require information)—
(a) omit subsection (7);
(b) after subsection (8) insert—
“(8A) The power to give a notice conferred by subsection (1) does not include power to require processing of personal data that would contravene the data protection legislation (but in determining whether processing of personal data would do so, the duty imposed by the notice is to be taken into account).”
(3) In section 101 (information in connection with investigation into death of child)—
(a) before subsection (1) insert—
“(A1) Subsection (C1) applies if a senior coroner (in England and Wales), a procurator fiscal (in Scotland) or a coroner (in Northern Ireland) (“the investigating authority”)—
(a) notifies OFCOM that—
(i) they are conducting an investigation, or are due to conduct an investigation, in connection with the death of a child, and
(ii) they suspect that the child may have taken their own life, and
(b) provides OFCOM with the details in subsection (B1).
(B1) The details are—
(a) the name of the child who has died,
(b) the child’s date of birth,
(c) any email addresses used by the child (so far as the investigating authority knows), and
(d) if any regulated service has been brought to the attention of the investigating authority as being of interest in connection with the child’s death, the name of the service.
(C1) Where this subsection applies, OFCOM—
(a) must give a notice to the provider of a service within subsection (E1) requiring the provider to ensure the retention of information relating to the use of the service by the child who has died, and
(b) may give a notice to any other relevant person requiring the person to ensure the retention of information relating to the use of a service within subsection (E1) by that child.
(D1) The references in subsection (C1) to ensuring the retention of information relating to the child’s use of a service include taking all reasonable steps, without delay, to prevent the deletion of such information by the routine operation of systems or processes.
(E1) A service is within this subsection if it is—
(a) a regulated service of a kind described in regulations made by the Secretary of State, or
(b) a regulated service notified to OFCOM by the investigating authority as described in subsection (B1)(d).
(F1) A notice under subsection (C1) may require information described in that subsection to be retained only if it is information—
(a) of a kind which OFCOM have power to require under a notice under subsection (1) (see, in particular, subsection (2)(a) to (d)), or
(b) which a person might need to retain to enable the person to provide information in response to a notice under subsection (1) (if such a notice were given).
(G1) OFCOM must share with the investigating authority any information they receive in response to requirements mentioned in section 102(5A)(d) that are included in a notice under subsection (C1).”;
(b) in subsection (3), for “power conferred by subsection (1) includes” substitute “powers conferred by this section include”;
(c) after subsection (5) insert—
“(5A) The powers to give a notice conferred by this section do not include power to require processing of personal data that would contravene the data protection legislation (but in determining whether processing of personal data would do so, the duty imposed by the notice is to be taken into account).”
(4) In section 102 (information notices)—
(a) in subsection (1), for “101(1)” substitute “101(C1) or (1)”;
(b) in subsection (3)—
(i) after “information notice” insert “under section 100(1) or 101(1)”,
(ii) omit “and” at the end of paragraph (c), and
(iii) after paragraph (c) insert—
“(ca) specify when the information must be provided (which may be on or by a specified date, within a specified period, or at specified intervals), and”;
(c) omit subsection (4);
(d) after subsection (5) insert—
“(5A) An information notice under section 101(C1) must—
(a) specify or describe the information to be retained,
(b) specify why OFCOM require the information to be retained,
(c) require the information to be retained for the period of one year beginning with the date of the notice,
(d) require the person to whom the notice is given—
(i) if the child to whom the notice relates used the service in question, to notify OFCOM by a specified date of steps taken to ensure the retention of information;
(ii) if the child did not use the service, or the person does not hold any information of the kind required, to notify OFCOM of that fact by a specified date, and
(e) contain information about the consequences of not complying with the notice.
(5B) If OFCOM give an information notice to a person under section 101(C1), they may, in response to information received from the investigating authority, extend the period for which the person is required to retain information by a maximum period of six months.
(5C) The power conferred by subsection (5B) is exercisable—
(a) by giving the person a notice varying the notice under section 101(C1) and stating the further period for which information must be retained and the reason for the extension;
(b) any number of times.”;
(e) after subsection (9) insert—
“(9A) OFCOM must cancel an information notice under section 101(C1) by notice to the person to whom it was given if advised by the investigating authority that the information in question no longer needs to be retained.”;
(f) in subsection (10), after the definition of “information” insert—
““the investigating authority” has the same meaning as in section 101;”.
(5) In section 109 (offences in connection with information notices)—
(a) in subsection (2)(b), for “all reasonable steps” substitute “all of the steps that it was reasonable, and reasonably practicable, to take”;
(b) after subsection (6) insert—
“(6A) A person who is given an information notice under section 101(C1) commits an offence if—
(a) the person deletes or alters, or causes or permits the deletion or alteration of, any information required by the notice to be retained, and
(b) the person’s intention was to prevent the information being available, or (as the case may be) to prevent it being available in unaltered form, for the purposes of any official investigation into the death of the child to whom the notice relates.
(6B) For the purposes of subsection (6A) information has been deleted if it is irrecoverable (however that occurred).”
(6) In section 110 (senior managers’ liability: information offences)—
(a) after subsection (6) insert—
“(6A) An individual named as a senior manager of an entity commits an offence if—
(a) the entity commits an offence under section 109(6A) (deletion etc of information), and
(b) the individual has failed to take all reasonable steps to prevent that offence being committed.”;
(b) in subsection (7), for “or (6)” substitute “, (6) or (6A)”.
(7) In section 113 (penalties for information offences), in subsection (2)—
(a) for “(4) or (5)” substitute “(4), (5) or (6A)”;
(b) for “(5) or (6)” substitute “(5), (6) or (6A)”.
(8) In section 114 (co-operation and disclosure of information: overseas regulators), in subsection (7), omit the definition of “the data protection legislation”.
(9) In section 225 (Parliamentary procedure for regulations), in subsection (10), after paragraph (c) insert—
“(ca) regulations under section 101(E1)(a),”.
(10) In section 236(1) (interpretation)—
(a) after the definition of “country” insert—
““the data protection legislation” has the same meaning as in the Data Protection Act 2018 (see section 3(9) of that Act);”;
(b) in the definition of “information notice”, for “101(1)” substitute “101(C1) or (1)”.
(11) In section 237 (index of defined terms), after the entry for “CSEA content” insert—
—(Sir John Whittingdale.)
This new clause amends the Online Safety Act 2023 to enable OFCOM to give internet service providers a notice requiring them to retain information in connection with an investigation by a coroner (or, in Scotland, procurator fiscal) into the death of a child suspected to have taken their own life. The new clause also creates related offences.
Brought up, read the First and Second time, and added to the Bill.
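On timing, a notice under new section 101(C1) requires retention for one year beginning with the date of the notice, and new section 102(5B) and (5C) let OFCOM extend that period by up to six months at a time, any number of times. A minimal sketch of the arithmetic follows; the names are ours, and for simplicity it ends the period on the anniversary, whereas a period "beginning with" the notice date strictly ends the day before:

```python
import calendar
from datetime import date

def add_months(d: date, months: int) -> date:
    """Add whole calendar months, clamping to the last day of the month."""
    month_index = d.month - 1 + months
    year, month = d.year + month_index // 12, month_index % 12 + 1
    day = min(d.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)

def retention_until(notice_date: date, extensions: list[int]) -> date:
    """Sketch of new sections 102(5A)(c) and (5B): retention for one year
    beginning with the date of the notice, extendable any number of times
    by at most six months per extension."""
    if any(not 0 < m <= 6 for m in extensions):
        raise ValueError("each extension is capped at six months")
    end = add_months(notice_date, 12)   # one year from the notice
    for m in extensions:
        end = add_months(end, m)
    return end

# A notice dated 1 January 2024, extended twice by six months, requires
# retention until 1 January 2026 on this reading.
assert retention_until(date(2024, 1, 1), [6, 6]) == date(2026, 1, 1)
```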
New Clause 36
Retention of biometric data and recordable offences
“(1) Part 1 of the Counter-Terrorism Act 2008 (powers to gather and share information) is amended in accordance with subsections (2) to (10).
(2) In section 18A(3) (retention of material: general), after “recordable offence” insert “or recordable-equivalent offence”.
(3) Section 18E (supplementary provision) is amended in accordance with subsections (4) to (10).
(4) In subsection (1), after the definition of “recordable offence” insert—
““recordable-equivalent offence” means an offence under the law of a country or territory outside England and Wales and Northern Ireland where the act constituting the offence would constitute a recordable offence if done in England and Wales or Northern Ireland (whether or not the act constituted such an offence when the person was convicted);”.
(5) In subsection (3), in the words before paragraph (a), after “offence” insert “in England and Wales or Northern Ireland”.
(6) After subsection (5) insert—
“(5A) For the purposes of section 18A, a person is to be treated as having been convicted of an offence in a country or territory outside England and Wales and Northern Ireland if, in respect of such an offence, a court exercising jurisdiction under the law of that country or territory has made a finding equivalent to—
(a) a finding that the person is not guilty by reason of insanity, or
(b) a finding that the person is under a disability and did the act charged against the person in respect of the offence.”
(7) In subsection (6)(a)—
(a) after “convicted” insert “—
(i)”, and
(b) after “offence,” insert “or
(ii) in a country or territory outside England and Wales and Northern Ireland, of a recordable-equivalent offence,”.
(8) In subsection (6)(b)—
(a) omit “of a recordable offence”, and
(b) for “a recordable offence, other than a qualifying offence” substitute “an offence, other than a qualifying offence or qualifying-equivalent offence”.
(9) In subsection (7), for “subsection (6)” substitute “this section”.
(10) After subsection (7) insert—
“(7A) In subsection (6), “qualifying-equivalent offence” means an offence under the law of a country or territory outside England and Wales and Northern Ireland where the act constituting the offence would constitute a qualifying offence if done in England and Wales or Northern Ireland (whether or not the act constituted such an offence when the person was convicted).”
(11) The amendments made by this section apply only in connection with the retention of section 18 material that is or was obtained or acquired by a law enforcement authority—
(a) on or after the commencement day, or
(b) in the period of 3 years ending immediately before the commencement day.
(12) Subsection (13) of this section applies where—
(a) at the beginning of the commencement day, a law enforcement authority has section 18 material which it obtained or acquired in the period of 3 years ending immediately before the commencement day,
(b) at a time before the commencement day (a “pre-commencement time”), the law enforcement authority was required by section 18(4) of the Counter-Terrorism Act 2008 to destroy the material, and
(c) at the pre-commencement time, the law enforcement authority could have retained the material under section 18A of the Counter-Terrorism Act 2008, as it has effect taking account of the amendments made by subsections (2) to (10) of this section, if those amendments had been in force.
(13) Where this subsection applies—
(a) the law enforcement authority is to be treated as not having been required to destroy the material at the pre-commencement time, but
(b) the material may not be used in evidence against the person to whom the material relates—
(i) in criminal proceedings in England and Wales, Northern Ireland or Scotland in relation to an offence where those proceedings, or other criminal proceedings in relation to the person and the offence, were instituted before the commencement day, or
(ii) in criminal proceedings in any other country or territory.
(14) In this section—
“the commencement day” means the day on which this Act is passed;
“law enforcement authority” has the meaning given by section 18E(1) of the Counter-Terrorism Act 2008;
“section 18 material” has the meaning given by section 18(2) of that Act.
(15) For the purposes of this section, proceedings in relation to an offence are instituted—
(a) in England and Wales, when they are instituted for the purposes of Part 1 of the Prosecution of Offences Act 1985 (see section 15(2) of that Act);
(b) in Northern Ireland, when they are instituted for the purposes of Part 2 of the Justice (Northern Ireland) Act 2002 (see section 44(1) and (2) of that Act);
(c) in Scotland, when they are instituted for the purposes of Part 3 of the Proceeds of Crime Act 2002 (see section 151(1) and (2) of that Act).”—(Sir John Whittingdale.)
This new clause enables a law enforcement authority to retain fingerprints and DNA profiles where a person has been convicted of an offence equivalent to a recordable offence in a jurisdiction outside England and Wales and Northern Ireland.
Brought up, read the First and Second time, and added to the Bill.
New Clause 37
Retention of pseudonymised biometric data
“(1) Part 1 of the Counter-Terrorism Act 2008 (powers to gather and share information) is amended in accordance with subsections (2) to (6).
(2) Section 18A (retention of material: general) is amended in accordance with subsections (3) to (5).
(3) In subsection (1), for “subsection (5)” substitute “subsections (4) to (9)”.
(4) In subsection (4)(a), after “relates” insert “(a “pseudonymised form”)”.
(5) After subsection (6) insert—
“(7) Section 18 material which is not a DNA sample may be retained indefinitely by a law enforcement authority if—
(a) the authority obtains or acquires the material directly or indirectly from an overseas law enforcement authority,
(b) the authority obtains or acquires the material in a form which includes information which identifies the person to whom the material relates,
(c) as soon as reasonably practicable after obtaining or acquiring the material, the authority takes the steps necessary for it to hold the material in a pseudonymised form, and
(d) having taken those steps, the law enforcement authority continues to hold the material in a pseudonymised form.
(8) In a case where section 18 material is being retained by a law enforcement authority under subsection (7), if—
(a) the law enforcement authority ceases to hold the material in a pseudonymised form, and
(b) the material relates to a person who has no previous convictions or only one exempt conviction,
the material may be retained by the law enforcement authority until the end of the retention period specified in subsection (9).
(9) The retention period is the period of 3 years beginning with the date on which the law enforcement authority first ceases to hold the material in a pseudonymised form.”
(6) In section 18E(1) (supplementary provision)—
(a) in the definition of “law enforcement authority”, for paragraph (d) substitute—
“(d) an overseas law enforcement authority;”, and
(b) after that definition insert—
““overseas law enforcement authority” means a person formed or existing under the law of a country or territory outside the United Kingdom so far as exercising functions which—
(a) correspond to those of a police force, or
(b) otherwise involve the investigation or prosecution of offences;”.
(7) The amendments made by this section apply only in connection with the retention of section 18 material that is or was obtained or acquired by a law enforcement authority—
(a) on or after the commencement day, or
(b) in the period of 3 years ending immediately before the commencement day.
(8) Subsections (9) to (12) of this section apply where, at the beginning of the commencement day, a law enforcement authority has section 18 material which it obtained or acquired in the period of 3 years ending immediately before the commencement day.
(9) Where the law enforcement authority holds the material in a pseudonymised form at the beginning of the commencement day, the authority is to be treated for the purposes of section 18A(7)(c) and (d) of the Counter-Terrorism Act 2008 as having—
(a) taken the steps necessary for it to hold the material in a pseudonymised form as soon as reasonably practicable after obtaining or acquiring the material, and
(b) continued to hold the material in a pseudonymised form until the commencement day.
(10) Where the law enforcement authority does not hold the material in a pseudonymised form at the beginning of the commencement day, the authority is to be treated for the purposes of section 18A(7)(c) of the Counter-Terrorism Act 2008 as taking the steps necessary for it to hold the material in a pseudonymised form as soon as reasonably practicable after obtaining or acquiring the material if it takes those steps on, or as soon as reasonably practicable after, the commencement day.
(11) Subsection (12) of this section applies where, at a time before the commencement day (a “pre-commencement time”), the law enforcement authority was required by section 18(4) of the Counter-Terrorism Act 2008 to destroy the material but—
(a) at the pre-commencement time, the law enforcement authority could have retained the material under section 18A(7) to (9) of the Counter-Terrorism Act 2008 (as inserted by this section) if those provisions had been in force, or
(b) on or after the commencement day, the law enforcement authority may retain the material under those provisions by virtue of subsection (9) or (10) of this section.
(12) Where this subsection applies—
(a) the law enforcement authority is to be treated as not having been required to destroy the material at the pre-commencement time, but
(b) the material may not be used in evidence against the person to whom the material relates—
(i) in criminal proceedings in England and Wales, Northern Ireland or Scotland in relation to an offence where those proceedings, or other criminal proceedings in relation to the person and the offence, were instituted before the commencement day, or
(ii) in criminal proceedings in any other country or territory.
(13) In this section—
“the commencement day”, “law enforcement authority” and “section 18 material” have the meaning given in section (Retention of biometric data and recordable offences)(14);
“instituted”, in relation to proceedings, has the meaning given in section (Retention of biometric data and recordable offences)(15);
“in a pseudonymised form” has the meaning given by section 18A(4) and (10) of the Counter-Terrorism Act 2008 (as amended or inserted by this section).”—(Sir John Whittingdale.)
This new clause enables a law enforcement authority to retain fingerprints and DNA profiles where, as soon as reasonably practicable after acquiring or obtaining them, the authority takes the steps necessary for it to hold the material in a form which does not include information which identifies the person to whom the material relates.
Brought up, read the First and Second time, and added to the Bill.
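The retention rule in new section 18A(7) to (9) behaves like a small state machine: indefinite retention while the material is held in pseudonymised form, and a three-year clock that starts if the authority ceases to hold it in that form (where the person has no previous convictions or only one exempt conviction). A minimal sketch, with names of our own and a 365-day year as an illustrative approximation:

```python
from datetime import date, timedelta
from typing import Optional

THREE_YEARS = timedelta(days=3 * 365)  # illustrative; the Act counts calendar years

def retention_deadline(ceased_pseudonymised_on: Optional[date]) -> Optional[date]:
    """Sketch of new section 18A(7)-(9): None means indefinite retention
    (material still held in pseudonymised form); otherwise the end of the
    3-year period beginning with the day the authority first ceased to
    hold the material in pseudonymised form."""
    if ceased_pseudonymised_on is None:
        return None
    return ceased_pseudonymised_on + THREE_YEARS

def may_retain(today: date, ceased_pseudonymised_on: Optional[date]) -> bool:
    """Whether retention is still permitted under subsections (7)-(9),
    assuming the person has no previous convictions or only one exempt
    conviction (subsection (8)(b))."""
    deadline = retention_deadline(ceased_pseudonymised_on)
    return deadline is None or today <= deadline

# Material still held in pseudonymised form may be kept indefinitely;
# once that ceases, the three-year clock governs.
assert may_retain(date(2027, 1, 1), None)
assert not may_retain(date(2027, 1, 2), date(2024, 1, 1))
```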
New Clause 38
Retention of biometric data from INTERPOL
“(1) Part 1 of the Counter-Terrorism Act 2008 (powers to gather and share information) is amended in accordance with subsections (2) to (4).
(2) In section 18(4) (destruction of national security material not subject to existing statutory restrictions), after “18A” insert “, 18AA”.
(3) After section 18A insert—
“18AA Retention of material from INTERPOL
(1) This section applies to section 18 material which is not a DNA sample where the law enforcement authority obtained or acquired the material as part of a request for assistance, or a notification of a threat, sent to the United Kingdom via INTERPOL’s systems.
(2) The law enforcement authority may retain the material until the National Central Bureau informs the authority that the request or notification has been cancelled or withdrawn.
(3) If the law enforcement authority is the National Central Bureau, it may retain the material until it becomes aware that the request or notification has been cancelled or withdrawn.
(4) In this section—
“INTERPOL” means the organisation called the International Criminal Police Organization - INTERPOL;
“the National Central Bureau” means the body appointed for the time being in accordance with INTERPOL’s constitution to serve as the United Kingdom’s National Central Bureau.
(5) The reference in subsection (1) to material obtained or acquired as part of a request or notification includes material obtained or acquired as part of a communication, sent to the United Kingdom via INTERPOL’s systems, correcting, updating or otherwise supplementing the request or notification.
18AB Retention of material from INTERPOL: supplementary
(1) The Secretary of State may by regulations amend section 18AA to make such changes as the Secretary of State considers appropriate in consequence of—
(a) changes to the name of the organisation which, when section 18AA was enacted, was called the International Criminal Police Organization - INTERPOL (“the organisation”),
(b) changes to arrangements made by the organisation which involve fingerprints or DNA profiles being provided to members of the organisation (whether changes to existing arrangements or changes putting in place new arrangements), or
(c) changes to the organisation’s arrangements for liaison between the organisation and its members or between its members.
(2) Regulations under this section are subject to affirmative resolution procedure.”
(4) In section 18BA(5)(a) (retention of further fingerprints), after “18A” insert “, 18AA”.
(5) Section 18AA of the Counter-Terrorism Act 2008 applies in relation to section 18 material obtained or acquired by a law enforcement authority before the commencement day (as well as material obtained or acquired on or after that day), except where the law enforcement authority was informed, or became aware, as described in subsection (2) or (3) of that section before the commencement day.
(6) Subsection (7) of this section applies where—
(a) at the beginning of the commencement day, a law enforcement authority has section 18 material,
(b) at a time before the commencement day (a “pre-commencement time”), the law enforcement authority was required by section 18(4) of the Counter-Terrorism Act 2008 to destroy the material, but
(c) at the pre-commencement time, the law enforcement authority could have retained the material under section 18AA of that Act (as inserted by this section) if it had been in force.
(7) Where this subsection applies—
(a) the law enforcement authority is to be treated as not having been required to destroy the material at the pre-commencement time, but
(b) the material may not be used in evidence against the person to whom the material relates—
(i) in criminal proceedings in England and Wales, Northern Ireland or Scotland in relation to an offence where those proceedings, or other criminal proceedings in relation to the person and the offence, were instituted before the commencement day, or
(ii) in criminal proceedings in any other country or territory.
(8) In this section—
“the commencement day”, “law enforcement authority” and “section 18 material” have the meaning given in section (Retention of biometric data and recordable offences)(14);
“instituted”, in relation to proceedings, has the meaning given in section (Retention of biometric data and recordable offences)(15).”—(Sir John Whittingdale.)
This new clause enables fingerprints and DNA profiles obtained as part of a request for assistance, or notification of a threat, from INTERPOL and held for national security purposes by a law enforcement authority to be retained until the authority is informed that the request or notification has been withdrawn or cancelled.
Brought up, read the First and Second time, and added to the Bill.
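The retention rule in new section 18AA is effectively event-driven: material tied to an INTERPOL request or notification may be held until the National Central Bureau reports that the request has been cancelled or withdrawn. A minimal sketch of that rule, with invented identifiers (nothing here reflects INTERPOL’s actual message formats or systems):

```python
from dataclasses import dataclass, field

@dataclass
class InterpolHolding:
    """Tracks which requests or notifications are still live (illustrative only)."""
    cancelled: set[str] = field(default_factory=set)

    def notify_cancelled(self, request_id: str) -> None:
        # The National Central Bureau informs the authority of a cancellation.
        self.cancelled.add(request_id)

    def may_retain(self, request_id: str) -> bool:
        # Section 18AA, roughly: retain material linked to a request or
        # notification until it has been cancelled or withdrawn.
        return request_id not in self.cancelled

holding = InterpolHolding()
print(holding.may_retain("REQ-001"))   # True: request still live
holding.notify_cancelled("REQ-001")
print(holding.may_retain("REQ-001"))   # False: the destruction rules now apply
```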
New Clause 39
National Underground Asset Register
“(1) After section 106 of the New Roads and Street Works Act 1991 insert—
“Part 3A
National Underground Asset Register: England and Wales
The register
106A National Underground Asset Register
(1) The Secretary of State must keep a register of information relating to apparatus in streets in England and Wales.
(2) The register is to be known as the National Underground Asset Register (and is referred to in this Act as “NUAR”).
(3) NUAR must be kept in such form and manner as may be prescribed.
(4) The Secretary of State must make arrangements so as to enable any person who is required, by a provision of Part 3, to enter information into NUAR to have access to NUAR for that purpose.
(5) Regulations under subsection (3) are subject to the negative procedure.
106B Access to information kept in NUAR
(1) The Secretary of State may by regulations make provision in connection with making information kept in NUAR available—
(a) under a licence, or
(b) without a licence.
(2) The regulations may (among other things)—
(a) make provision about which information, or descriptions of information, may be made available;
(b) make provision about the descriptions of person to whom information may be made available;
(c) make provision for information to be made available subject to exceptions;
(d) make provision requiring or authorising the Secretary of State to adapt, modify or obscure information before making it available;
(e) make provision authorising all information kept in NUAR to be made available to prescribed descriptions of person under prescribed conditions;
(f) make provision about the purposes for which information may be made available;
(g) make provision about the form and manner in which information may be made available.
(3) The regulations may make provision about licences under which information kept in NUAR is made available, including—
(a) provision about the form of a licence;
(b) provision about the terms and conditions of a licence;
(c) provision for information to be made available under a licence for free or for a fee;
(d) provision about the amount of the fees, including provision for the amount of a fee to be an amount which is intended to exceed the cost of the things in respect of which the fee is charged;
(e) provision about how funds raised by means of fees must or may be used, including provision for funds to be paid to persons who are required, by a provision of Part 3, to enter information into NUAR.
(4) Except as otherwise prescribed and subject to section 106G, processing of information by the Secretary of State in exercise of functions conferred by or under section 106A or this section does not breach—
(a) any obligation of confidence owed by the Secretary of State, or
(b) any other restriction on the processing of information (however imposed).
(5) Regulations under this section are subject to the affirmative procedure.
Requirements for undertakers to pay fees and provide information
106C Fees payable by undertakers in relation to NUAR
(1) The Secretary of State may by regulations make provision requiring undertakers having apparatus in a street to pay fees to the Secretary of State for or in connection with the exercise by the Secretary of State of any function conferred by or under this Part.
(2) The regulations may—
(a) specify the amounts of the fees, or the maximum amounts of the fees, or
(b) provide for the amounts of the fees, or the maximum amounts of the fees, to be determined in accordance with the regulations.
(3) In making the regulations the Secretary of State must seek to secure that, so far as possible and taking one year with another, the income from fees matches the expenses incurred by the Secretary of State in, or in connection with, exercising functions conferred by or under this Part (including expenses not directly connected with the keeping of NUAR).
(4) Except where the regulations specify the amounts of the fees—
(a) the amounts of the fees must be specified by the Secretary of State in a statement, and
(b) the Secretary of State must—
(i) publish the statement, and
(ii) lay it before Parliament.
(5) Regulations under subsection (1) may make provision about—
(a) when a fee is to be paid;
(b) the manner in which a fee is to be paid;
(c) the payment of discounted fees;
(d) exceptions to requirements to pay fees;
(e) the refund of all or part of a fee which has been paid.
(6) Before making regulations under subsection (1) the Secretary of State must consult—
(a) such representatives of persons likely to be affected by the regulations as the Secretary of State considers appropriate, and
(b) such other persons as the Secretary of State considers appropriate.
(7) Subject to the following provisions of this section, regulations under subsection (1) are subject to the affirmative procedure.
(8) Regulations under subsection (1) that only make provision of a kind mentioned in subsection (2) are subject to the negative procedure.
(9) But the first regulations under subsection (1) that make provision of a kind mentioned in subsection (2) are subject to the affirmative procedure.
106D Providing information for purposes of regulations under section 106C
(1) The Secretary of State may by regulations make provision requiring undertakers having apparatus in a street to provide information to the Secretary of State for either or both of the following purposes—
(a) assisting the Secretary of State in determining the provision that it is appropriate for regulations under section 106C(1) or a statement under section 106C(4) to make;
(b) assisting the Secretary of State in determining whether it is appropriate to make changes to such provision.
(2) The Secretary of State may by regulations make provision requiring undertakers having apparatus in a street to provide information to the Secretary of State for either or both of the following purposes—
(a) ascertaining whether a fee is payable by a person under regulations under section 106C(1);
(b) working out the amount of a fee payable by a person.
(3) Regulations under subsection (1) or (2) may require an undertaker to notify the Secretary of State of any changes to information previously provided under the regulations.
(4) Regulations under subsection (1) or (2) may make provision about—
(a) when information is to be provided (which may be at prescribed intervals);
(b) the form and manner in which information is to be provided;
(c) exceptions to requirements to provide information.
(5) Regulations under subsection (1) or (2) are subject to the negative procedure.
Monetary penalties
106E Monetary penalties
Schedule 5A makes provision about the imposition of penalties in connection with requirements imposed by regulations under sections 106C(1) and 106D(1) and (2).
Exercise of functions by third party
106F Arrangements for third party to exercise functions
(1) The Secretary of State may make arrangements for a prescribed person to exercise a relevant function of the Secretary of State.
(2) More than one person may be prescribed.
(3) Arrangements under this section may—
(a) provide for the Secretary of State to make payments to the person, and
(b) make provision as to the circumstances in which any such payments are to be repaid to the Secretary of State.
(4) In the case of the exercise of a function by a person authorised by arrangements under this section to exercise that function, any reference in this Part or in regulations under this Part to the Secretary of State in connection with that function is to be read as a reference to that person.
(5) Arrangements under this section do not prevent the Secretary of State from exercising a function to which the arrangements relate.
(6) Except as otherwise prescribed and subject to section 106G, the disclosure of information between the Secretary of State and a person in connection with the person’s entering into arrangements under this section or exercise of functions to which such arrangements relate does not breach—
(a) any obligation of confidence owed by the person making the disclosure, or
(b) any other restriction on the disclosure of information (however imposed).
(7) Regulations under this section are subject to the affirmative procedure.
(8) In this section “relevant function” means any function of the Secretary of State conferred by or under this Part (including the function of charging or recovering fees under section 106C) other than—
(a) a power to make regulations, or
(b) a function under section 106C(4) (specifying of fees etc).
Data protection
106G Data protection
(1) A duty or power to process information that is imposed or conferred by or under this Part does not operate to require or authorise the processing of personal data that would contravene the data protection legislation (but in determining whether processing of personal data would do so, that duty or power is to be taken into account).
(2) In this section—
“the data protection legislation” has the same meaning as in the Data Protection Act 2018 (see section 3(9) of that Act);
“personal data” has the same meaning as in that Act (see section 3(2) of that Act).
Supplementary provisions
106H Regulations under this Part
(1) In this Part “prescribed” means prescribed by regulations made by the Secretary of State.
(2) Regulations under this Part may make—
(a) different provision for different purposes;
(b) supplementary and incidental provision.
(3) Regulations under this Part are to be made by statutory instrument.
(4) Before making regulations under this Part the Secretary of State must consult the Welsh Ministers.
(5) Where regulations under this Part are subject to “the affirmative procedure” the regulations may not be made unless a draft of the statutory instrument containing them has been laid before and approved by a resolution of each House of Parliament.
(6) Where regulations under this Part are subject to “the negative procedure” the statutory instrument containing the regulations is subject to annulment in pursuance of a resolution of either House of Parliament.
(7) Any provision that may be made in regulations under this Part subject to the negative procedure may be made in regulations subject to the affirmative procedure.
106I Interpretation
(1) In this Part the following terms have the same meaning as in Part 3—
“apparatus” (see sections 89(3) and 105(1));
“in” (in a context referring to apparatus in a street) (see section 105(1));
“street” (see section 48(1) and (2));
“undertaker” (in relation to apparatus or in a context referring to having apparatus in a street) (see sections 48(5) and 89(4)).
(2) In this Part “processing” has the same meaning as in the Data Protection Act 2018 (see section 3(4) of that Act) and “process” is to be read accordingly.”
(2) In section 167 of the New Roads and Street Works Act 1991 (Crown application)—
(a) after subsection (4) insert—
“(4A) The provisions of Part 3A of this Act (National Underground Asset Register: England and Wales) bind the Crown.”;
(b) in subsection (5), for “(4)” substitute “(4) or (4A)”.
(3) Schedule (National Underground Asset Register: monetary penalties) to this Act inserts Schedule 5A into the New Roads and Street Works Act 1991 (monetary penalties).”—(Sir John Whittingdale.)
This amendment inserts Part 3A into the New Roads and Street Works Act 1991 which requires, and makes provision in connection with, the keeping of a register of information relating to apparatus in streets (to be called the National Underground Asset Register).
Brought up, read the First and Second time, and added to the Bill.
New Clause 40
Information in relation to apparatus
“(1) The New Roads and Street Works Act 1991 is amended in accordance with subsections (2) to (6).
(2) For the italic heading before section 79 (records of location of apparatus) substitute “Duties in relation to recording and sharing of information about apparatus”.
(3) In section 79—
(a) for the heading substitute “Information in relation to apparatus”;
(b) in subsection (1), for paragraph (c) substitute—
“(c) being informed of its location under section 80(2),”;
(c) after subsection (1A) (as inserted by section 46(2) of the Traffic Management Act 2004) insert—
“(1B) An undertaker must, except in such cases as may be prescribed, record in relation to every item of apparatus belonging to the undertaker such other information as may be prescribed as soon as reasonably practicable after—
(a) placing the item in the street or altering its position,
(b) inspecting, maintaining, adjusting, repairing, altering or renewing the item,
(c) locating the item in the street in the course of executing any other works, or
(d) receiving any such information in relation to the item under section 80(2).”
(d) omit subsection (3);
(e) in subsection (3A) (as inserted by section 46(4) of the Traffic Management Act 2004)—
(i) for “to (3)” substitute “and (2A)”;
(ii) for “subsection (1)” substitute “this section”;
(f) after subsection (3A) insert—
“(3B) Before the end of the initial upload period an undertaker must enter into NUAR—
(a) all information that is included in the undertaker’s records under subsection (1) on the archive upload date, and
(b) any other information of a prescribed description that is held by the undertaker on that date.
(3C) Where an undertaker records information as required by subsection (1) or (1B), or updates such information, the undertaker must, within a prescribed period, enter the recorded or updated information into NUAR.
(3D) The duty under subsection (3C) does not apply in relation to information recorded or updated before the archive upload date.
(3E) A duty under subsection (3B) or (3C) does not apply in such cases as may be prescribed.
(3F) Information must be entered into NUAR under subsection (3B) or (3C) in such form and manner as may be prescribed.”
(g) in subsection (4)(a), omit “not exceeding level 5 on the standard scale”;
(h) after subsection (6) insert—
“(7) For the purposes of subsection (3B) the Secretary of State must by regulations—
(a) specify a date as “the archive upload date”, and
(b) specify a period beginning with that date as the “initial upload period”.
(8) For the meaning of “NUAR”, see section 106A.”
(4) For section 80 (duty to inform undertakers of location of apparatus) substitute—
“80 Duties to report missing or incorrect information in relation to apparatus
(1) Subsection (2) applies where a person executing works of any description in a street finds an item of apparatus belonging to an undertaker in relation to which prescribed information—
(a) is not entered in NUAR, or
(b) is entered in NUAR but is incorrect.
(2) The person must take such steps as are reasonably practicable to inform the undertaker to whom the item belongs of the missing or incorrect information.
(3) Where a person executing works of any description in a street finds an item of apparatus which does not belong to the person and is unable, after taking such steps as are reasonably practicable, to ascertain to whom the item belongs, the person must—
(a) if the person is an undertaker, enter into NUAR, in such form and manner as may be prescribed, prescribed information in relation to the item;
(b) in any other case, inform the street authority of that information.
(4) Subsections (2) and (3) have effect subject to such exceptions as may be prescribed.
(5) A person who fails to comply with subsection (2) or (3) commits an offence.
(6) A person who commits an offence under subsection (5) is liable on summary conviction to a fine not exceeding level 4 on the standard scale.
(7) Before making regulations under this section the Secretary of State must consult—
(a) such representatives of persons likely to be affected by the regulations as the Secretary of State considers appropriate, and
(b) such other persons as the Secretary of State considers appropriate.
(8) For the meaning of “NUAR”, see section 106A.”
(5) Before section 81 (duty to maintain apparatus) insert—
“Other duties and liabilities of undertakers in relation to apparatus”.
(6) In section 104 (regulations), after subsection (1) insert—
“(1A) Before making regulations under section 79 or 80 the Secretary of State must consult the Welsh Ministers.
(1B) Regulations under this Part may make supplementary or incidental provision.”
(7) In consequence of the provision made by subsection (4), omit section 47 of the Traffic Management Act 2004.”—(Sir John Whittingdale.)
This amendment amends the New Roads and Street Works Act 1991 so as to impose new duties on undertakers to keep records of, and share information relating to, apparatus in streets; and makes amendments consequential on those changes.
Brought up, read the First and Second time, and added to the Bill.
New Clause 41
Pre-commencement consultation
“A requirement to consult under a provision inserted into the New Roads and Street Works Act 1991 by section (National Underground Asset Register) or (Information in relation to apparatus) may be satisfied by consultation before, as well as consultation after, the provision inserting that provision comes into force.”—(Sir John Whittingdale.)
This amendment provides that a requirement that the Secretary of State consult under a provision inserted into the New Roads and Street Works Act 1991 by the new clauses inserted by Amendments NC39 and NC40 may be satisfied by consultation undertaken before or after the provision inserting that provision comes into force.
Brought up, read the First and Second time, and added to the Bill.
New Clause 42
Transfer of certain functions to Secretary of State
“(1) The powers to make regulations under section 79(1) and (2) of the New Roads and Street Works Act 1991, so far as exercisable in relation to Wales, are transferred to the Secretary of State.
(2) The power to make regulations under section 79(1A) of that Act (as inserted by section 46(2) of the Traffic Management Act 2004), so far as exercisable in relation to Wales, is transferred to the Secretary of State.
(3) The Street Works (Records) (England) Regulations 2002 (S.I. 2002/3217) have effect as if the reference to England in regulation 1(2) were a reference to England and Wales.
(4) The Street Works (Records) (Wales) Regulations 2005 (S.I. 2005/1812) are revoked.”—(Sir John Whittingdale.)
This amendment provides that certain powers to make regulations under section 79 of the New Roads and Street Works Act 1991, so far as exercisable in relation to Wales, are transferred from the Welsh Ministers to the Secretary of State; and makes provision in relation to regulations already made under those powers.
Brought up, read the First and Second time, and added to the Bill.
Clause 5
Lawfulness of processing
Amendment proposed: 11, page 7, line 12, at end insert—
““internal administrative purposes”, in relation to special category data, means the conditions set out for lawful processing in paragraph 1 of Schedule 1 of the Data Protection Act 2018.”—(Kate Osborne.)
This amendment clarifies that the processing of special category data in employment must follow established principles for reasonable processing, as defined by paragraph 1 of Schedule 1 of the Data Protection Act 2018.
Question put, That the amendment be made.
“(ga) a mayor for the area of a combined county authority established under section 9 of the Levelling-up and Regeneration Act 2023 | section 118A of the Representation of the People Act 1983, as applied by the Combined Authorities (Mayoral Elections) Order 2017 (S.I. 2017/67)”
“(ga) a mayor for the area of a combined county authority established under section 9 of the Levelling-up and Regeneration Act 2023 | section 118A of the Representation of the People Act 1983, as applied by the Combined Authorities (Mayoral Elections) Order 2017 (S.I. 2017/67)”.
I beg to move, That the Bill be now read the Third time.
This Bill will deliver tangible benefits to British consumers and businesses alike, which would not have been possible if Britain had still been a member of the European Union. It delivers a more flexible and less burdensome data protection regime that maintains high standards of privacy protection while promoting growth and boosting innovation. It does so with the support of the Information Commissioner, and without jeopardising the UK’s European Union data adequacy.
I would like to thank all Members who contributed during the passage of the Bill, and all those who have helped get it right. I now commend it to the House on its onward passage to the other place.
I, too, would like to thank the Clerks for their help. They are always enormously helpful, especially to Opposition Members, and sometimes to Government Members as well. I would like to commend my close friend, my hon. Friend the Member for Barnsley East (Stephanie Peacock), who took the Bill through Committee for our side. I think the Minister suggested that it was rather more fun having her up against him than me, which was very cruel and unkind of him.
We support the Bill, although I suspect that regulatory divergence is a bit of a chimera, and that regulatory convergence in this field will give UK businesses greater stability and certainty, but that is for another day. I also worry about the extensive powers that Ministers are giving themselves, and the suggestion that they will switch off the rules on direct marketing in the run-up to a general election. Then there is new schedule 1. I repeat the offer I have made several times, which is that we stand ready to knock that into far better shape, whether in meetings we have privately or through our colleagues in the House of Lords. I feel ashamed to say it, but I hope the Lords are able to do the line-by-line scrutiny that we have been prevented from doing today.
The Minister said that this Bill would not have been possible without Brexit. I think the expression he was looking for is that this Bill would not have been necessary if it had not been for Brexit. This is yet another example of the Government having to play catch-up and having to get themselves out of the holes they dug themselves into through an ill-thought-out Brexit and driving for the hardest possible exit from the European Union.
That said, I do want to echo the thanks given and the tributes paid to the Bill team, and to the Clerks, who have had to work particularly hard in recent days given the significant number of Government amendments tabled at the last minute. I also thank my hon. Friend the Member for Glasgow North West (Carol Monaghan) for her work on Second Reading and in Committee, as well as our research team, especially Josh Simmons-Upton, and the many stakeholders who have provided briefings and research, particularly the team at the Public Law Project, who have done excellent work in drawing out some of the most concerning aspects of the Bill. It always concerned me when the briefings came in, entitled “PLP briefing”—I did a double take as I thought I was on somebody else’s mailing list.
Although some of what is in the Bill is necessary, particularly following the UK’s withdrawal from the European Union, much of it represents a further power grab by the Executive and risks doing exactly the opposite of what the Government say they want it to achieve: making life easier for business, and improving public confidence in data handling and the use of artificial intelligence.
The SNP will oppose the Bill, and the Government should take the opportunity to start from scratch with a process that listens to consultation responses and involves genuine and detailed parliamentary scrutiny. If the Bill proceeds to the Lords, it will once again fall to the unelected House to more fully interrogate it. That will no doubt lead to several rounds of ping-pong in due course, almost certainly as a result of amendments both from the Government and from the Opposition or Cross-Benchers in the Upper House. That is sub-optimal, as is the case with so much of what seems to happen down here these days. The sooner Scotland has power over this area, and indeed all aspects of legislation, as an independent country, the better.
Question put, That the Bill be now read the Third time.
(11 months, 2 weeks ago)
Lords ChamberMy Lords, in a time of rapid technological change, we need people to trust in how we can use data for greater good. By building understanding and confidence in the rules surrounding how we use data, we can unlock its real potential, not only for businesses but for people going about their everyday lives.
In 2018 Parliament passed the Data Protection Act, which was the UK’s implementation of the EU general data protection regulation. While the EU GDPR protected the privacy rights of individuals, there were unintended consequences. It resulted in high costs and a disproportionate compliance burden for small businesses. These reforms deliver on the Government’s promise to use the opportunity afforded to us by leaving the European Union to create a new and improved UK data rights regime.
The Bill has five parts that deliver on individual elements of these reforms. Part 1 updates and simplifies the UK GDPR and DPA 2018 to ease compliance burdens on businesses and introduce safeguards around new technologies. It also updates the similar regimes that apply to law enforcement agencies and intelligence services. Part 2 enables DSIT’s digital verification services policy, giving people secure options to prove their identity digitally across different sectors of the economy if they choose to do so. Part 3 establishes a framework to set up smart data schemes across the economy. Part 4 reforms the privacy and electronic communications regulations—PECR—to bring stronger protection for consumers against nuisance calls. It also contains reforms to ensure the better use of data in health and adult social care, law enforcement and security. Part 5 will modernise the Information Commissioner’s Office by making sure that it has the capabilities and the powers to tackle organisations that breach data rules, giving the ICO freedom to better allocate its resources and ensuring that it is more accountable to Parliament and to the public.
I stress that the Bill will continue to maintain the highest standards of data protection that people rightly expect. It will also help those who use our data to make our lives healthier, safer and more prosperous. That is because we have convened industry leaders and experts to codesign the Bill with us throughout its creation. This legislation will ensure that our regulation reflects the way in which real people live their lives and run their businesses.
On Report in the other place, we tabled a number of amendments to strengthen the fundamental elements of the Bill and to reflect the Government’s commitment to unleash the power of data across our economy and society. I take this opportunity to thank Members of Parliament and the numerous external stakeholders who have worked with us to ensure that the Bill functions at its absolute best. Taken together, these amendments will benefit the economy by £10.6 billion over 10 years. This is more than double the estimated impact of the Bill when introduced in the spring.
These reforms are expected to lower the compliance burden on businesses. We expect small and micro-businesses to achieve greater overall compliance cost savings than larger businesses: approximately £90 million a year as a result of the domestic data protection policies in the Bill.
The Bill makes it clear that the amount that any organisation needs to do to comply and demonstrate compliance should be directly related to the risk its processing activities pose to individuals. That means that in the future, organisations will have to keep records of their processing activities, undertake risk assessments and designate senior responsible individuals to manage data protection risks only if their processing activities are likely to pose high risks to individuals. We are also removing the need for organisations to do detailed legitimate interest assessments and document the outcomes when their activities are clearly in the public interest—for example, when they are reporting child safeguarding concerns. This will help reduce the amount of privacy paperwork and allow businesses to invest time and resources elsewhere.
Let me make this absolutely clear: enabling more effective use of data and ensuring high data protection standards are not contradictory objectives. Businesses need to understand and to trust in our data protection rules, and that is what these measures are designed to achieve. At the same time, people across the UK need to fundamentally trust that the system works for them too. We know that lots of organisations already have good processes for how they deal with data protection complaints, and it is right that we strengthen this. By making these a requirement, the Bill helps data subjects exercise their rights and directly challenge organisations they believe are misusing their data.
We already have a world-leading independent regulator, the Information Commissioner’s Office. It is only right that we continue to provide the ICO with the tools it needs to keep pace with our dramatically changing tech landscape. The ICO needs to keep our personal data safe while ensuring that it remains accountable, flexible and fit for the modern world. We are modernising the structure and objectives of the Information Commissioner’s Office. Under this legislation, protecting our personal data will remain the ICO’s primary focus, but it will also need to consider how it can empower businesses and organisations to drive growth and innovation across the UK and support public trust and confidence in the use of personal data. We must ensure that our world-leading regulator is equipped to tackle the biggest and most important threats and data breaches, protecting individuals from the highest harm. The Bill means that the ICO can take a more proportionate approach to how it gets involved in individual disputes, not having to do so too early in the process before people have had a chance to resolve things sensibly themselves, while still being the ultimate guardian of data subjects’ rights.
The Bill will create a modern ICO that can tackle the modern, more sophisticated challenges of today and support businesses across the UK to make safe, effective use of data to grow and to innovate. It will also unlock the potential of transformative technologies by making sure that organisations know when they can use responsible automated decision-making and that people know when they can request human intervention where these decisions impact their lives.
Alongside this, there are billions of pounds to be seized in the booming global data-driven trade. With the new international transfers regime, we are clarifying our regime for building data bridges to secure the close, free and safe exchange of data with trusted allies. Alongside new data bridges, the Secretary of State will be able to recognise new transfer mechanisms for businesses to protect international transfers. Businesses will still be able to transfer data across borders with the compliant mechanisms they already use, avoiding needless checks and costs.
The Bill will allow people to control more of their data. It will support smart data schemes that empower consumers and small businesses to make better use of their own data, building on the extraordinary success of open banking, where consumers and businesses access innovative services to manage their finances and spending, track their carbon footprint or access credit. Open banking is already estimated to have the potential to bring in £12 billion each year for consumers and £6 billion for small businesses, as well as boosting innovation in our world-leading fintech industry. With this Bill, we can extend the same benefits for consumers and business across the economy.
Another way the Bill ensures that people have control of their own data is by making it easier and more secure for people to prove things about themselves. Digital identities will help those who choose to use them to prove their identity electronically rather than always having to dig out stacks of physical documents such as passports, bills, statements and birth certificates. Digital verification services are already in existence and we want to put them on a secure and trusted footing, giving people more choice and confidence as they navigate everyday tasks, and saving businesses time and money.
The Bill supports the growing demand, domestic and global, for secure and trusted electronic transactions such as qualified electronic signatures. It also makes provision for the preservation of important data for coronial investigations in the event of a child taking their own life. Any death of a child is a tragedy, and the Government have the utmost sympathy for families affected by this tragic issue. I recognise, and I share, the strong feelings on this issue expressed by noble Lords on this matter and during the passage of the Online Safety Act.
The new provision requires Ofcom, following notification from a coroner, to issue data preservation notices requiring relevant tech companies to hold data that they may have relating to a deceased child’s use of online services in circumstances where the coroner suspects that the child has taken their own life. This greatly strengthens Ofcom’s and a coroner’s ability to access data from online services and provides them with the tools they need to carry out their job. It will include, for example, if a child had taken their own life after interacting with self-harm or other harmful content online, or if they suspect that a child may have been subjected to coercion, online bullying or harassment. It would also include cases where a child has done an intentional act that has caused their death but where they may not have intended to die, such as the tragic circumstances where a child dies accidentally when attempting to recreate an online challenge.
The new provisions do not cover children’s deaths caused by homicide, because the police already have extensive investigative powers in this context. These were strengthened last year by the entry into force of the UK-US data access agreement, which enables law enforcement to directly access content of communications held by US-based companies for the purpose of preventing, detecting, investigating and prosecuting serious crimes, such as murder and child sexual abuse and exploitation.
The families who have been courageously campaigning after their children were tragically murdered did not have access to this agreement because it entered into force only last October. To date, 10,000 requests for data have been made under it. However, we understand their concerns, and the Secretary of State, along with Justice Ministers, will work with noble Lords ahead of Committee and carefully listen to their arguments on potential amendments. We absolutely recognise the need to give families the answers they need and to ensure that there is no gap in the law.
Some aspects of the GDPR are very complex, causing uncertainty around how it applies and hampering private and public bodies’ ability to use data as dynamically as they could. The Bill will help scientists make the most of data by ensuring that they can be reused for other related studies. This is achieved by removing burdensome requirements for scientific researchers, so that they can dedicate more time to focus on what they do best. The Bill will also simplify the legal requirements around research and bring legal clarity. This is achieved by transposing definitions of scientific, historical and statistical-purposes research into the operative text.
The Bill will improve the way that the NHS and adult social care organise data to deliver crucial health services in England. It will also improve the efficiency of data protection for law enforcement and national security partners, encouraging better use of personal data to help protect the public. The Bill will save up to 1.5 million hours of police time each year.
The Bill will also allow us to take further steps to safeguard our national security, by addressing risks from hostile agents seeking to access our data or damage our data infrastructure. It will allow the DWP to protect taxpayers’ money from falling into the hands of fraudsters, as part of the DWP’s biggest reform to fraud legislation in 20 years. We know that, over this last year, overpayments due to capital fraud and error in universal credit alone were almost £900 million. It is time to modernise and strengthen the DWP’s legislative framework to ensure that it gives those fighting fraud and error the tools that they need and so that it stands up to future challenges.
Through the Bill we are revolutionising the way we install, maintain, operate and repair pipes and cables buried beneath the ground. I am sure we have all, knowingly or not, been impacted by one of the 60,000 accidental strikes on an underground pipe or cable that happen every year. The national underground asset register—NUAR—is a brand new digital map that gives planners and excavators secure and instant access to the data they need, when they need it. This means not only that the safety and lives of workers will no longer be at risk but that NUAR will underpin the Government’s priority to get the economy growing, expediting projects such as new roads, new houses and broadband rollout.
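In data terms, NUAR is a shared geospatial register: undertakers enter records of their apparatus, and those planning or executing works query by location before digging. The toy model below is purely illustrative (the fields and the street-keyed lookup are invented; the real register is a national geospatial service with licensed, secure access):

```python
from dataclasses import dataclass

@dataclass
class Apparatus:
    undertaker: str  # the utility that owns the asset
    kind: str        # e.g. "gas main", "fibre duct"
    depth_m: float   # nominal depth below the surface

# Toy in-memory register keyed by street name.
register: dict[str, list[Apparatus]] = {
    "High Street": [
        Apparatus("GasCo", "gas main", 0.9),
        Apparatus("FibreCo", "fibre duct", 0.45),
    ],
}

def assets_under(street: str) -> list[Apparatus]:
    """Return the known apparatus records for a street before excavation."""
    return register.get(street, [])

for item in assets_under("High Street"):
    print(f"{item.kind} ({item.undertaker}) at ~{item.depth_m} m")
```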
The Bill gives the people using data to improve our lives the certainty that they need. It maintains high standards for protecting people’s privacy, while seeking to maintain the EU’s adequacy decisions for the UK. The Bill is a hugely important piece of legislation and I thank noble Lords across the House for their involvement in and support for the Bill so far. I look forward to hearing their views today and throughout the rest of the Bill’s passage. I beg to move.
My Lords, I start with apologies from my noble friend Lady Jones of Whitchurch, who cannot be with us due to illness. We wish her a speedy recovery in time for Christmas. I have therefore been drafted in temporarily to open for the Opposition, shunting my noble friend Lord Bassam to close for us at the end of the debate. As a result, what your Lordships will now get with this speech is based partly on his early drafts and partly on my own thoughts on this debate—two for the price of one. I reassure your Lordships that, while I am flattered to be in the super-sub role, I look forward to returning to the Back Benches for the remaining stages in the new year.
I remind the House of my technology interests, particularly in chairing the boards of CENTURY Tech and EDUCATE Ventures Research—both companies working with AI in education. I very much welcome the noble Lord, Lord de Clifford, to his place and look forward to his maiden speech.
Just over six years ago, I spoke at the Second Reading of the Data Protection Bill. I said then that:
“We need to power the economy and innovation with data while protecting the rights of the individual and of wider society from exploitation by those who hold our data”.
For me, that remains the vision. We are grateful to the Minister for setting out in his speech his vision, but it feels to me that one of the Bill’s failings is the weakening of the protection from exploitation that would follow if it passes in its current form. In that 2017 Second Reading speech, I also said that:
“No consent regime can anticipate future use or the generation of intelligent products by aggregating my data with that of others. The new reality is that consent in its current form is dead”.—[Official Report, 10/10/17; cols. 183-5.]
Now that we have moved squarely into the age of AI, I welcome the opportunity to update GDPR to properly regulate data capture, storage and sharing in the public interest.
In the Online Safety Act, we strengthened Ofcom to regulate technology providers and their algorithmic impacts. In the Digital Markets, Competition and Consumers Bill, we are strengthening the Competition and Markets Authority to better regulate these powerful acquisitive commercial interests. This Bill is the opportunity to strengthen the Information Commissioner to better regulate the use of data in AI and some of the other potential impacts discussed at the recent AI summit.
This is where the Bill is most disappointing. As the Ada Lovelace Institute tells us in its excellent briefing, the Bill does not provide any new oversight of cutting-edge AI developments, such as biometric technologies or foundation models, despite well-documented gaps in existing legal frameworks. Will the Minister be coming forward with anything in Committee to address these gaps?
While we welcome the change from an Information Commissioner to a broader information commission, the Bill further weakens the already limited legal safeguards that currently exist to protect individuals from AI systems that make automated decisions about them in ways that could lead to discrimination or disadvantage—another lost opportunity.
I co-chair the All-Party Parliamentary Group on the Future of Work, and will be seeking to amend the Bill in respect of automated decision-making in the workplace. The rollout of ChatGPT-4 now makes it much easier for employers to quickly and easily develop algorithmic tools to manage staff, from hiring through to firing. We may also want to provide safeguards over public sector use of automated decision-making tools. The latter is of particular concern when reading the legal opinion of Stephen Cragg KC on the Bill. He says that:
“A list of ‘legitimate interests’ (mostly concerning law and order, safeguarding and national security) has been elevated to a position where the fundamental rights of data subjects (including children) can effectively be ignored where the processing of personal data is concerned … The Secretary of State can add to this list without the need for primary legislation, bypassing important Parliamentary controls”.
Furthermore, on lost opportunities, the Bill does not empower regulators with the tools or capabilities that they need to implement the Government’s plans for AI regulation or the commitments made at the AI Safety Summit. In this, I personally support the introduction of a duty on all public regulators to have regard to the principles on AI that were published in the Government’s White Paper. Would the Minister be willing to work with me on that?
There are other lost opportunities. I have argued elsewhere that data trusts are an opportunity to build public trust in data being used both to develop better technology and to generate revenue back to the taxpayer. I remain interested in whether personal data could be defined as an asset that can be bequeathed in one’s estate, to avoid the situation we discussed in our debates on what is now the Online Safety Act, where bereaved families have had a terrible experience trying to access the content their children saw online that contributed to their deaths—and not just from suicide.
This takes me neatly on to broken promises and lessons not learned. I am confident that, whether the Government like it or not, the House will use this Bill to keep the promises made to families by the Secretary of State in respect of coroners being able to access data from technology providers in the full set of scenarios that we discussed, not just self-harm and suicide. It is also vital that the Bill does nothing to contradict or otherwise undermine the steps that this country has taken to keep children safe in the digital world. I am sure we will hear from the noble Baroness, Lady Kidron, on this subject, but let me say at this stage that we support her and, on these Benches, we are fully committed to the age-appropriate design code. The Minister must surely know that in this House, you take on the noble Baroness on these issues at your peril.
I am also confident that we will use this Bill to deliver an effective regime on data access for researchers. During the final parliamentary stages of the Online Safety Bill, the responsible Ministers, Paul Scully MP and the noble Lord, Lord Parkinson, recognised the importance of going further on data access and committed in both Houses to exploring this issue and reporting back on the scope to implement it through other legislation, such as this Bill. We must do that.
The Bill has lost opportunities and broken promises, but in other areas it is also failing. The Bill is too long—probably like my speech. I know that one should not rush to judgment, but the more I read the Bill and various interpretations of its impact, the more I worry about it. That has not been helped by the tabling of some 260 government amendments, amounting to around 150 pages of text, on Report in another place—that is, after the Bill had already undergone its line-by-line scrutiny by MPs. Businesses need to be able to understand this new regime. If they also have any data relationship with the EU, they potentially also need to understand how this regime interacts with the EU’s GDPR. On that, will the Minister agree to share quickly with your Lordships’ House his assessment of whether the Bill meets the adequacy requirements of the EU? We hear noises to the contrary from the Commission, and it is vital that we have the chance to assess this major risk.
After the last-minute changes in another place, the Bill increasingly seems designed to meet the Government’s own interests: first, through changes to rules on direct marketing during elections, but also by giving Ministers extensive access to the bank account data of benefit claimants and pensioners without spelling out the precise limitations or protections that go alongside those powers. I note the comments of the Information Commissioner himself in his updated briefing on the Bill:
“While I agree that the measure is a legitimate aim for government, given the level of fraud and overpayment cited, I have not yet seen sufficient evidence that the measure is proportionate ... I am therefore unable, at this point, to provide my assurance to Parliament that this is a proportionate approach”.
In starting the scrutiny of these provisions, it would be useful if the Minister could confirm in which other countries such provisions already exist. What consultation have they been subject to? Does HMRC already have these powers? If not, why go after benefit fraud but not tax fraud?
Given the lack of detailed scrutiny this can ever have in the other place, I of course assume the Government will respect whatever is the will of this House when we have debated these measures.
As we did during last week’s debate on the Digital Markets, Competition and Consumers Bill, I will now briefly outline a number of other areas where we will be seeking changes or greater clarity from the Government. We need to see a clear definition of high-risk processing in the Bill. While the Government might not like subject access requests after recent experience of them, they have not made a convincing case for significantly weakening data-subject rights. Although we support the idea of smart data initiatives such as extending the successful open banking framework to other industries, we need more information on how Ministers envisage this happening in practice. We need to ensure the Government’s proposals with regards to nuisance calls are workable and that telecommunications companies are clear about their responsibilities. With parts of GDPR, particularly those on the use of cookies, having caused so much public frustration, the Bill needs to ensure appropriate consultation on and scrutiny of future changes in this area. We must take the public with us.
So a new data protection Bill is needed, but perhaps not this one. We need greater flexibility to move with a rapidly changing technological landscape while ensuring the retention of appropriate safeguards and protections for individuals and their data. Data is key to future economic growth, and that is why it will be a core component of our industrial strategy. However, data is not just for growth. There will be a clear benefit in making data work for the wider social good and the empowerment of working people. There is also, as we have so often discussed during Oral Questions, huge potential for data to revitalise the public services, which are, after 13 years of this Government, on their knees.
This Bill seems to me to have been drafted before the thinking that went into the AI summit. It is already out of date, given its very slow progress through Parliament. There is plenty in the Bill that we can work with. We are all agreed there are enormous opportunities for the economy, our public services and our people. We should do everything we can to take these opportunities forward. I know the Minister is genuinely interested in collaborating with colleagues to that end. We stand ready to help the Government make the improvements that are needed, but I hope the Minister will acknowledge that there is a long way to go if this legislation is to have public confidence and if our data protection regime is to work not just for the tech monopolies but for small businesses, consumers, workers and democracy too. We must end the confusion, empower the regulators and in turn empower Parliament to better scrutinise the tsunami of digital secondary legislation coming at us. There is much to do.
My Lords, even less than the noble Lord, Lord Knight, can I claim that this is my primary brief, so I want to make a short Back-Bench contribution to the subject, bringing some of my experience from former interests. I declare that I do not have any current financial interests but, if you look at my register entry, you will see that I spent a long time working for a company that was so much at the heart of the data protection debate that the 2016 EU regulation was nicknamed in Brussels “Lex Facebook”.
I do not want to speak to the details of the provisions in front of us, and I look forward to hearing some of the arguments, particularly from the noble Baroness, Lady Kidron, with whom I worked closely in the context of the Online Safety Act; I think she has some really important points to raise on what is in the Bill. I also look forward to the maiden speech of the noble Lord, Lord de Clifford.
The one thing I really want to spend a short amount of time on today is to flag a concern that I will not attempt to resolve: I would rather leave that to my noble friend Lord Clement-Jones and others who will be in Committee on the Bill. It is the concern around EU adequacy that I think should really be front and centre of our discussions when we consider this legislation. As I say, I do not intend to be active in later stages of the Bill—unless we fix the NHS between now and Committee, which would be a blessing for more reasons than enabling me to take part in consideration of data protection legislation.
The flag that I am raising will be in something of a Cassandra-like tone. It is something I think is very likely to happen, but I am not expecting the Government to believe me and necessarily change direction. I have been intimately involved in these discussions over many years. If people have been following this, they will know that the EU had an adequacy agreement with the United States that had full political support within the EU institutions but has successively been struck down in a series of actions in the European Court of Justice. All the politicians wanted data to flow freely between the United States and the EU, but the law has not allowed that to happen. So the alarm bells ring. The noble Lord, Lord Knight of Weymouth, said he thought the Commission had doubts; that worries me even more. Even where the Commission is saying that it is comfortable with the adequacy of the UK regime, the alarm bells still ring for me because it said that repeatedly over the US data transfers and it turned out not to be the case.
There are three main areas where we can predict that the risk will occur. The first is where the core legal regime for data protection in the UK is deemed to be too weak to protect the interests of EU data subjects. The second is where there are aspects of the UK legal regime for security-related surveillance that are seen as creating unacceptable risk if EU data is in the hands of UK entities. The third is where redress mechanisms for EU data subjects, especially in relation to surveillance, are regarded as inaccessible or ineffective. These are all the areas that have been tested thoroughly in the context of the United States, and any or all of them may end up being tested also in the European Court of Justice for the United Kingdom if EU citizens complain in future about the processing of their data in the UK. The first angle will test the complete package of data protection set out in the many pages of this Bill. The second will consider our surveillance practices, including new developments such as the Investigatory Powers (Amendment) Bill, which is before us right now. Any future changes to UK surveillance law, for example, following a terrorist outrage, may end up being tested and queried before the European Court of Justice.
Regarding redress, our relationship with the European Court of Human Rights is critical. Any suggestion that we start to ignore ECHR judgments, even in another area such as immigration policy, may be used to argue that EU citizens cannot rely on their Article 8 right to privacy in the United Kingdom. My advice to the Minister is to properly test all these angles internally, on the assumption that we will be arguing them out at the European Court of Justice in the future. This is difficult. I know that the UK authorities, like the US authorities, will not be comfortable sharing details of their surveillance regime in a European court, but that is what will be required to prove we are adequately safe if a complaint in respect of UK surveillance is made. It is really important that we hear the strongest lines of attack, and that we invite privacy activists, in particular, to offer them: the Government should invite in the kinds of people who will be taking those court cases, so that they can hear those arguments now and test all our legislation against them. We certainly should not rely on assurances from the European Commission; I hope the Minister can give us more than that in his response. The key dynamic from the transatlantic experience is that this is between EU privacy activists and the European courts, rather than being something the Commission entirely controls.
The consequences of the loss of EU adequacy, or even significant uncertainty that this is on the horizon, will be that UK businesses that work on a cross-channel basis will be advised by their lawyers to move their data processing capability into the EU. They would feel confident serving the UK from the EU, but not the other way around. This is precisely what has happened in the context of transatlantic data flows and will hardly make Britain the best place in the world to do e-business. I hope the Minister will confirm that it would be a very undesirable outcome, to use parliamentary language, and that we will be taking one step forward but two steps back if that is a consequence of this Bill.
Having planted that flag, it is regrettable I will be unable to help noble Lords as they try and thread the needle of getting the legislation right. I have every sympathy for those seeking to do that; I have less and less sympathy for the Government, because they chose to bring the legislation forward, unlike other important legislation like the mental capacity Bill, which was left off the agenda, as I keep reminding the Government. I hope noble Lords will keep this Cassandra-like warning current in their minds as they consider the Bill; I do not want to be standing here in five years’ time saying, “I told you so” and I do not think noble Lords want me here in five years’ time saying that either. With that in your Lordships’ ears, I hope the Minister and Members who are scrutinising the Bill can really dig into this adequacy point and not hold back, because it is a genuine, serious threat to all kinds of businesses in the United Kingdom, not just digital ones.
My Lords, I declare my interests set out in full on the register, including as an advisor to the Institute for Ethics in AI at Oxford University, chair of the Digital Futures for Children centre at the LSE and chair of the 5Rights Foundation. I add my welcome to my noble friend Lord de Clifford, who I had the pleasure of meeting yesterday, and I look forward to his maiden speech.
I start by quoting Marcus Fysh MP who said in the other place:
“this is such a serious moment in our history as a species. The way that data is handled is now fundamental to basic human rights … I say to those in the other place as well as to those on the Front Benches that … we should think about it incredibly hard. It might seem an esoteric and arcane matter, but it is not. People might not currently be interested in the ins and outs of how AI and data work, but in future you can bet your bottom dollar that AI and data will be interested in them. I urge the Government to work with us to get this right”.—[Official Report, Commons, 29/11/23; col. 878.]
He was not the only one on Report in the other place who was concerned about some of the provisions in the Bill, who bemoaned the lack of scrutiny and urged the Government to think again. Nor was he the only one who reluctantly asked noble Lords to send the Bill back to the other place in better shape.
I associate myself with the broader points made by both noble Lords who have already spoken—I do not think I disagreed with a word that they said—but my own comments will primarily focus on the privacy of children, the case for data communities, access for researchers and, indeed, the promises made to bereaved parents and then broken.
During the passage of the Data Protection Act 2018, your Lordships’ House, with cross-party support, introduced the age appropriate design code, a stand-alone data protection regime for the under-18s. The AADC’s privacy by design approach ushered in a wave of design change to benefit children: TikTok and Instagram disabled direct messaging from unknown adults to children; YouTube turned off auto-play; Google turned safe search on by default for children; 18-plus apps were taken out of the Play Store; TikTok stopped notifications through the night; and Roblox stopped tracking and targeting children for advertising. These were just a handful of hundreds of changes to products and services likely to be accessed by children. Many of these changes have been rolled out globally, meaning that while other jurisdictions cannot police the code, children in those places benefit from it. As the previous Minister, the noble Lord, Lord Parkinson, acknowledged, it contributes to the UK’s reputation for digital regulation and is now being copied around the globe.
I set this out at length because the AADC not only drove design change, it also established the crucial link between privacy and safety. This is why it is hugely concerning that children have not been explicitly protected from changes that lessen user data protections in the Bill. I have given Ministers notice that I will seek to enshrine the principle that children have the right to a higher bar of data protection by design and default; to define children’s data as sensitive personal data in the Bill; and to exclude children from proposals that risk eroding the impact of the AADC, notably in risk assessments, automated processing, onward processing, direct marketing and the extended research powers of commercial companies.
Minister Paul Scully said at Second Reading in the other place:
“We are committed to protecting children and young people online … organisations will still have to abide by our Age-appropriate design code”.—[Official Report, Commons, 17/4/23; col. 101.]
I take it from those words that any perceived or actual diminution of children’s data rights is inadvertent, and that it remains the Government’s policy not to weaken the AADC as currently configured in the Bill. Will the Minister confirm that it is indeed the Government’s intention to protect the AADC and that he is willing to work with me to ensure that that is the outcome? I will also seek a requirement for the ICO to create a statutory children’s code in relation to AI. The ubiquitous deployment of AI technology to recommend and curate is nothing new, but the rapid advances in generative AI capabilities mark a new stage in its evolution. In the hundreds of pages of the ICO’s non-binding Guidance on AI and Data Protection, its AI and Data Protection Risk Toolkit and its advice to developers on generative AI, there is but one mention of the word “child”—in a case study about child benefit.
The argument made was that children are covered by the AADC, which underlines again just how consequential it is. However, since adults are covered by data law but it is considered necessary to have specific AI guidance, the one in three users who are under 18 deserve the same consideration. I am not at liberty to say more today, but later this week—perhaps as early as tomorrow—information will emerge that underlines the urgent need for specific consideration of children’s safety in relation to generative models. I hope that the Minister will agree that an AI code for kids is an imperative rather than a nice-to-have.
Similarly, we must deliver data privacy to children in education settings. Given the extraordinary rate at which highly personal data seeps out of schools into the commercial world, including to gambling companies and advertisers, coupled with the scale of tech adoption in schools, it is untenable to continue to see tech inside school as a problem for schools and tech outside school as a problem for regulators. The spectre of a nursery teacher having enough time and knowledge to interrogate the data protection terms of a singing app, or the school ICT lead having to tackle global companies such as Google and Microsoft to set the terms for their students’ privacy, is frankly ridiculous, but that is the current reality. Many school leaders feel abandoned by the Government’s insistence that they should be responsible for data protection when both the AADC and the Online Safety Act have been introduced but they benefit from neither. It should be the role of the ICO to set data standards for edtech and to ensure that providers are held to account if they fall short. As it stands, a child enjoys more protection on the bus to school than in the classroom.
Finally on issues relating to children, I want to raise a technical issue around the production of AI-generated child sexual abuse material. I recognise the Government’s exemplary record on tackling CSAM but, unfortunately, innovation does not stop. While AI-generated child sexual abuse content is firmly in scope of UK law, it appears that the models or plug-ins trained on CSAM, or trained to generate it, are not. At least four laws, the earliest from 1978, are routinely used to bring criminal action against CSAM and its perpetrators, so I would be grateful if the Minister would agree to explore the issue with the police unit that has raised it with me and make an explicit commitment to close any gaps identified.
We are at an inflection point, and however esoteric and arcane the issues around data appear to be, to downgrade a child’s privacy even by a small degree has huge implications for their safety, identity and selfhood. If the Government fail to protect and future-proof children’s privacy, they will be simply giving with one hand in the OSA and taking away with the other in this Bill.
Conscious that I have had much to say about children, I will briefly put on the record issues that we can debate at greater length in Committee. While data law largely rests on the assumption of a relationship between an individual and a service, we have seen over a couple of decades that power lies in having access to large datasets. The Bill offers a wonderful opportunity to put that data power in the hands of new entrants to the market, be they businesses or communities, by allowing the sharing of individual data rights and the assignment of data rights to third parties for agreed purposes. I have been inspired by approaches coming out of academia and the third sector which have supported the drafting of amendments to find a route that would enable the sharing of data rights.
Similarly, as the noble Lord, Lord Knight, said, we must find a route to access commercial data sets for public interest research. I was concerned that in the other place when former Secretary of State Jeremy Wright queried why a much-touted research access had not materialised in the Bill, the Minister appeared to suggest that it was covered. The current drafting embeds the asymmetries of power by allowing companies to access user data, including for marketing and creating new products, but does not extend access for public interest research into the vast databases held by those same companies. There is a feeling of urgency emerging as our academic institutions see their European counterparts gain access to commercial data because of the DSA. There is an increased need for independent research to support our new regulatory regimes such as the Online Safety Act. This is an easy win for the Government and I hope that they grasp it.
Finally, I noted very carefully the words of the Minister when he said, in relation to a coroner’s access to data, that the Secretary of State had made an offer to fill the gap. This is a gap that the Government themselves created. During the passage of the Online Safety Act we agreed to create a humane route to access data when a coroner had reason to suspect that a regulated company might have information relevant to the death of a child. The Government have reneged by narrowing the scope to those children who have taken their own lives. Expert legal advice says that there are multiple scenarios under which the Government’s narrowing of scope creates a gaping hole in provision for families of murdered children and has introduced uncertainty and delay in cases where it may not be clear at the outset how a child died.
I must ask the Minister what the Government are trying to achieve here and who they are trying to please. Given the numbers, narrowing scope is unnecessary, disproportionate and egregiously inhumane. This is about parents of murdered children. The Government lack compassion. They have created legal uncertainty and betrayed and re-traumatised a vulnerable group to whom they made promises. As we go through this Bill and the competition Bill, the Minister will at some points wish the House to accept assurances from the Dispatch Box. The Government cannot assure the House until the assurances that they gave to bereaved parents have been fulfilled.
I will stop there, but I urge the Minister to respond to the issues that I have raised rather than leave them for another day. The Bill must uphold our commitment to the privacy and safety of children. It could create an ecosystem of innovative data-led businesses and keep our universities at the forefront of tech development and innovation. It simply must fulfil our promise to families who this Christmas and every other Christmas will be missing a child without ever knowing the full circumstances surrounding that child’s death. That is the inhumanity that we in this House promised to stop—and stop it we must.
My Lords, I too welcome the noble Lord, Lord de Clifford, and look forward to his maiden speech. We on these Benches appreciate that there is a need for updated data protection legislation in order to keep up with the many technological advances that are taking place and, wherever possible, to simplify the processes for data processing. From this perspective, we welcome the Government’s ambition to remove unnecessary red tape and to support British businesses and our economy. However, as ever, these priorities need to be balanced alongside appropriate scrutiny of new legislation, and we must ensure that there are appropriate safeguards in the Bill to protect human rights that are fundamental to our democracy.
I have been struck by just how many briefing papers I have received from the most extraordinarily diverse group of organisations. One thing that many of them highlight is the fact that, for many businesses that operate between the UK and the EU, this new legislation is no guarantee of simplified data processing. In fact, with the increased divergence between UK and EU data protection that this Bill will bring, it is worrying that we may struggle to work more closely with the EU. Working to two different standards and trying to marry two frameworks that are far less aligned does not sound like less red tape, nor does it sound particularly pro-business.
However, there is an important point in respect of the stated aims of the Bill. There are serious concerns from businesses, organisations and civil society groups across a wide range of sectors about the weakening of data protection law under this new Bill. Clause 1(2) narrows the definition of personal data, meaning that only data that could allow a processor or another party to identify the individual by
“reasonable means at the time of processing”
would count as personal data and be protected by law. As many others have drawn attention to, the use of the phrase “reasonable means” is imprecise and troubling. This will need to be more clearly defined as a minimum, or the clause revoked altogether. “Reasonable means” would include the cost of identifying the individual, as well as the time, effort and other factors besides. This would allow organisations to assess whether they have the resources to identify an individual—an extremely subjective test, to say the least—and would put the power firmly in the hands of data processors when it comes to defining what is or is not personal data.
As an example, GeneWatch has highlighted that, under the new Bill, some genetic information will no longer be classed as “personal data” and safeguarded as such, allowing the police and security services to access huge amounts of the public’s genetic information without needing to go to court or to justify the requirement for this data. Crucially, data protection legislation should define what is or is not personal data by the type of data it is, not by how easy or feasible it may be for an organisation or third party to use that data to identify an individual at any given point. Personal data rights must continue to be protected in this country and in our law.
The new Bill also provides vastly expanded powers to the police and security services via Clause 19 and Clauses 28 to 30. As I read them, on the surface they do not look as though they provide proper accountability; perhaps the Minister can reassure me on that. Clause 19 would remove the requirement in the Data Protection Act 2018 for the police to justify why they have accessed an individual’s personal data. Clauses 28 to 30 allow the Home Secretary to authorise the police so that they do not need to comply with certain data protection laws via a national security certificate; this would give the police immunity even if they commit what would otherwise be a crime.
Taken together, these two measures give an extraordinary amount of unchecked power to the police and security services. With the amended approach to national security certificates, the police could not be challenged before the courts for how and why they had accessed data, so there would be no way to review what the Government are doing here or ensure that abuses of these powers do not take place. Can the Minister explain how such measures align with the democratic values on which this country and government are based?
The National AIDS Trust has been involved in cases where people living with HIV have had their HIV status shared, without their consent, by police officers, with a huge impact on the life of the individual in question. This is a serious breach of current data protection law. We must ensure that police officers are still required to justify why they have accessed specific personal data, as this evidence is vital in cases of police misconduct.
I am aware that there are many other concerns about this Bill. Noble Lords have touched on some of them, not least around online pornography, gambling and other matters that I hope other noble Lords will pick up on. In particular, there are doubts around the Bill’s compliance with the European Convention on Human Rights. We in this House must do our duty to properly scrutinise and, wherever necessary, amend this Bill to ensure that we have the proper legislation in place to protect and safeguard our data. I look forward to working with Ministers and Members of this House when we move into Committee on this Bill.
My Lords, it is a pleasure to follow the previous speakers, including my noble friend the Minister, the other Front-Benchers and the noble Baroness, Lady Kidron.
I start by thanking the House of Lords Library for its briefing—it was excellent, as usual—and the number of organisations that wrote to noble Lords so that we could understand and drill down into some of the difficulties and trade-offs we are going to have to look at. As with most legislation, we want to get the balance right between, for example, a wonderful environment for commerce and the right to privacy and security. I think that we in this House will be able to tease out some of those issues and, I hope, get a more appropriate balance.
I refer noble Lords to my interests as set out in the register. They include the fact that I am an unpaid adviser to the Startup Coalition and have worked with a number of think tanks that have written about tech and privacy issues in the past.
When I look at the Bill at this stage, I think that there are bits to be welcomed, bits that need to be clarified and bits that raise concern. I want to touch on a few of them before drilling down—I will not drill down into all of them, because I am sure that noble Lords have spoken or will speak on them, and we will have much opportunity for further debate.
I welcome Clause 129, which requires social media companies to retain information linked to a child suicide. However, I understand and share the concern of the noble Baroness, Lady Kidron, that this seems to be the breaking of a promise. The fact is that this was supposed to be about much more: data about harms to children and how we can protect our children. In some ways, we must remember the analogy about online spaces: when we were younger, before the online age, our parents were always concerned about us when we went beyond the garden gate; nowadays, we must look at the internet and the computers on our mobile devices as that garden gate. When children leave that virtual garden gate and go through into the online world, we must ask whether they are safe, in the same way that my parents worried about us when, as children, we went through our garden gate to go out and play with others.
Clauses 138 to 141, on a national underground asset register, are obviously very sensible; that proposal is probably long overdue. I have questions about the open electoral register, in particular the impact on the direct marketing industry. Once again, we want to get the balance right between commerce and ease of doing business, as my noble friend the Minister said, and the right to privacy.
I have concerns about Clauses 147 and 148 on abolishing the offices of the Biometrics Commissioner and the Surveillance Camera Commissioner. I understand that the responsibilities will be transferred but, in thinking about the legislation that we have been talking about in this place—such as the Online Safety Act—I wonder about the amount of power that we are giving to these regulators and whether they will have the bandwidth for it. Is there really a good reason for abolishing these two commissioners?
I share the concerns of the noble Lord, Lord Knight, about access to bank accounts. Surely people should have the right to know why their bank account has been accessed and have some protection so that not just anyone can access it. I know that it is not just anyone, but there are concerns about this, and the rules need to be clearer for people.
I have talked to the direct marketing industry. It sees the open electoral register as a valuable resource for businesses in understanding and targeting customers. However, it tells me that a recent court case between Experian and the ICO has introduced some confusion on the use of the register for business purposes. It is concerned that the Information Commissioner’s Office’s interpretation, requiring notification to every individual for every issue, presents challenges that could cost the industry millions and make the open electoral register unusable for it, perhaps pushing businesses to rely more on large tech companies. However, I understand that, at the same time, this may well be an issue where there are clear concerns about privacy.
Where there is no harm, I would like to understand the Government’s thinking on some of this—whether the interpretation goes too far or whether some clarification is needed in this area. Companies say they will be unable to target prospective customers; some of us may like that, but we should also remember that there is Clause 116 on unlawful direct marketing. The concern for many of us is that, while direct marketing is junk if we do not want it, sometimes we do respond to it. I wonder how we get that balance right; I hope we can tease some of that out. If the Government agree with the interpretation and the restrictions on the direct marketing industry, I wonder whether they can explain some of the reasons behind it. There may very well be good reasons.
I also want to look at transparency and data usage, not just for AI but more generally. It is obvious from the Government’s own AI White Paper that we want a pro-innovation approach to regulation, but we are also calling for transparency at a number of levels: of datasets and of algorithms. To be honest, even if we are given that transparency, do we have the ability to understand those algorithms and datasets? We still need that transparency. I am concerned about undermining that principle, and particularly about the weakening of subject access requests.
I am also interested in companies that, say, have used your data but have refused an application and then tell you that they do not have to tell you why they refused that application. Perhaps this is too much of a burden on companies, but I wonder whether we have a right to know which data was used when that decision was made. I will give a personal example: about a year ago, I applied for an account with a very clever online bank and was rejected. It told me I would have a decision within 48 hours; I did not. Two weeks later, I got a message on the app that said I had been rejected and that under the law it did not have to tell me why. I wrote to it and said, “Okay, you don’t have to tell me why, but could you delete all the data you have on me—what I put in?”. It said, “Oh, we don’t have to delete it until a certain time”. If we really own that data, I wonder whether there should be more of an expectation on companies to explain what data and information they used to make those decisions, which can be life changing for many people. We have heard all sorts of stories about access to bank accounts and concerns about digital exclusion.
We really have to think about how much access individuals can have to the data that is used to refuse them, but also to their data when they leave a service or stop being a user. I also want to make sure that there is accountability. I want to know what “reasonable and proportionate search” in Clause 12 means, particularly when data is processed by law enforcement and intelligence services. I think we need further clarification on some of this for our assurance.
We also have to recognise that, if we look at the online environment of the last 10, 15 or 20 years, at first we were very happy to give our data away to social media companies because we thought we were getting a free service, connecting with friends across the world et cetera. Only later did we realise that the companies were using this data and monetising it for commercial purposes. There is nothing wrong with that in itself, but we have to ask whose data it is. Is it my data? Does the company own it? For those companies that think they own it, why do they think that? We need some more accountability, to make sure that we understand which data we own and which we give away. Once again, the same thing might happen—you might stop being a user or customer of a service, or you might be rejected, but that accountability is not there.
As an academic, I recognise the need for greater access to data, particularly for online research. I welcome some of the mechanisms in the Online Safety Act that we debated. Does my noble friend the Minister believe that the Bill sufficiently addresses the requirements and incentives for large data holders to hold data for academic research with all the appropriate safeguards in place? I wonder whether the Minister has looked at some of the proposals to allow this to happen more, perhaps with the information commission acting as an intermediary for datasets et cetera. Once again, though, I am concerned about giving even more power to the information commission, and about whether it will have the bandwidth for all this on top of all the other powers we are giving it.
On cookie consent, I understand the annoyance of cookies. I remember the debates about cookie consent when I was in the European Parliament; at the time, we supported it because we thought it was important for users to be told what was being done with their information. It has become annoying, just like those text messages when we go roaming; I supported those during the roaming debates in the European Parliament because I did not want users to say they had not been warned about the cost of roaming. The problem is that such warnings become background noise; people ignore them and tick things on terms and conditions without having read them, because they are too long.
When it comes to some of the cookies, I like the idea of exemptions from prior consent—a certain opt-out where there is no real harm—but I wonder whether it could be extended, for example so that cookies that help companies understand the performance and effectiveness of their advertising are exempt from the consent requirements. I do not think this would fundamentally change the structure of the Bill, but I wonder whether we have the right balance here on harm, safety and the ability of companies to test the effectiveness of some of their direct marketing. Again, I am just interested in the Government’s thinking about the balance between privacy and commerce.
Like other noble Lords, I share concerns about the powers granted to the Secretary of State. I think they lack the necessary scrutiny and safeguards, and that there is a risk of undermining the operations of online content and service providers that rely on these technologies. We need to see some strengthening here and more assurances.
I have one or two other concerns. The Information Commissioner has powers to require people to attend interviews as part of an investigation; that seems rather Big Brother-ish to me, and I am not sure whether the Information Commissioner would want these abilities, but there might be good reasons. I just want to understand the Government’s thinking on this.
I know that on Report in the other place, both Dawn Butler MP and David Davis MP raised concerns about retaining the right to use non-digital verification systems. We all welcome verification systems, but the committee I sit on—the Communications and Digital Committee—recently wrote a report on digital exclusion. We are increasingly concerned about digital exclusion and people having a different level of service because they are digitally excluded. I wonder what additional assurances the Minister can give us on some of those issues. The Minister in the other place said:
“Individual choice is integral … digital verification services can be provided only at the request of the individual”.—[Official Report, Commons, 29/11/23; col. 913.]
I think that any further assurance on that would be really important.
The last point I turn to is EU adequacy. Let me be quite clear: I do not believe in divergence for the sake of divergence, but at the same time I do not believe in convergence or harmonisation for their own sake. We used to have these debates in the European Parliament all the time. Of those expressing concerns about EU data adequacy, we have to recognise two groups: those who really still wish we were members of the EU, and those for whom that is irrelevant and for whom this really is about the privacy and security of our users. If the EU is raising these issues in its agreements, we can thank it for doing that.
I was obviously involved in the debates on the safe harbour and the privacy shield. As noble Lords have said, we thought we had the right answer; the Commission thought we had the answer, but it was challenged in the courts. I think this will be challenged again. Are we diverging just for the sake of divergence, or is there a good reason to diverge here, particularly when concerns have already been raised about security and privacy?
I end by saying that I look forward to the maiden speech of the noble Lord, Lord de Clifford. I thank noble Lords for listening to me, and I look forward to working with noble Lords across the House on some of the issues I have raised.
My Lords, the Bill may contain some good elements in the search for a modernisation of data protection, but in overall terms it seems to tilt the balance of advantage to businesses and government authorities rather than to the individual. It has been marred in its passage by the profusion of late government amendments in the other place on Report, and an absence of scrutiny from the Joint Committee on Human Rights.
There are a number of issues that I think need to be seriously reconsidered. I will focus today on four. I also commend the passion of the noble Baroness, Lady Kidron, on the issues that she raised, some of which I will also touch on.
First, as my noble friend Lord Knight of Weymouth and the noble Lord, Lord Allan of Hallam, said—I do love the noble Lord’s name; alliterative Peers are a wonderful thing—a number of proposals appear to put at risk the free flow of data from the UK to the EU. That has already been touched on. It could even undermine the UK’s data adequacy decision. The EU Commission and the EU Parliament have begun to enunciate a view that the new powers of the Secretary of State to meddle with the objective and impartial functioning of the new Information Commission could result in the withdrawal of the UK adequacy decision; there seems to be a disconnect between that and the assurances that Ministers have given so far in the other place. Losing that decision, or even seeming to have that decision at risk, would be pretty disastrous for UK business, our trade and our research collaborations. Can the Minister tell the House how he intends to avoid this in the review due next year? How does he square the concerns of the EU with the assurances given by his ministerial colleagues?
My second point is about the new measures introduced at the last minute in the other place—Clause 128 and Schedule 11—requiring the banks to monitor continuously all accounts to find welfare recipients and snitch on them if they meet certain as yet unprescribed criteria. This is not just an abstruse issue; it involves a considerable number of people. Knowing the age of the average Peer, it probably involves pretty well everybody in this House because, of course, it includes pension recipients, so this is of personal concern to all of us. This is legitimising mass surveillance by algorithm. It seems to me a major intrusion into the privacy of pretty well all individuals in the UK and, to some extent, an infringement of the confidential relationship that you ought to be able to expect between a bank and its customer.
Can the Minister tell the House why he thinks this Big Brother mechanism is necessary? Why can the problem of benefit fraud not be dealt with in a way that does not mean that all customers are subject to surveillance? What alternatives were considered by Government and rejected? What safeguards will go alongside this provision to prevent it from being typified as a heavy-handed Big Brother approach?
It is strange that pension claimants are included. A pension, in my view, is a right, not a benefit; it was paid for by hard work during one’s working life. The Minister said in another place that they intend to extend this sort of surveillance process to other data areas. Can the Minister tell us what other areas and when that extension might take place?
The third issue is AI safety, an issue that has already been raised by a number of noble Lords. The Government were quite bushy-tailed about their recent AI Safety Summit and the commitment to see the UK as a world leader. I am afraid that every time I hear this phrase “a world leader” I have the urge to throw up in my handbag, so you will pardon me if I wrinkle my nose at that. The fact that we want to be somewhere in the front pack on AI safety and responsible and safe AI innovation is okay, but the Bill is a missed opportunity. I agree with my noble friend Lord Knight of Weymouth that the Bill should be the place where the oversight challenges posed by a very fast-moving set of AI developments, such as biometric technologies, are gripped.
I was a victim of a biometric technology development when I was chancellor of Cranfield University. It developed a process for detecting microscopic and invisible beads of sweat above your eyebrows if you were put under pressure, and it was to be used in cases of airport security and various other areas. They decided to put me under pressure by making me stand in the main square of the university and answer mental arithmetic questions over a loudspeaker. What they had not quite grasped is that I know I am rubbish at mental arithmetic, so it put me under no pressure whatever, because this was not going to be news to anybody. It therefore failed to detect microscopic sweat. I thought you might like the day to be raised by a humorous account in this pre-Christmas process.
The Bill is a real missed opportunity to grasp those AI developments and the safeguarding that needs to go with them. In fact, you could say that it erodes further the already inadequate legal safeguards that should protect individuals from discrimination or disadvantage by AI systems making automated decisions. We have heard about job hiring and loan applications; this is “The computer says no”, but on speed. We in your Lordships’ House deplore late additions to Bills, although we have rather grown used to them in recent months, but if the summit’s assurances are not going to seem a bit hollow, it would be good to hear whether the Minister intends to introduce additional measures on AI safety in the Bill and, if not, in what other legislation and to what timescale.
The fourth issue I want to raise is the role of the Information Commissioner’s Office, soon to be the Information Commission. I entirely approve of the structure of an information commission as opposed to a commissioner. We need a powerful and effective regulator. The ICO’s enforcement and prosecution record has not been sparkling, with low levels of enforcement notices, prosecutions and fines. If, when I was at the Environment Agency, I had had as low a level of those as the Information Commissioner has had, I would think I had gone to sleep somewhere along the line. Does the Minister acknowledge that improvements need to be made to the Bill to ensure that the new Information Commission has a clear statutory objective and is clearly independent and at arm’s length from government (not the sort of arm’s length that becomes very short in times of crisis), that its regulatory function at a judicial level can be effectively scrutinised, that it retains the biometrics and surveillance camera functions rather than simply wiping them from the script, and that it is able to consider class action complaints brought by civil society organisations or the trade unions?
In my experience, all too often, Governments plural, not just the current Government, establish watchdogs, then act surprised when they bark, and go and buy a muzzle. If the public are to have trust in our digital economy, we need a robust independent watchdog with teeth that government responds to. The Bill will need a lot of work, and there are hours and hours of happy fun in front of us. I look forward to the Minister’s response to my questions and to those of other noble Lords. I also look forward to the maiden speech of the noble Lord, Lord de Clifford.
It is two months since I took my oath in this esteemed Chamber, and every day since I have been grateful to your Lordships for the unique opportunity that has been granted to me. Since that first day, I have been asked on many occasions by friends and colleagues, “How is it going?” My reply: “It is like being back at senior school”. I feel very junior, but that is a nice thing, and I feel quite young too.
Being a new Peer, at times I look around and feel overwhelmed by the wealth of knowledge and depth of experience that your Lordships express in the Chamber and outside. I have been made to feel most welcome and supported, especially today in this debate with your kind words of support, but also by the doorkeepers, with their immense knowledge of the workings of the House and its history, who have kept me on the right side of its traditions and customs.
I would also like to mention the Convenor of the Cross Benches’ office staff, who have encouraged and guided me to this point, and the many other staff in the Palace who have made me feel so much a part of this grand establishment. Finally, if you will indulge me, thank you to my wife and family, who are here today to support me.
Whenever you start a new opportunity, you always question where you can contribute. For me, it was today’s debate on data protection. It would appear that I do not have in-depth knowledge of this extraordinarily complex subject—but on reflection I do, given my experience of small business over the past 30 years. I started with farming businesses, where I was part of the accountancy team, and then I ran the business side of a small firm of rural chartered surveyors. For the past 15 years I have managed a large independent veterinary practice which provides care and services to pets, horses and a large range of farming businesses. I know how important the data we hold and care for on behalf of our customers and clients is to them.
It is five years since the original GDPR legislation was introduced. At that time, it caused a significant amount of anxiety within the small business and veterinary world. This was reflected in the number of individuals and businesses attending seminars on the GDPR, put on by the Veterinary Practice Management Association, an organisation of which I am proud to be the current president. It promotes management and leadership, which are also a passion of mine, in the veterinary sector. The revision of this legislation in the Bill is extremely well timed and needed. SMEs are comfortable with the processes they have in place today to comply with the current legislation but, in the fast-moving and changing IT world, the simplification and clarification of the rules on using data on a legitimate basis, which this Bill intends, are welcome.
Nearly all small businesses, from sole traders to large owner-managed companies, are data controllers. All collect personal data of some form in sales databases, client and patient relationship software and accountancy packages. The ability of the business to keep control of this data is becoming harder, as it has never been easier to export substantial amounts of data from these systems for many different purposes. Therefore, there is an increased risk that personal data can be lost or stolen due to the ever-increasing threat of cyberattack. It is essential that this updated legislation takes into account where all data is stored and its many different formats and ensures that it is not unknowingly shared with other users.
As my research for this debate has shown me, this Bill is immensely complex, which I know is required—but I fear that its complexity will mean that it will not be fully complied with by a number of small to medium-sized businesses that do not have the resources or time to research and instigate any changes that may be required. Therefore, investment will be needed from government to publicise the changes in a simple and understandable way to SMEs. If the Minister could say how he intends to communicate these changes to the sector, that would be welcome.
With regard to the section on smart data, the changes made by the banking sector have brought immense efficiencies and security for small businesses. Extending smart data further would bring more efficiencies for the business community. A cautious approach is needed when extending its use, to ensure that businesses sharing and receiving personal data are compliant with these complex regulations and that open application programming interfaces cannot be infiltrated or hacked.
Individual personal data has without doubt grown significantly in value over the past five years since the introduction of the original data protection legislation. The desire to exchange data between businesses, scientific institutions and government will only improve efficiency, productivity and scientific breakthroughs, which is one of the goals of this legislation. Protecting data and recognising its value are essential as we review the Bill. As it currently stands, the Bill could favour large IT corporations, whose ability to collect, process and monetise data is well known, so we must ensure that the new, up-to-date regulations do not require large amounts of resources to implement, and that there is a level playing field on which all businesses can benefit from the power of data analysis. I agree with the noble Lord, Lord Allan of Hallam, on the need to access EU data so that small businesses can continue to trade without too much hassle and burden. I look forward to learning more of the ways of the House as I continue to contribute to this Bill as it moves to Committee stage.
My Lords, it is a great pleasure to follow my noble friend Lord de Clifford and to congratulate him on an excellent and insightful maiden speech. I am pleased that he has chosen this important Bill for this occasion. Data protection is something of a minority sport and it is great to add another person to the select group in this Chamber.
Data protection is about finding the right balance between protecting individuals’ privacy and the bureaucracy and costs that go with it, for small businesses and others. My noble friend’s long experience in managing small and medium-sized businesses gives him great insight into how these regulations will impact the businesses that typically find it most difficult to deal with greater bureaucracy, as he so rightly pointed out. SMEs are often overlooked more generally, so having such an experienced voice to remind us of their importance during our deliberations will be a great asset to the House, and from a personal point of view it is a great pleasure to welcome a fellow finance professional to join us.
The noble Lord’s experience in the veterinary sector should also be of enormous value to the House. I hope that my noble friend Lord Trees will not mind having his monopolistic position in the field broken. It seems that the noble Lord has also been hiding another light under a bushel: I believe that he has also competed for Great Britain in equestrianism, so he is clearly a man of many talents. I tried to find a joke to do with horsing around, but I am afraid that inspiration completely deserted me. I—and, I am sure, all noble Lords—look forward to his future contributions, both on this Bill and more widely.
I turn now to the specifics of the Bill. As I mentioned, data protection is about finding the right balance between individual privacy and the costs, processes and rules that must be in place, alongside the ability to carry out essential criminal investigations and protect national security. I think it is generally agreed that the GDPR has its flaws, so an effort to look again at that balance is welcome. There is much in the Bill to like. However, there are a number of areas where the Bill may move the balance too far away from individual privacy, as a number of other noble Lords have already mentioned. In fact, there is not much that I have disagreed with in the speeches so far.
It is a long and very complex Bill; the fact that the excellent Library briefing alone runs to 70 pages says a lot. It will not be possible to raise all issues; noble Lords are probably grateful for that. I am going to concentrate on four areas where I can see significant risks, but the Minister should not take that as meaning that I disagree with other things that have been said so far; I agree with almost everything that has been raised.
First, a general concern raised a number of times, in particular by the noble Lord, Lord Allan, is that the Bill moves us significantly away from our existing data protection rules, which were based clearly on the EU regulations. We are currently benefiting from an EU data adequacy ruling which allows data to be transferred freely between the EU and the UK. This was a major concern at the time of the Brexit discussions. At that time, data adequacy was not a given. This ruling comes to an end in July 2025, but it can be ended sooner if the EU considers that our data protection rules have diverged too far.
The impact assessment for the Bill—another inch-thick document—says:
“Cross-border data transfers are a key facilitator of international trade, particularly for digitised services. Transfers underpin business transactions and financial flows. They also help streamline supply chain management and allow business to scale and trade globally”.
It is good that the impact assessment recognises that. The loss of data adequacy would therefore have significant negative impacts on trade and on the costs of doing business. Without it, alternative and more costly methods of transferring data would be required, such as standard contractual clauses. There are also implications for investment, as the noble Lord, Lord Allan, pointed out. Large international financial services organisations would be much less likely to establish data processing activities in the UK if we were to lose data adequacy. Indeed, they may decide that it is worth moving their facilities away from here.
The impact assessment suggests surprisingly low costs might arise: one-off costs of £190 million to £460 million, and annual lost trade of £210 million to £420 million. However, these figures cover only the direct reduction in trade with the EU; as the impact assessment points out, the losses will likely be larger when taking into account interactions with onward supply chains.
The impact assessment does not judge the probability of losing the data adequacy status. I find that rather extraordinary, possibly even shocking, as it is so important. The New Economics Foundation and UCL conservatively estimate the cost of losing data adequacy at £1 billion to £1.6 billion; however you look at it, these are very large numbers.
What can the Minister tell us that could set our minds at rest? What discussions have taken place with the EU? What initial indications have been received? What changes have been made to the original draft Bill to take account of concerns raised by the EU around data adequacy? What is the Government’s assessment of this risk? The Bill has been on the blocks for a long time now. I have to assume that a responsible Government must have had discussions with the EU around data adequacy in relation to these proposals.
Secondly, as we have heard, Clause 129 would enable Ofcom to require social media companies to retain information in connection with an investigation by a coroner into the death of a child, where the child was suspected to have died by suicide. This is a welcome addition but, as we have heard, it does not go far enough. It does not include all situations where a death was potentially related to online activity; for example, online grooming. My noble friend Lady Kidron has, as always, covered this with much greater eloquence than I could. I suspect the Minister already knows that the Government have got this wrong. As the noble Lord, Lord Knight, pointed out, it would be a brave Minister who tried to hold the current line in the face of opposition from my noble friend. I welcome the words that the Minister said at the beginning of this debate—that he is willing to engage on this matter. I hope that engagement will be constructive.
Thirdly, the Bill introduces draconian rules that would enable the DWP to access welfare recipients’ personal data by requiring banks and building societies to conduct mass monitoring without any reasonable grounds for suspecting fraudulent activity. As the noble Baroness, Lady Young, pointed out, this includes anyone receiving any kind of benefit, including low-risk benefits such as state pensions, so, as she has pointed out, most noble Lords will be subject to this potential intrusion into their privacy—although, fortunately, not me yet. The Government argue that this power is required to reduce levels of benefit fraud. My enthusiasm to tackle fraud is well known, but the Government already have powers to require information where they have grounds to suspect fraudulent behaviour. This new power, effectively enabling them to trawl any bank account with no grounds at all, is a step too far, and constitutes a worrying level of creep towards a surveillance society.
That brings me neatly on to my fourth concern, which the noble Lord, Lord Kamall, raised earlier. The Bill will abolish the post of Biometrics and Surveillance Camera Commissioner—currently one person holds both roles—as well as the surveillance camera code. It was interesting that the Minister did not mention this in his opening speech. It is extremely important.
The Government argue that these functions are covered elsewhere or would be moved elsewhere—for example, to the ICO—but that does not seem to be the case. An independent report by the Centre for Research into Information, Surveillance and Privacy, commissioned by the outgoing commissioner, sets out a whole range of areas in which there will be serious gaps in the oversight of handling biometric data and, in particular, the use of surveillance cameras, including facial recognition.
The independent report concludes that none of the Government’s arguments that the functions are adequately covered elsewhere “bear robust scrutiny”. It notes in particular that the claim that the Information Commissioner’s Office will unproblematically take on many BSCC functions treats surveillance as a purely data protection matter and thereby limits
“recognition of potential surveillance-related harms”.
Given the ever-widening use of surveillance in this country, including live and retrospective facial recognition, and the myriad other methods of non-facial recognition being developed, such as gait recognition or, as I was reading about this morning, laser-based cardiac recognition—it can read your heartbeat through your clothing—alongside the ability to process and retain ever greater amounts of data and the emerging technology of AI, having clear rules on and oversight of biometrics and surveillance is more important than ever. We can see where the misuse of surveillance leads—just look at China. Imagine, for example, if this technology, unfettered, had been available when homosexuality was illegal. Why do the Government want to remove the existing safeguards? With the advances in technology, surely these are more important than ever. We should be strengthening safeguards, not removing them.
The outgoing commissioner—if the Government get their way, the last surveillance camera commissioner—Professor Sampson, put it best:
“There is no question that AI-driven biometric surveillance can be intrusive, and that the line between what is private and public surveillance is becoming increasingly blurred. The technology is among us already and the speed of change is dizzying with powerful capabilities evolving and combining in novel and challenging ways … The planned loss of the surveillance camera code is a good example of what will be lost if nothing is done. It is the only legal instrument we have in this country that specifically governs public space surveillance. It is widely respected by the police, local authorities and the surveillance industry in general. It’s one of those things that would have to be invented if it didn’t already exist, so it seems absolutely senseless to destroy it now, junking the years of hard work it took to get it established”.
These are just four of the areas of concern in the Bill. There are many more, as we have heard. In the other place, following the failure of the recommittal Motion after all the new amendments were dropped in at the last minute, David Davis MP said that the Commons had
“in effect delegated large parts of the work on this important Bill to the House of Lords”.—[Official Report, Commons, 29/11/23; col. 888.]
That is our job, and I believe that we do it well. I hope the Minister will engage constructively with the very genuine concerns that have been raised. We must get this Bill right. If we do not, we risk substantial damage to the economy, businesses, individuals’ privacy rights—especially children—and even, as far as the surveillance elements go, to our status as a free and open democratic society.
My Lords, I have now reached the grand old age of 71, and it is a worrying fact that I think this puts me bang on the average age of those in your Lordships’ House. So it is a huge relief to be able to welcome to this House the two young Peers who have preceded me. I echo what the noble Lord, Lord Vaux, said, and I find myself in agreement with him, in that I have agreed with most of what has been said in this debate so far. I also echo his welcome to the noble Lord, Lord de Clifford, who brings real front-line experience of the effects of what we do in this House on small and medium-sized enterprises. He is someone that I know noble Lords will want to hear from in the years to come—and, in view of his age, we can look forward to very many of them.
I declare my interest as chairman of the advisory panel of Thales, a digital company, and a member of the Post Office Horizon Compensation Advisory Board. I have learned in relation to the Post Office scandal that the complexity of computers is such that nobody really fully understands exactly what programs will do, so it is absurd that there is still in law a presumption that computers will operate as they are intended to. I hope that noble Lords will be able to turn their minds to changing that in the relatively near future.
I can be brief, because I was intending to raise issues relating to privacy, cookies and information which have already been so well canvassed by my noble friend Lord Kamall. Currently, we have to consent to cookies and terms and conditions, but we do not read them, we do not understand them, we do not know their effect—we do not have time. We will do anything for convenience, so the consent that we give is neither informed nor freely given. My noble friend Lord Kamall said what I wanted to say about an open electoral register. The thought of sending paper letters to everyone to inform them about the use of their data seems disproportionate and I, too, would like to know what on earth the ICO is thinking of in demanding such notification to everybody in the Experian case. I also adopt his questions about exemptions from getting consent to cookies when they are purely functional and non-intrusive. But there is no need for me to say it again, so I will not.
My Lords, on behalf of these Benches, I too welcome the noble Lord, Lord de Clifford. I pay tribute to his maiden speech and thank him for his insightful and valuable contribution to this debate. I also look forward to many future occasions on which he will contribute to the work of this House.
As the right reverend Prelate the Bishop of St Albans has said, we on these Benches recognise that high-quality data is crucial to creating and sustaining a healthy and efficient society. However, it is vital to get the balance right between ownership, access, control and legitimate use of that data. Human flourishing should be at the forefront of regulating how data is used and reused. As we said in our written response to the Government’s 2020 data consultation:
“Fundamentally, the church welcomes any technology that augments human dignity and worth, while staunchly resisting any application of data that undermines that dignity. Questions of efficiency and cost-effectiveness are subsidiary to questions about how the types and uses of data will promote human flourishing in society and best practice in public bodies”.
It seems that the real test of this legislation is how it will truly promote good democracy and the extent to which it will protect the safety and enhance the security of the most vulnerable in our society. I hope the House will permit me a brief seasonal reference in pointing out that it was, in fact, a comprehensive data collection exercise by Quirinius, motivated entirely by greed and an abuse of power, that first resulted in the Holy Family travelling to Bethlehem. It also meant that they would need to flee very quickly indeed when the Christ child’s identity and location came to the attention of an insecure leader with unregulated power who also had exclusive access to the data, albeit in a very ancient form.
We acknowledge that current provision for data regulation is also outdated and in urgent need of reform. We support the Government’s intention to reform the Information Commissioner’s Office while preserving its independent footing, and the introduction of an information commission. But it is interesting to compare the Bill before us today with the concerns we expressed in 2020. First, our goal then was
“to flag some of the more significant risks we foresee in using data without adequate reflection on the pitfalls and harms that hasty and ill-considered data use gives rise to”.
It is sobering, therefore, that the Bill arrives in this House substantially amended in ways the other place has had insufficient time to scrutinise. The Online Safety Act perhaps offers a valuable and recent template for how this House might examine and improve this important Bill.
Secondly, we said we acknowledged the benefits of data but also the importance of gaining and retaining public trust. Therefore, it is worrying that, with some of the measures in the Bill, the Government seem to be reducing the levers and mechanisms that public trust depends upon. The Public Law Project’s assessment is that:
“While the Bill does not outright remove any of the current protections in data protection law, it weakens many of them to the extent that they will struggle to achieve their original purposes”.
We share the concerns of many civil society groups that the Bill will reduce transparency by weakening the scope of subject access requests, although I welcome the concern to mitigate plainly vexatious complaints. In June, the chief executive of the Data Protection Officer Centre said:
“Whilst countries across the globe are implementing ever-more robust data protection legislation, the UK seems intent on going in the opposite direction and lowering standards”.
What reassurance can the Minister give the House that the Bill will retain public trust and will not diverge even from current adequacy agreements?
Thirdly, we emphasised the Nolan principles as an aid to the public use of data. On 6 December 2023, the Public Accounts Committee in the other place published a report that noted that the DWP is piloting the use of machine-learning algorithms to identify potentially fraudulent claims. We are all in favour of proportionate and effective measures to counter fraud, but Big Brother Watch argues that it is
“wholly inappropriate for the UK Government to order private banks, building societies and other financial services to conduct mass, algorithmic, suspicionless surveillance and reporting of their account holders on behalf of the state”.
Will the Minister explain how the state demanding data without cause—including, as a number of Members pointed out, data on the bank accounts of recipients of the state pension, which the Government themselves say they have no intention of using—complies with the Nolan principles of openness and accountability? Is this not at risk of being an overreach of government into people’s private lives?
His Majesty’s Government made commitments at the recent AI Safety Summit to make the UK a world leader in safe and responsible AI innovation, so would we not expect that the Data Protection and Digital Information Bill would provide oversight of biometric technologies and general purpose artificial intelligence? My colleague the right reverend Prelate the Bishop of Oxford regrets that he is unable to participate in the debate today, but he will again lead for us as we scrutinise the Bill more thoroughly, including its gaps in protecting children’s data and in the regulation of data use by AI foundation or frontier models.
Regarding the latter, an important gap in the interlocking of regulation persists. As the BBC reported over the weekend, assurances given in this House during the passage of the Online Safety Act are being threatened. The draft amendment grants access to data only where children have taken their own lives. This is not what the Government promised on the record in either the Commons or the Lords, and we will continue to press for a proper resolution. Surely we cannot simply rely on other holders of important data to disclose the information that is needed to protect children’s well-being.
I will comment briefly on death registration. The ability to move from a paper to an electronic register is to be commended. However, the UK Commission on Bereavement, chaired by my colleague the right reverend Prelate the Bishop of London, has identified more that could be done to reduce the administrative burden on bereaved people. The Tell Us Once system is designed so that someone reporting a death need do so only once, and the information is then shared with the relevant public services. Currently, bereaved people must still notify private companies of a death separately. Can the Government please review the system to see whether this burden could be lessened? I would be grateful if the Minister could clarify how the extended priority service register announced in the Autumn Statement will work alongside Tell Us Once. In addition, do the Government have any plans to undertake an updated equality impact assessment of Tell Us Once, given that the last one was 12 years ago?
We look forward to working with everyone in this House to carefully understand and, where appropriate, strengthen an important Bill for the future flourishing of the country and the well-being of all.
My Lords, I very much welcome the maiden speech of the noble Lord, Lord de Clifford. As one who entered this House in his early 50s, I can recommend that coming in here, just as the mid-life crisis starts to bite, and being, as I was then, Young Tom again, is a great boost to the morale.
I associate myself with the advice given by the right reverend Prelate the Bishop of Southwell and Nottingham. At the end of the recent passage of the Online Safety Bill, there was general thanks to the noble Lord, Lord Parkinson of Whitley Bay, the Minister guiding the Bill safely through the Lords, for his willingness to listen to argument and to amend where necessary. I fear that the noble Viscount will hit some choppy water in this House unless he adopts a similar attitude, and he should certainly take the noble Baroness, Lady Kidron, very seriously concerning children’s data rights.
The Government’s declared intention of reducing burdens on organisations while maintaining high data protection standards has met with scepticism and outright criticism from a wide range of industry bodies, civil society organisations and individuals with expertise in this area. As has been said, the Official Opposition in the other place asked that the Bill be recommitted to a Public Bill Committee for further scrutiny, but this was refused. As the noble Baroness, Lady Young, indicated, this has put further onus on this House to make sure there is time to listen to and examine the wide range of criticisms and amendments seeking to improve the Bill.
In 2010, I became Minister of State at the Ministry of Justice. Among my responsibilities were the ICO and the early negotiations on what became the GDPR. One of my first roles was to go to a facility south of the river to look at our skills in this area. After looking at a number of things, I asked the government official who was showing me the facility whether there were any human rights or privacy issues involved. He said, “Oh no, sir. Tesco knows more about you than we do”. There is a certain profligacy among individuals with their data, along with real concern about their privacy. It is riding those two horses at once that is going to be the challenge of this Bill. I approach the Bill with an eye to ensuring, like the noble Baroness, Lady Young, that the ICO is well served by this legislation and continues to set standards and protect individuals.
Prior to Brexit, I was on one of your Lordships’ sub-committees, where we constantly pressed the Ministers about data adequacy with the EU on our departure. The answers then were very much along the lines of, “Well, it’ll be alright on the night”. I hope that the Minister will again reassure us in his wind-up that the data protection legislation in the Bill clarifies the law without deviating from the principles set out in the GDPR. The UK’s data adequacy status, granted by the European Commission, is important, and we do not want to see it jeopardised in pursuit of some mythical benefits from Brexit.
I am sorry that my noble friend Lord Allan will not be joining us for the rest of this Bill; I would have valued his contribution. But I will keep an eye on it, as a number of other colleagues have indicated.
More widely, one of the problems with this Bill is that its scale, and how it has been dealt with by the Government in its preparation, its false starts and its passage in the other place, mean that we are going to legislate for myriad issues, each of which is of importance to the sector, the individual concerned or society, and will require our full due care and attention. For example, new powers in Clauses 87 and 88, which allow the Secretary of State to offer an exemption from the direct marketing provisions where data is used for the purpose of democratic engagement, may invite abuse. I put that mildly. This morning’s FT contains an article raising precisely these fears, and this issue must be examined in detail during the passage of the Bill.
One issue that I was going to deal with in detail was referred to by the noble Lord, Lord Kamall. The Minister might, even at this early stage in the Bill’s progress, provide clarification about the use of the open electoral register for direct marketing purposes. This issue has also been raised with me by the Data & Marketing Association. As the noble Lord, Lord Kamall, explained, there are big concerns in the market about what companies can do with personal data from the open electoral register and this needs to be resolved.
Unfortunately, considerable market uncertainty has been caused by the enforcement notice by the ICO, which has already been referred to. In the light of all this legal and market uncertainty, and given that this Bill is before the House, the best and most timely option is to address the issue in the Bill and I urge the Government to consider what can be done on this. Perhaps the noble Lord, Lord Kamall, and other noble Lords could discuss a joint amendment.
That is just one example of the issues in the Bill that will require detailed examination and close attention. Much of it will be practical and will involve building a framework of law and regulation that keeps pace with the new technologies that are now part of the digital and data revolution. In this, the impact of AI will cast a long shadow over our deliberations, as the noble Lord, Lord Knight, the noble Baronesses, Lady Kidron and Lady Young, and others have made clear.
The right reverend Prelate the Bishop of St Albans referred to the benefits of the wide-ranging briefings that we received prior to today’s debate. Let me assure the authors that none of them will go to waste as we move into Committee. As well as dealing with the mundane and the practical, we have to take seriously the advice contained in one briefing, which read:
“At a time of advancing AI-driven surveillance, and when public concerns over measures such as facial recognition technology are heightened, removing oversight and accountability could have serious implications for public trust in policing”.
This warning could apply to almost any sector, service or industry covered by the Bill. Two quotes leap out at me from the excellent Lords Library briefing on the Bill, which has been referred to. One comes from the Information Commissioner, who calls for a regulator that is “trusted, fair and independent”, and the other comes from techUK, which calls for a Bill that will
“help spur competition and innovation in the market, whilst empowering consumers and delivering better outcomes”.
Riding those two horses at once is now the task before us.
My Lords, I join others in welcoming the noble Lord, Lord de Clifford, to this House. I look forward to hearing him in future debates.
This is a large Bill, written in utterly arcane language that normal people will struggle to understand and follow. I hope that the Government will try to write Bills in a better way; otherwise, it is hard for people to understand the laws and to follow them. I have grave misgivings about some parts of this Bill and I will touch on a couple of these issues, which have already been identified by a number of noble Lords.
George Orwell’s iconic novel Nineteen Eighty-Four, published in 1949, raised the spectre of Big Brother. That nightmare has now been brought to reality by a Conservative Government supposedly rolling back the state. The Government have already undermined the people’s right to protest and to withdraw labour. Now comes snooping and 24/7 surveillance of the bank, building society and other accounts of the sick, disabled, poor, elderly and unfortunate, all without a court order. Over 22.4 million people would be targeted by that surveillance, but the account holders will not be told anything about the frequency and depth of this organised snooping.
In true Orwellian doublespeak, the Government claim that the Bill will
“allow the country to realise new post-Brexit freedoms”.
They link the surveillance to, and are stirring up, people’s fears about benefit fraud, while there is absolutely no surveillance of those receiving public subsidies, those mis-selling financial products, those accused of PPE fraud or even a former Chancellor who abused the tax system. Numerous court judgments have condemned the big accounting firms for selling illegal tax-dodge schemes and robbing the public purse, but despite those judgments no major accounting firm has, under this Government, ever been investigated, fined or prosecuted. None of the accounts of those partners or firms is under surveillance. The Bill is part of a class war: it targets only low-income and middle-income people, while big beasts get government contracts.
Currently, the Department for Work and Pensions can request details of bank accounts and transactions on a case-by-case basis on suspicion of fraudulent activity, but Clause 128 and Schedule 11 give the Government unrestrained powers to snoop. The Government say that the Bill
“would allow regular checks to be carried out on the bank accounts held by benefit claimants to spot increases in their savings which push them over the benefit eligibility threshold, or when people spend more time overseas than the benefit rules allow for. This will help identify fraud”
and
“take action more quickly”.
How prevalent is the benefit fraud that the Government wish to tackle? The Government estimate that, in 2023, they lost £8.3 billion to welfare fraud and errors, 80% of which is attributed to fraud. A government statement issued on 23 November said that, as a result of mass surveillance, the crackdown on benefit fraud would save the public purse
“£600 million over the next five years”.
On 29 November, in a debate in the other place, the Minister mentioned the figure of £500 million and, despite a number of challenges, did not correct that estimate. The Government are hoping that mass snooping will generate savings of £100 million to £120 million a year, but we do not have a breakdown of this saving and do not know how they have arrived at that number. I hope that the number is more reliable than the Government’s estimates of the HS2 costs. To put this into context, the Government are spending nearly £1,200 billion this year, and they are introducing snooping to save about £100 million a year: less than 0.01% of that spending.
The snooping on bank accounts suggests that the Government are looking for unusual cash-flow patterns. What that means is that, if anyone gives a lump sum to a loved one for Christmas, a birthday, a holiday or home repairs, and it passes through their bank account, the Government could seize on that as evidence of excess resources and reduce or stop their benefits. Suppose that a poor person pawns some household items for a few pounds and temporarily boosts his or her bank balance. Would that person now be labelled a fraudster and lose benefits? The Government have not looked at the details of what would happen.
Many retirees have a joint bank account with another member of the family or with a friend. Under the Government’s crazy plans, the third party would also be put under surveillance because they happen to have a joint account. Can the Minister explain why people not receiving any social security benefits are to be snooped upon, because they would be caught in this trap?
How will the snoopers distinguish temporary and easily explainable boosts in bank balances from others? My background is that I am an accountant and I have investigated things over the years; I helped the Work and Pensions Committee investigate the collapses of BHS and Carillion. So I hope that the Minister can enlighten me on how all this will be done.
I hope that the Minister can also clarify the scope of the Bill as it applies to recipients of the state pension. The Government have classified it as a benefit; can the Minister explain why? After all, the amount one receives is determined by the number of years of national insurance contributions one has made, so why is it treated as a benefit? The Minister in the other place said:
“I agree, to the extent that levels of fraud in state pensions being currently nearly zero, the power is not needed in that case. However, the Government wish to retain an option should the position change in the future”.—[Official Report, Commons, 29/11/23; col. 912.]
Why do the Government want to snoop on the bank accounts of OAPs when there is hardly any fraud? Do they have some sinister plan to treat the state pension as a means-tested benefit? Perhaps the Minister could confirm or deny that. If he wishes to deny it, can he explain why the Government are targeting retirees? What have they done?
In this House, we have more than our fair share of senior citizens who receive a state pension, and their bank accounts would also be under surveillance. How long before a Government abuse that information to blackmail Members of this House and erode possibilities of scrutinising the Government of the day? It is opening us all up to blackmail, now or in the future.
In the past, the Government assured us that health data would not be sold—but then sold it to corporations, as we heard earlier. How can we trust the Government not to do the same with data collected via snooping on bank accounts? What will they be selling?
The mass surveillance is not subject to any court order. Concerned citizens will not be told, as their right to know will be further eroded by Clause 9. It is for the courts, not Ministers, to decide whether requests for data are vexatious or excessive. Can the Minister provide us with some data on how many requests for information are received by departments each year and what proportion have been declared to be vexatious and excessive by the courts? The Government cannot just say that they are vexatious—I would rather trust the courts.
Clause 9 obstructs government accountability and further erodes the Nolan principles. As a personal example, I fought a five-and-a-half-year battle against the Treasury to learn about the closure of the Bank of Credit and Commerce International in 1991. It was the biggest banking fraud of the 20th century, which has yet to be investigated. I asked the Treasury for some information and was totally fobbed off. I went to the Information Commissioner, who sided with the Treasury. So I went to the courts to get some information, with the possibility that the judges might declare my attempts to learn the truth vexatious and might even impose legal costs on me. Fortunately, that did not happen—I won the case and the Treasury had to release some documents to me.
The information showed that the Conservative Government were covering up money laundering, frauds, the secret funding of al-Qaeda, Saudi intelligence, arms smugglers, murderers and others. The information given to me has never been put on public record by this Government. Can you imagine what will happen now if requests to learn something about banking fraud are simply labelled vexatious and excessive? How will we hold the Government to account? The Bill makes it harder to shine some light on the secret state, and I urge the Government to rethink Clause 9.
Finally, I urge the Minister to answer the questions I have raised, so that we can have a better Bill.
My Lords, it is a pleasure to follow the noble Lord, Lord Sikka. I very much share his concerns about the Government prying into the bank accounts of benefit recipients and pensioners. This is a historic moment, for all the wrong reasons, with the Government looking to pry into the private lives of millions of people, with no evidence that it is in any way necessary. The biggest problem with benefits, of course, is the large amount of money that is left unclaimed, or unpaid due to errors made by the Department for Work and Pensions.
I will also pick up the noble Lord’s point about economic crime. I note that this happens to be the week that, in a Frankfurt court, the former global head of tax at Freshfields Bruckhaus Deringer acknowledged in his testimony that he had
“glossed over the fact that my legal advice was used for illegal means”.
This was a man who, until 2019, was earning €1.9 million a year.
I have a direct question for the Minister. The Government have talked a great deal about the DWP and their plans in that area. What does the Bill do to tackle economic crime, given that the head of UK Finance described the UK as
“the fraud capital of the world”
and that we have an enormous problem with enablers, down the road in the City of London, who we know are getting around sanctions from the UK Government and others, swishing so much dirty money through London that it is now known as the “London Laundromat”? What does the Bill do on these issues?
I will tick off some points of agreement and concern from previous speeches. The Minister spoke of
“the highest standards of data protection”.
From what I recollect of the Minister’s speech, there was a surprising lack of the Government’s favourite word, “world-leading”. What does it mean if these data protections are not world-leading?
The Minister also said the Bill was “co-designed all the way”. A number of noble Lords pointed to the 260 amendments on Report in the other place. That really does not look like a co-design process. The benefit of working across many Bills is that this Bill reminds me—and not in a good way—of the Procurement Bill, where your Lordships’ House saw a similar deluge of government amendments and had to try to disentangle the mess. I fear that we are in the same position with this Bill.
I pick up the speech of the noble Baroness, Lady Kidron—spectacularly excellent, as always—and her points about edtech and the situation with technology and education systems, and the utter impossibility of teachers, nursery nurses or people in similar positions dredging through the fine detail of every app they might want to use to ensure that their charges are protected. That is obviously not a viable situation. There have to be strong, protective general standards, particularly for apps aimed at children. The Government have to be able to guarantee that those nursery nurses and teachers can just pick up something—“It’s approved, it’s okay”—and use it.
I will also pick up the points that the noble Baroness, Lady Kidron, made about the importance of data being available to be used for the public good. She referred to research, but I would like—and I invite NGOs that are interested—to think about community uses. I was recently with the National Association of Local Councils, of which I declare that I am a vice-president, in Shropshire, where we saw parish and town councils doing amazing work to institute climate action. I am talking about small villages where data protection is not really an issue, as everyone knows everything about everybody. But we might think of a suburb of Liverpool or a market town, where people do not have the same personal knowledge of each other but where a council or community group could access data for good reasons. How can we make it possible to use these tools for positive purposes?
Briefly picking up on the points made by the noble Lord, Lord Allan—another of our experts—I echo his stress on the importance of EU equivalency. We have dumped our small businesses, in particular, in the economic mire again and again through the whole process of Brexit. There is a reason why #brexitreality trends regularly. We have also dumped many of our citizens and residents in that situation. We really must not do it again in the technology field.
I have a couple of what I believe to be original points. I want to address specifically Clauses 28 and 30, and I acknowledge here a briefing from Rights and Security International. It notes that these clauses enable the Government to grant an opt-out to police forces from having to comply with many of the data protection requirements when they are working with the intelligence services. For example, they could grant the police immunity for handling personal data unlawfully and reduce people’s right of access to their personal data held by the authorities.
In the Commons, the Minister said these provisions would be “helpful” and “efficient”. I put it to your Lordships’ House that, to have any justification for interfering with rights such as these, the Government should at the very least be able to claim that the measures are “proportionate” and “necessary”. That is an area that I suspect my noble friend Lady Jones of Moulsecoomb will pick up in Committee. There are also issues, raised by the Ada Lovelace Institute and by other noble Lords, about the oversight of biometric technologies, including live facial recognition systems, emotion detection and the foundation models that underlie apps such as ChatGPT. These already limited legal safeguards are being further undermined by the Bill, at a point when there is general acknowledgement in the community that we should be heading in the opposite direction. I think we all acknowledge that this is a fast-moving area, but the Government are already very clearly behind.
There are two more areas that I particularly want to pick up. One is elections. Focus on this has only just begun. The Bill would allow the Government to tear up long-standing campaign rules with new exemptions. At present we have safeguards against direct marketing. These are being removed and,
“for the purposes of democratic engagement”,
anyone from 14 years and above can be targeted. I feel like warning the Government: my experience with young people is that the more they see of the Government, the less they like them, so they might want to think about what messages they send them. Seriously, I note that the Information Commissioner’s Office said during the public consultation on the Bill—and we can really hear the bureaucratic speak here—
“This is an area in which there are significant potential risks to people if any future policy is not implemented very carefully”.
The discussion of the Bill has reflected how this could put us in a situation where our elections are even more like those in the United States of America, which, given the place of big money in its politics, is of course no recommendation at all. I note that we really need to link this with the Government’s recent decision to massively increase election spending limits. Put those two things together and I suggest that is a real threat to what limited democracy we have left in this country.
There is a further area which I am not going to go into in great detail, given the hour and the day, but which I will probably come back to in Committee. There is an extensive briefing, which I am sure many have seen, from Understanding Patient Data. It really matters that the Bill comes up with a different definition of identifiable data. In the health sector, it is very common to use pseudonymous information from which key bits are removed, but it is still quite possible to go backwards and identify an individual from their data because, say, they have an extremely rare disease and they live in a particular area of the country.
This new Bill has, instead, more of a subjective test; the definition seems to rely on the judgment of the data controller and what they know. If the Minister has not looked at the briefing from Understanding Patient Data, I really urge him to do so, because there are concerns here, and we already have very grave concerns in our community about the use of medical data, the possible loss of anonymity and the reuse of data for commercial research. We are, again, coming to an Americanisation of our health system.
I conclude by saying that we have an enormous amount of work to do here in your Lordships’ House; I am trying not to let my head sink quietly on to the Bench in front of me, but we are going to have a break first, of course. I say to all noble Lords and—echoing the comments earlier—the many members of staff who support us by working so hard and often so late: thank you very much and Merry Christmas all.
My Lords, it is a pleasure to take part on Second Reading; I declare my interests in financial services and technology, in Ecospend Ltd and Boston Ltd. There is a fundamental truth at the heart of our deliberations, both on Second Reading and as we progress to Committee: it is our data. Large language models are nothing without it; perhaps it would be more appropriate to call them large data models—maybe then they would be more easily and quickly understood by more people. Ultimately, our data is going into AI for potentially positive and transformational purposes, but only if there is consent, understanding, trustworthiness and a real connection between the purpose to which the AI is being put and those of us whose data is being put into the AI.
I am going to focus on four areas: data adequacy, which has already, understandably, been heavily mentioned; then AI, smart data and digital ID. I can probably compress everything I was going to say on the first subject by simply asking my noble friend the Minister: how will the Bill ensure adequacy between the UK and the EU? It is quite a large Bill—as other noble Lords have commented—yet it still has a number of gaps that I am sure we will all be keen to fill in fully when we return in 2024. As already mentioned, AI is nothing without data, so what checks are being put in place for the many instances throughout the Bill where AI is used to interrogate individuals’ data? Would it not be absolutely appropriate for there to be effective, clear, transparent labelling across all AI uses, not least in the public sector but across all public and private sector uses? Saying this almost feels like going off track from the Bill into AI considerations, but it seems impossible to consider the Bill without seeing how it is inextricably linked to AI and the pro-innovation AI White Paper published earlier this year. Does the Minister not agree? How much line-by-line analysis has been done of the Bill to ensure that there is coherence between the Government’s ambitions for AI and what is currently set out in this Bill?
On smart data, there are clearly extraordinary opportunities, but they are not inevitabilities. To consider just one sector, energy: to be able potentially to deploy customers’ data in real time—through their smart meters, for example—with the potential to auto-shift in real time to the cheapest tariff could be extraordinarily positive. But again, that is only if there is an understanding of how the consent mechanisms will work and how each citizen is enabled to understand that it is their data. There are potentially huge opportunities, not least to do something significant about the poverty premium, where all too often those who find themselves with the least are forced to pay the most, often for essential services such as energy. What are the Government doing to look at additional sectors for smart data deployment? What areas of current or previous state activity are being considered for the deployment of smart data? What stage is that analysis at?
On digital ID, about which I have spoken a lot over previous years, again there are huge opportunities and possibilities. I welcome what is in the Bill around the potential use of digital ID in property transactions. This could be an extraordinarily positive development. What other areas are being looked at for potential digital ID usage? What stage is that analysis at? Also, is what is set out in the Bill coherent with other government work in other departments on digital ID? It seems that a lot has been done and there have been a number of efforts from various Administrations on digital ID, but we are yet to realise the prize it could bring.
I will ask my noble friend some questions in conclusion. First, how will the introduction of the senior responsible individual improve things compared with the data protection officer? Again, how will that impact on issues such as, but not limited to, adequacy? Similarly, linking back to artificial intelligence, a key principle—though not foolproof by any measure and certainly not a silver bullet, but important none the less—is the human in the loop. The Bill is currently some way short of a clear, effective definition and exposition of how meaningful human intervention, human involvement and human oversight will work where autonomous systems are at play. What are the Government’s plans to address that significant gap in the Bill as currently drafted?
I end where I began, with the simple truth that it is our data. Data has been described in various terms, not least as the new oil, but that definition gets us nowhere. It is so much more profound than that. Ultimately it is part of us and, when it is put together in combination, it gets so close to giving such a detailed, personal and almost complete picture of us—ultimately the digital twin, if you will. Are the Government content that the Bill does everything to respect and fully understand the need for everything to be seen as trustworthy, to be understood in terms of it being our data and our decision, and that we decide what data to deploy, for what purpose, to whom and for what time period? It is our data.
My Lords, it is a real privilege to follow the noble Lord, Lord Holmes. I hope that the Government will learn from his wisdom. I congratulate the noble Lord, Lord de Clifford; I am glad that his family was here to witness his powerful contribution.
I support the Government’s laudable aim to make the UK the most innovative society in the world of science and technology. I wish to record my gratitude for the many briefings provided to us by the Library, the 5Rights Foundation, Big Brother Watch, CRISP and Marie Curie, among many other notable organisations and individuals. The Government make sweeping assurances that this legislation is based on their commitment to all citizens enjoying access to a fair, inclusive and trustworthy digital environment. It is grounded in the hope that algorithmic systems will be designed to protect people from harm and from unsafe, unaccountable surveillance, with public involvement in their development, ensuring adequate safeguards as well as improved skills and information literacy. That is a mouthful, but it describes the many different aspects of the Bill.
Every part of our life is determined by some form of digitalisation, not least via our devices; they are an ever-present reminder, if any were required, of the interconnectedness of our existence at home and across the globe. We are living through an exponential rise in social media information alongside the extraordinary growth of technologies’ surveillance capacity. It is sometimes impossible to distinguish truth from fiction in the mass of content across multiple platforms, and thus an assurance of public safeguarding within fast-moving technologies may not be achievable quite as easily as the Government suggest. Experts are consistently warning us of as yet uncharted harm from the advent of AI-driven technology, causing legitimate concerns for civil society organisations.
So where does an ordinary citizen turn if they get caught up in some of the Bill’s punitive measures? As has been stated by noble Lords, contradicting progress made in this House, the Bill will provide the Government with yet more unprecedented powers, which will evidently result in the limiting of and infringement on citizens’ rights to privacy; this was detailed powerfully by the noble Lord, Lord Sikka. The Bill is complex and has a broad spectrum of remits that will impact every aspect of our lives: in the home, at work and outside. Time will not permit us all to consider adequately the questions and concerns raised by many well-respected organisations; we in this Parliament therefore owe it to all our citizens to ensure that our legislation is not immune to proper scrutiny.
Noteworthy parts of the Bill that cause concern include those regarding the safeguarding of children’s well-being. I thank the 5Rights Foundation for its briefing and agree that the Bill’s proposed changes to the UK’s data protection regime risk eroding the high level of privacy that children currently have a right to, making them less safe online. My noble friend Lady Kidron raised these matters thoroughly with her usual expertise; I absolutely agree that children’s safety cannot be traded away to maximise economic benefits, and I add my voice to her call that the Government must keep their promise to bereaved families and ensure that children continue to be given heightened levels of data protection.
This leads me on to the matter of data collection, including how data is stored and shared with external organisations. In this context, there must be an absolute commitment from the Government to preserving the integrity of our personal data. Organisations and institutions should not and cannot be allowed to access personal information on the basis of assumed rather than informed consent.
We know that huge datasets are gathered by law enforcement, our NHS, welfare and financial services, alongside local authorities for voter registration purposes. The Data and Marketing Association, representing around 700 companies, including charities and commercial brands, suggests that Clauses 114 and 115, on the new exemption for direct marketing used for democratic engagement, could be open to abuse. While recognising that an open electoral register has been an important resource for business and charities for verification of addresses, it has also been used for direct business marketing, as has been stated.
Any amendments to this aspect of the Bill must not be on the assumption that, if a person does not opt out, she or he is fully cognisant of giving automatic consent for data sharing. I do not accept that this is well known to and understood by the millions of elderly and vulnerable people who do not opt out, or that they do so knowingly. It is our duty to empower all citizens, not just those who can readily access and are confident in this rapidly expanding digital environment. Noting the Minister’s comment on co-designing the Bill with stakeholders, will he give an assurance that partners included advocacy and civil rights organisations?
Clause 9 would give public authorities and other bodies wider scope to refuse subject access requests, making it more difficult for people to find out what information about them is being held and how it is being used. Clause 9 should not have a place in this legislation.
Clause 20 would water down requirements to carry out a proper impact assessment. This means that in many cases, organisations and businesses, including local authorities processing data, will not have to fully consider whether data processing is necessary and proportional. Clause 20 should also be removed from this Bill.
I hope to see us strengthening Clauses 110, 113 and 116, providing greater protection for consent and safeguarding consumers. Whatever the final impact of the legislation, many public and corporate institutions already hold a ginormous amount of digital material. As someone with years of local authority experience, I can say that safeguarding paper files seems like an alternate universe. All the protocols were written on every manager’s file, and any breaches or failures could have landed any one of us in court. I cannot comprehend all the protocols that may be required to protect individual data under this Bill. How will the Government monitor whether protocols issued as a result of this legislation are actually being adhered to? Who will be held accountable for the anonymity of data holders, given the heightened concern raised by the noble Lord, Lord Knight, and the noble Baroness, Lady Kidron?
When it comes to issues of surveillance of our citizens and the use or retention of biometric data, no matter the reason, we must provide the highest standards and maximum safeguards to all those who will determine whether an individual has transgressed the rules. The Commons’ deliberation on banks spying on welfare claimants would have caused many vulnerable elders distress, as will continuous police profiling of some sections of our communities for perceived fraudulent behaviour and/or acting against national security interests. We must not be lulled into a false sense of security by relying on individual Ministers to make arbitrary decisions and add yet another category to the surveillance list. Like my noble friend Lady Young, I question how the DWP, bank personnel and police officers will implement a law that falls below parliamentary scrutiny and the highest standards of ethics.
We must acknowledge that the Bill has caused widespread concern. I agree with many who have written to me that we require a nationwide education programme to ensure wider public knowledge, and that consumer groups and charities understand thoroughly how they are likely to be affected by the proposed legislation and, more importantly, the potential impact of the proposed powers on the relationship with the DWP and other institutions, if we are to avoid thousands of litigations.
In fact, CRISP’s insightful briefing reminds us to consider genuine, meaningful and trustworthy oversight of the Bill, which aims to simplify the regulatory architecture of UK surveillance oversight but risks creating a vacuum in the regulation of digital surveillance, abandoning clear guidance and standards, complicating oversight governance, and creating vulnerabilities for users of these technologies and for the rights of those subjected to them.
The Bill removes the Biometrics and Surveillance Camera Commissioner’s obligation to report to Parliament and the public on appropriate surveillance use, which endangers the visibility and accountability of police activities. This goes to the concerns raised by the right reverend Prelate the Bishop of St Albans. The Bill must therefore retain the surveillance camera code of practice, which is essential for public trust.
The Bill gives the Secretary of State broad powers to amend our data protection laws via statutory instrument without adequate scrutiny by Parliament. Many fear that such extensive powers cannot possibly be for the public good, given the records of all Governments, be it with regard to the manipulation of facts or the institutional profiling of black and other minoritised communities in the name of national security. This will simply not be accepted by today’s digitalised generation, and the proposition that such information can be held indefinitely, without remedy or recourse to justice, cannot bode well for our nations.
At a glance, the UK GDPR sets out seven principles, including integrity and accountability. These fundamental rights for citizens cannot be guaranteed under the Bill as it now stands. I look forward to all of us making the necessary changes to make better laws for the public good.
Finally, I wish all our outstanding staff across the House, noble Lords and their families who are celebrating, a loving and joyful Christmas. I wish everyone well.
My Lords, I congratulate the noble Lord, Lord de Clifford, on his excellent maiden speech. I am sure that in this area and others he will be a valuable addition to the House.
One of the advantages of speaking towards the end of the debate is that much of what one could have said has already been said. I particularly enjoyed the speech from my noble friend Lord Knight of Weymouth highlighting the way in which the Bill is consistently behind the curve, always fighting the last war. To some extent, that is inevitable in a field like this, which is developing so rapidly, and I am not convinced that sufficient thought has been given to how developments in digital technology require developments in how it is tackled in legislation.
I think we will have an interesting Committee, in which I will participate as much as I can. The Minister will have a busy spring, with at least two major Bills going through. I hope the Whips have taken account of the number of concerns that have been expressed in this debate, and by external bodies, and that enough time will be allowed in Committee. A particular concern is the large number of amendments added at a late stage in the Commons, which have not had sufficient consideration. It will be our job to look at them in detail.
The proposal to allow the inspection of people’s bank accounts with no due cause is a matter of real concern, which has been mentioned by many people in this debate. I highlight the remarks of UK Finance, the representative body for the banking and financial sector. It says:
“These Department for Work and Pensions proposals have been suggested previously, but they are not part of the economic crime plan 2 or fraud strategy, which are the focus of industry efforts in terms of public-private partnership in tackling economic crime”.
UK Finance goes on to suggest that powers should be more narrowly focused, that they should not leave vulnerable customers disadvantaged—as would appear to be the case in the current drafting—and that further consultation is needed with consumer groups and charities to capture the wider needs of people affected by this proposal. It also suggests that the delivery time for this proposal should be extended even further into the future. For the benefit of the Minister, I shall just interpret that by explaining that what it is saying is, “We have no idea where this proposal came from. It has no part in the overall strategy that was being developed to tackle fraud and we want it pushed off into the indefinite future”—in other words, do not bother. Perhaps the Minister will listen to UK Finance.
I want to focus my remarks particularly on health and health data, which is of particular concern. It is so intimate and personal that it requires additional consideration. It is not just another piece of data; this goes to the heart of who we are. The Government said in the context of the King’s Speech that this Bill has been written with industry and for industry. Well, quite. It is possible that some of the changes might result in less work for businesses, including those working in healthcare, but the danger is that the additional flexibility being proposed will in fact create additional costs, because it is less clear and straightforward, there will be increased risks of disclosure of information that should not be disclosed, and the non-standardised regime will simply lead to confusion.
Data regulation can slow down the pace of data sharing, increase people’s concerns about risk, and make research and innovation more difficult. Patients and the public generally quite rightly expect particularly high standards in this area, and I have concerns that this Bill makes the situation worse and that its influence is negative rather than positive. This is a danger, because it affects the public’s attitude to health and health data. If people are worried about the disclosure of their information, this impacts on them seeking and taking advantage of healthcare. That affects all of us, so it is not just a matter of personal concern.
One of the big arguments for the disclosure of health data is that it is available for scientific and developmental research. The need for this is recognised and there are additional safeguards. The UK Health Security Agency can reuse data collected by the NHS for the business of disease control, and that is something I am sure we all favour. However, the concept that any data can be reused for scientific purposes has grave dangers, particularly when this Bill fails to define tightly enough what scientific and developmental research amounts to. The definition of scientific research here appears to apply to commercial as well as non-commercial outfits, whether the work is funded publicly or is a private development. This is the sort of concern that we are going to have to tackle in Committee to provide people with the protection that they quite rightly expect.
If we look in more detail at health data, we see that it is protected by the Caldicott principles for health and social care data. It is worth reading the eight principles. The first sets the scene. It says, in the context of social care:
“Every proposed use … of confidential information should be clearly defined, scrutinised and documented, with continuing uses regularly reviewed by an appropriate guardian”.
This Bill is in grave danger of moving beyond that level of protection, which has been agreed and which people expect. People want and expect better regulation of their personal data and more say over what happens to it. This Bill moves us away from that.
It is worth looking in this context at the views of the BMA, which is particularly concerned about health data. It emphasises the fact that the public expect high standards and calls on this House to challenge what it regards as the “problematic provisions” and to seek some reassurance from the Government. I will list what the BMA regards as problematic provisions and why it does not like them: Clause 11, which erodes transparency of information to data subjects; Clauses 32, 35, 143 and 144, which risk eroding regulatory independence and freedom; Clause 1, which risks eroding protections for data by narrowing the definition of “personal data”; Clause 14, which risks eroding trust in AI; Clause 17, which risks eroding the expertise and independence of organisational oversight; and Clauses 20 and 21, which risk eroding organisational data governance. We will need to explore all of these issues in Committee. The hope is that they will get the attention that they deserve.
When it comes to medical data, there is an even stronger case, which the Bill needs to tackle straight on, around people’s genetic information. This is the holy grail of data, which people are desperate to get hold of. It says so much about people, their background and their experiences. We need a super level of protection for genetic data. Again, this is something that needs to be tackled in the Bill.
There are other issues of concern that I could mention—for example, the abolition of the Biometrics Commissioner and the Surveillance Camera Commissioner. This is a point of particular concern, raised by a number of bodies. It is quite clear that something will be lost by moving these functions over to a single commissioner. There is a softer power held by the commissioners which, to be honest, a single commissioner will not have the time or the bandwidth to exercise.
There is also concern that there needs to be explicit provision in the Bill to enable representative bodies, such as trade unions and commercial organisations, to pursue complaints and issues of concern on behalf of individuals. The issue of direct marketing, particularly of financial services, needs to be addressed.
So there is lots to do on this Bill. I hope the Minister recognises that, at this stage, we are just highlighting issues that need to be looked at in detail, and that time will be provided in Committee to deal with all these issues properly.
My Lords, at this late stage in any debate much of the field is likely to have been covered, but, as someone deeply involved in the crafting, drafting and evolution of the EU GDPR while an MEP in Brussels, I declare a strong vested interest in this subject. I hope that the Minister will not be too negative about the work that we did—much of it was done by Brits in Europe—on producing the GDPR in the first place.
I raised this issue at the recent UK-EU Parliamentary Partnership Assembly and in bilateral discussions with the European Parliament’s civil liberties committee, on which I served for many years, on its recent visit to London. Let me be candid: while the GDPR stands as a significant achievement, it is not without need for enhancement or improvement. The world has undergone a seismic shift since the GDPR’s inception, particularly in the realm of artificial intelligence. Both the UK and the EU need to get better at developing smart legislation. Smart legislation is not only adaptive and forward-looking; it is also flexible enough to evolve alongside emerging trends and challenges.
The importance of such legislation is highlighted by the rapid advancement in various sectors, and particularly in areas such as artificial intelligence—as so well referred to by my noble friend Lord Holmes of Richmond—and how our data is used. These fields are evolving at a pace that traditional legislative processes struggle to match. Such an approach is vital, not only to foster innovation but to ensure that regulations remain relevant and effective in a swiftly changing world, helping to maintain our competitive edge while upholding our core values and standards.
The aspirations of this Bill, which is aimed at modernising and streamlining the UK’s data protection framework while upholding stringent standards, are indeed laudable. I regret that, when my noble friend Lord Kamall was speaking about cookies, I was temporarily out of the Chamber enjoying a culinary cookie for lunch. While there may be further advantages to be unearthed in the depths of this complex legislation, so far, the biggest benefit I have seen is its commitment to removing cookie pop-ups. Above all, we must tread carefully to ensure international compliance, which has been referred to by a number of noble Lords, and steadfastly adhere to the bedrock GDPR principles of lawfulness, fairness, transparency, purpose limitation, data minimisation, accuracy, storage limitation and citizens’ redress.
On a procedural note, following other noble Lords, the Government’s recent flurry of amendments—I think there were 266 in total, including 38 new clauses and two new schedules, a staggering 240 of which were introduced at the 11th hour—places a key duty on our House to meticulously scrutinise the new legislation line by line. I have heard other speakers refer to my friend, the right honourable Member for Haltemprice and Howden, in the other place, who astutely observed that that House has
“in effect delegated large parts of the work on this important Bill to the House of Lords”.—[Official Report, Commons, 29/11/23; col. 888.]
I have to say that that is wonderful because, for those of us who are always arguing that this is the House that does the work, that is an acknowledgement of its skills and powers. It is a most welcome reference.
I wish to draw the House’s attention briefly to three important terms: adequacy, which noble Lords have heard about, equivalence and approximation. Adequacy in data protection primarily comes from the EU’s legal framework. It describes the standard that non-EU countries must meet to allow free flow of personal data from the EU. The European Commission assesses this adequacy, considering domestic laws and international commitments. The UK currently benefits from the EU’s two data adequacy decisions, which, I remind the House, are unilateral. However, we stand on the cusp of a crucial review in 2024, when the Commission will decide the fate of extending data adequacy for another four years; it also has the power to withdraw its decisions in the meantime if we threaten the basis for them. This Bill must not increase the risk of that happening.
Equivalence in the realm of data protection signifies that different systems or standards, while not mirror images, offer comparable levels of protection. It is about viewing a non-EU country’s data protection laws through a lens that recognises their parity with GDPR in safeguarding personal data. Past EU adequacy decisions have not demanded a carbon copy of laws; rather, they seek an essentially equivalent regulatory landscape.
Approximation refers to aligning the laws of EU member states with each other. In data protection, it could describe efforts to align national laws with GDPR standards. The imperative of maintaining data adequacy with the EU cannot be overstated; in fact, it has been stated by many noble Lords today. It stands as a top priority for UK business and industry, a linchpin in law enforcement co-operation, and a gateway to other vital databases. The economic stakes are monumental for both sides: EU personal data-enabled services exports to the UK were worth approximately £42 billion in 2018, and exports from the UK to the EU were worth £85 billion.
I commend the Government for listening to concerns that I and others have raised about democratic oversight and the independence of the Information Commissioner’s Office. The amendment to Clause 35, removing the proposal for the Secretary of State to veto ICO codes of practice, was welcome. This move has, I am informed, sent reassuring signals to our friends in Brussels. However, a concern still remains regarding the UK’s new ambition for adequacy partnerships with third countries. The Government’s impact assessment lists the United States, Australia, the Republic of Korea, the Dubai International Financial Centre, Singapore and Colombia, with future agreements with India, Brazil, Kenya and Indonesia listed as priorities.
Some of these nations have data standards that may not align with those of the EU or in fact offer fewer safeguards than our current system. I urge extreme caution in this area. We do not want to be in the situation where we gain a data partnership with Kenya but jeopardise our total data adequacy with the EU. Fundamentally, this Bill should not weaken data protection rights and safeguards. It should ensure transparency in data use and decision-making, uphold requirements for data processors to consider the rights and interests of affected individuals and, importantly, not stray too far from international regulations.
I urge my noble friend the Minister and others to see that adopting a policy of permanent dynamic alignment with the EU GDPR is important, engaging actively with the EU as a partner, not just implementing new rules blindly. Protecting and strengthening the UK-EU data partnership offers an opportunity for closer co-operation, benefiting businesses, consumers, innovation and law enforcement; and together, we can reach out to others to encourage them to join these truly international standards.
My Lords, I thank the Minister for his introduction to the Bill today and congratulate the noble Lord, Lord de Clifford, on his maiden speech. I think we all very much appreciated his valuable perspective on SMEs having to grapple with the intricacies of data protection. I very much look forward to his contributions—perhaps in Committee, if he feels brave enough.
The Minister will have heard the concerns expressed throughout the House—not a single speaker failed to express concerns about the contents of the Bill. The right reverend Prelate the Bishop of Southwell and Nottingham reminded us that the retention and enhancement of public trust in data use and sharing is of key importance, but so much of the Bill seems almost entirely motivated by the Government’s desire to diverge from the EU to secure some kind of Brexit dividend.
As we have heard from all around the House, the Bill dilutes the rights of data subjects where it should strengthen them. Only if those rights are strengthened can we all enjoy the benefits of data sharing without the risks involved. The Equality and Human Rights Commission is clearly of that view, alongside numerous others, such as the Ada Lovelace Institute and as many as 26 privacy advocacy groups. Even on the Government’s own estimates, the Bill will have a minimal positive impact on compliance costs—in fact, it will simply lead to companies doing business in Europe having to comply with two sets of regulations.
I will be specific. The noble Lord, Lord Davies of Brixton, set out the catalogue, and I will go through a number of areas where I believe those rights are being diluted. The amended and more subjective definition of “personal data” will narrow the scope of what is considered personal data, as the right reverend Prelate the Bishop of St Albans pointed out. Schedule 1 sets out a new annexe to the GDPR, with the types of processing activities that the Government have determined have a recognised legitimate interest and will not require a legitimate interest human rights balancing test to be carried out. Future Secretaries of State can amend or add to this list of recognised legitimate interests through secondary legislation. As a result, as the noble Baroness, Lady Bennett, pointed out, it will become easier for political parties to target children as young as 14 during election campaigns, even though they cannot vote until they are 16 or 18, depending on the jurisdiction.
The Bill will change the threshold for refusing a subject access request, which will widen the grounds on which an organisation could refuse requests. The noble Lord, Lord Sikka, reminded us of the existing difficulties of making those subject access requests. Clause 12, added on Report in the Commons, further tips power away from the individual’s ability to access data.
There are also changes to the automated decision-making provisions under Article 22 of the GDPR—the noble Lord, Lord Holmes, reminded us of the importance of the human in the loop. The Bill replaces Article 22 with articles that reduce human review of automated decision-making. As the noble Lord, Lord Knight, pointed out, Article 22 should in fact be strengthened so that it applies to partly automated processing as well, and it should give rights to people affected by an automated decision, not just those who provide data. This should be the case especially in the workplace. A decision about you may be determined by data about other people whom you may never have met.
The Bill amends the circumstances in which personal datasets can be reused for research purposes. New clarifying guidance would have been sufficient, but for-profit commercial research is now included. As the noble Lords, Lord Knight and Lord Davies, pointed out and as we discussed in debates on the then Online Safety Bill, the Bill does nothing where it really matters: on public interest researcher access.
The Bill moves away from UK GDPR requirements for mandatory data protection officers, and it also removes the requirement for data protection impact assessments. All this simply sets up a potential dual compliance system with less assurance—with what benefit? Under the new Bill, a controller or processor will be exempt from the duty to keep records, unless they are carrying out high-risk processing activities. But how effective will this be? One of the main ways of demonstrating compliance with GDPR is to have a record of processing activities.
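To see what is at stake, consider what such a record typically contains. The sketch below is a minimal, invented example of a single record of processing activities of the kind Article 30 of the UK GDPR requires; the controller name, fields and values are illustrative assumptions, not drawn from any real register.

```python
# A minimal, illustrative record of processing activities (Article 30
# UK GDPR). The controller, fields and values are invented; a real
# register would hold one such entry per processing activity.
ropa_entry = {
    "controller": "Example Retail Ltd",          # hypothetical controller
    "activity": "Customer order fulfilment",
    "purpose": "Processing and delivering customer orders",
    "lawful_basis": "Contract (Article 6(1)(b))",
    "data_subjects": ["customers"],
    "personal_data": ["name", "address", "order history"],
    "recipients": ["courier partners", "payment processor"],
    "international_transfers": "None",
    "retention": "6 years after last order",
    "security_measures": ["encryption at rest", "role-based access"],
}

# Without a duty to keep such records, demonstrating compliance after
# the fact becomes materially harder - which is the point being made.
for field, value in ropa_entry.items():
    print(f"{field}: {value}")
```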
There are also changes to the Information Commissioner’s role. We are all concerned about whether the creation of a new board will enable the ICO to maintain its current level of independence for data adequacy purposes. This is so important, as the noble Baroness, Lady Young, and my noble friend Lord McNally pointed out.
As regards intragroup transfers, there is concern from the National AIDS Trust that Clause 5, permitting the intragroup transmission of personal health data
“where that is necessary for … administrative purposes”,
could mean that HIV/AIDS status is inadequately protected in workplace settings.
Schedule 5 to the Bill amends Chapter 5 of the UK GDPR to reform the UK’s regime for international transfers, with potential adverse consequences for business. The noble Lord, Lord Kirkhope, reminded us of the dangers of adopting too low standards internationally. This clearly has the potential to provide less protection for data subjects than the current test.
In Clause 17, the Bill removes a key enabler of collective interests: consultation with those affected by data processing during the data protection risk assessment process. It fails to provide alternative opportunities. Then there is the removal of the legal obligation to appoint a representative. This risks data breaches not being reported, takes away a channel of communication used by the ICO to facilitate its investigations, and increases the frustration of UK businesses in dealing with overseas companies that come to the UK market underprepared to comply with the UK GDPR.
Given that catalogue, it is hardly surprising that so many noble Lords have raised the issue of data adequacy. If I read out the list of all the noble Lords who have mentioned it, I would probably mention almost every single speaker in this debate. It is clear that the Bill significantly lowers data protection standards in the UK, as compared with the EU. On these Benches, our view is that this will undermine the basis of the UK’s EU data adequacy. The essential equivalence between the UK and the EU regimes has been critical to business continuity following Brexit. The Government’s own impact assessment acknowledges that, as the UK diverges from the EU GDPR, the risk of the EU revoking its adequacy decisions will increase. So I very much hope that the Minister, in response to all the questions he has been asked about data adequacy, has some pretty good answers, because there is certainly a considerable degree of concern around the House about the future of data adequacy.
In addition, there are aspects of the Bill that are just plain wrong. The Government need to deliver in full on the commitments made to bereaved families during the passage of what became the Online Safety Act regarding access to their children’s data. As we have heard today from across the House, notably from the noble Baroness, Lady Kidron, there is an insistence that this be extended to all deaths of children. I very much hope that the Minister will harden up on his assurances at the end of the debate.
The noble Lords, Lord Kamall and Lord Vaux, questioned the abolition of the Surveillance Camera Commissioner, and the diminution of the duties relating to biometric data. Society is witnessing an unprecedented acceleration in the capability and reach of surveillance technologies, particularly live facial recognition, and we need the commissioner and Surveillance Camera Code of Practice in place. As the Ada Lovelace Institute says in its report Countermeasures, we need new and more comprehensive legislation on the use of biometrics, and the Equality and Human Rights Commission agrees with that too.
As regards what the noble Lord, Lord Sikka, described as unrestrained financial powers, inserted at Commons Report stage, Sir Stephen Timms MP, chair of the Work and Pensions Select Committee, very rightly expressed strong concerns about this, as did many noble Lords today, including the noble Baroness, Lady Young, and the noble Lords, Lord Knight and Lord Fox. These powers are entirely disproportionate and we will be strongly opposing them.
Then we have the new national security certificates and designation notices, which were mentioned by the right reverend Prelate the Bishop of St Albans. These would give the Home Secretary great and unaccountable powers to authorise the police to violate our privacy rights, through the use of national security certificates and designation notices, without challenge. The Government have failed to explain why they believe these clauses are necessary to safeguard national security.
There is a whole series of missed opportunities during the course of the Bill. As the noble Lord, Lord Knight, said in his opening speech, the Bill was an opportunity to create ethical, transparent and safe standards for AI systems. A number of noble Lords across the House, including the noble Lord, Lord Kamall, the noble Baroness, Lady Young, the right reverend Prelate the Bishop of Southwell and Nottingham, and my noble friend Lord McNally, all said that this is a wasted opportunity to create measures adequate to an era of ubiquitous use of data through AI systems. The noble Baroness, Lady Kidron, in particular talked about this in relation to children, generative AI and educational technology. The noble Lord, Lord Holmes, talked of this in the public sector, where it is so important as well.
The EU has just agreed in principle to a new AI Act. We are miles behind the curve. Then, of course, we have the new identity verification framework. The UK has chosen not to allow private sector digital ID systems to be used for access. Perhaps the Government could explain why that is the case.
There are a number of other areas, such as new models of personal data control, which were advocated as long ago as 2017 in the Hall-Pesenti review. Why are the Government not being more imaginative in that sense? There is also the failure to create a new offence of identity theft. That seems to be a great missed opportunity in this Bill.
As the noble Baroness, Lady Kidron, mentioned, there is the question of holding AI system providers legally accountable for the generation of child sexual abuse material online through the use of their datasets. My noble friend Lord McNally and the noble Lord, Lord Kamall, raised the case of ICO v Experian. Why are the Government not taking the opportunity to correct the position left by that case?
In the face of the need to do more to protect citizens’ rights, this Bill is a dangerous distraction. It waters down rights, it is a huge risk to data adequacy, it is wrong in many areas and it is a great missed opportunity in many others. We on these Benches will oppose a Bill which appears to have very few friends around the House. We want to amend a great many of the provisions of the Bill and we want to scrutinise many other aspects of it where the amendments came through at a very late stage. I am afraid the Government should expect this Bill to have a pretty rough passage.
My Lords, first I want to thank all those noble Lords who have spoken today, and actually, one noble Baroness who has not: my colleague, the noble Baroness, Lady Jones. I am sure the whole House will want to wish her a safe and speedy recovery.
While I am name-checking, I would also like to join in the general congratulations offered to the noble Lord, Lord de Clifford, who, as others have observed, made a valuable case on behalf of small businesses and SMEs generally, and who also called, in his words, for investment to assist this sector to deal with the challenges of data protection.
The range of concerns raised is a good indication of the complexity of this Bill and the issues which will keep us pretty busy in Committee, and I am sure well beyond. We have been well briefed; a record number of briefings have been dispatched in our direction, and they have been most welcome in making sure that we are on top of the content of this Bill.
At the outset, let me make it clear that we support the principle of modernising data protection legislation and making it fit for purpose in a rapidly changing technological landscape. We join noble Lords such as the noble Lord, Lord Kirkhope, who made the case for ensuring that the legislation remains relevant. We need to scrutinise this properly, and we understand the need to simplify the rules and make them clearer for all concerned. Most speakers commented on this real need and desire.
However, as others have said, this Bill represents a missed opportunity to grasp the challenges in front of us. It tinkers rather than reforms, it fails to offer a new direction and it fails to capitalise on the positive opportunities the use of data affords, including making data work for the wider social good. I thought the noble Lord, Lord Holmes, made a good case in saying it is our data and therefore needs to be treated with respect. I do not think this Bill does that.
The Bill fails to build on the important safeguards and protections that have been hard won by others in other fields of legislation covering the digital world, in particular, about the use of personal data that we want to see upheld and strengthened. The noble Baroness, Lady Kidron, made an inspired speech, pleading with us to hold the Government’s feet to the fire on this issue and others.
The Bill also fails to provide the simplicity and certainty that businesses desire, given that it is vital that we retain our data adequacy status with the EU. Therefore, businesses will find themselves navigating two similar but, as others have said, divergent sets of rules, a point well made by the right reverend Prelate the Bishop of St Albans and the noble Lords, Lord Vaux and Lord Kirkhope. In short, it feels like a temporary holding position rather than a blueprint for reform, and I suspect that, all too soon, we will be back here with a new Bill—perhaps a data protection (No. 3) Bill—which will address the more profound issues at the frontier of data use.
Before that, I must take up the points made by my noble friend Lord Knight, who opened the debate for us. It is an affront to our parliamentary system that the Government chose to table 266 amendments on the last available day before Report in the Commons—about 150 pages of amendments to consider in a single debate. The marvellous notes that accompany the Bill had to be expanded by something like a fifth to take account of all these amendments; the Bill has grown over time. Clearly, our Commons colleagues had no way of scrutinising these amendments with any degree of effectiveness, and David Davis made the point that it is down to us now to make sure that that job is well done.
I agree that some of the amendments are technical, but others are very significant, so can the Minister explain why it was felt necessary to rush them through without debate? For example, the new Schedule 1 will grant the Secretary of State the power to require banks, or other financial institutions, to provide personal data on anyone in receipt of benefits. These include state pensions and universal credit, but they also include other benefits—working tax credit, child tax credit, child benefit, pension credit, jobseeker’s allowance and personal independence payments. That is a long list; we think that it probably covers some 40% of the population. What is the Government’s real need here?
Yesterday, we had a consultation session with the Minister. I asked where the proposals came from, and he was very honest that they were included in a DWP paper on fraud detection some two years ago. Why is it that the amendments were put into the Bill so late in the day, when they have been around and accessible to the Government for two years? Why has there not been any effective consultation on this? Nobody was asked whether they wanted these changes made, and it seems to me that the Government have acted in an entirely high-handed way.
Most of the population will fall into one or the other of the categories, as my noble friend Lady Young and the noble Lord, Lord Vaux, made clear. Some, such as the noble Lord, think that they might be exempted—but, having listened to my list, he may think otherwise. The criteria for these data searches are not clarified in the Bill and have no legislative limit. Why is that the case?
As Mel Stride and the DWP officials made clear when giving evidence to the Work and Pensions Select Committee recently, this is not about accessing individual bank accounts directly where fraud is suspected; it is about asking for bulk data from financial organisations. How will the Government be able to guarantee data security with bulk searches? When were the Government planning to tell the citizens of this country that they were planning to take this new set of powers to look into their accounts? I warn the Minister that I do not think it will go down very well when the Government fully explain this.
Meanwhile, the banking sector has also raised concerns about the proposals, which it describes as too broad and liable to put vulnerable customers at a disadvantage. The ICO also questions the proportionality of the measure. Let me make our position clear on this: Labour is unreservedly committed to tackling fraud. We will pursue the fraudsters, conmen and claimants who try to take money from the public purse fraudulently or illegally. This includes those involved in tax fraud or dodgy PPE contracts. As our shadow Minister Chris Bryant made clear in the Commons:
“I back 100% any attempt to tackle fraud in the system, and … will work with the Government to get the legislation right, but this is not the way to do it”.—[Official Report, Commons, 29/11/23; col. 887.]
I hope that the Minister can confirm that he will work with stakeholders, banks and ourselves to find a better way to focus on tackling fraud in all its guises.
Another aspect of the Bill that was revealed at a late date is the set of rules governing democratic engagement, to which a number of Peers have referred today. The Bill extends the opportunities for direct marketing for charitable or political purposes. It also allows the Secretary of State to change the rules for the purposes of democratic engagement. It has now become clear that this will allow the Government to switch off the direct marketing rules in the run-up to an election. Currently, parties are not allowed to send emails, texts, voicemails and so on to individuals without their specific consent. We are concerned that changing this rule could transform UK elections. These powers were opposed in the public consultation on the Bill; this is not what the public want. We have to wonder at the motives of the Government in trying to change these rules at such a late stage and with the minimum of scrutiny. This is an issue to which we will return in Committee; I hope that the Minister can come up with a better justification than his colleagues in the Commons were able to.
I turn to other important aspects of the Bill. A number of noble Lords gave examples of how personal rights to information and data protection, which were previously in the GDPR and the Data Protection Act 2018, have been watered down or compromised. For example, subject access requests have been diluted by allowing companies to refuse such requests on the grounds of being excessive or vexatious—terms that, by their very nature, are hard to define—or by allowing the Secretary of State to define who has a recognised, legitimate interest for processing personal data. Similarly, there is no definition in the Bill of what constitutes high-risk processing—risking uncertainty and instability for businesses and the potential misuse of personal data. We will want to explore these definitions in more detail.
A number of noble Lords quite rightly raised the widespread fear of machines making fundamental decisions about our lives with no recourse to a human being to moderate the decision. The impact of this can be felt more widely than an individual data subject—it can impact on a wider group of citizens as decisions are made, for example, on policing priorities, healthcare and education. This can also have a hugely significant impact in the workplace. Obviously, algorithms and data analysis can bring huge benefits to the workplace, cutting out mundane tasks and ensuring greater job satisfaction. But we also need to ensure that workers and their representatives know what data is being collected on them and have an opportunity for human contact, review and redress when an algorithmic system is used to make a decision. For example, we need to avoid a repeat of the experience of the Just Eat couriers who were unfairly sacked by a computer. We will want to explore how the rights of individuals, groups of citizens and workers can better be protected from unfair or biased automated decisions.
The noble Baroness, Lady Kidron, and others have argued the case for new powers to give coroners the right to access information held by tech companies on children’s data where there is a suspicion that the online world contributed to their death. This is a huge and tragic issue that the Government have sadly ducked, although the promise to listen that I heard from the Minister was very welcome. We shall ensure that we keep him to that commitment.
Despite all the promises made, however, the Government have broken the trust of bereaved parents who were expecting this issue to be resolved in the Bill. Instead, the amendment addresses only cases where a child has taken their own life. We will do what we can in this Bill to make sure that the commitments made in the Online Safety Act are fully honoured.
On a separate but important point, Clause 2 allows companies to exploit children’s data for commercial purposes. We believe that without further safeguards, children’s rights will be put very much at risk as companies collect information on where they live, what they buy, how they travel and what they study. We will seek to firm up those children’s rights as the Bill goes forward.
On cookie pop-ups, it is widely accepted that the current system is not working, as everyone ignores them and they have become an irritant. But they were there for a purpose—to ensure that the public were informed of the data being kept on them, so we do not believe that simply removing them is the answer. Similarly with nuisance calls, we want to ensure that the new rules are workable by clarifying the responsibilities of telecoms companies.
As I said at the outset, we regard the Bill as a disappointment that fails to harness the huge opportunities that data affords and to build in the appropriate safeguards. My noble friend Lord Knight put his finger on it well, at the front of the debate, when he said that we need a data protection Bill, but not this Bill.
The Government’s answer to a lack of clarity in so many areas of the Bill is to build in huge, sweeping, Henry VIII powers. When reviewing the legislation recently, we managed to count more than 40 proposed statutory instruments. That is an immense amount of power in the hands of the Secretary of State. We do not believe that this is the right way to legislate on a Bill that is so fundamental to people’s lives and the future of our economy. We want to bring these powers back into play so that they have the appropriate level of parliamentary scrutiny.
With this in mind, and taking into account all the concerns raised today, we look forward to a long and fruitful exchange with the Government over the coming months. This will be a Bill that challenges the Government.
My Lords, I sincerely thank all of today’s speakers for their powerful and learned contributions to a fascinating and productive debate. I very much welcome the engagement in this legislation that has been shown from across the House and such a clear setting out, at this early stage, of the important issues and caveats.
As I said, the Bill reflects the extensive process of consultation that the Government have undertaken, with almost 3,000 responses to the document Data: A New Direction, and the support it enjoys from both the ICO and industry groups. The debate in which we have engaged is a demonstration of noble Lords’ desire to ensure that our data protection regime evolves and works more effectively, while maintaining the highest standards of data protection for all.
I will respond to as many of the questions and points raised as I can. I hope noble Lords will forgive me if, in the interests of time and clarity, I do not name every noble Lord who spoke to every issue. A number of noble Lords expressed the wish that the Government remain open to any and all conversations. Should I inadvertently fail to address any problem satisfactorily, I affirm that I am very willing to engage with all noble Lords throughout the Bill’s passage, recognising its importance and, as the noble Lord, Lord Bassam, said, the opportunity it presents to do great good.
Many noble Lords raised concerns that the Bill does not go far enough to protect personal data rights. This is certainly not our intent. The fundamental data protection principles set out in the UK GDPR—as my noble friend Lord Kirkhope pointed out, they include lawfulness, fairness, transparency, purpose limitation, data minimisation, accuracy, storage limitation, security and accountability—remain at the heart of the UK’s data protection regime. Certain kinds of data, such as health data, remain special categories to which extra protections rightly apply. Changes such as requiring a senior responsible individual, rather than a data protection officer, mean that organisations still need to be accountable for how they process personal data but will have more flexibility about how they manage the data protection risks within their organisations.
On other specific points raised on the data protection framework, I agree that the right of access is key to ensuring transparency in data processing. The proposals do not restrict the right of access for reasonable requests for information and keep reasonable requests free of charge. On the creation of the new recognised legitimate interests lawful grounds, evidence from our consultation indicated that some organisations worried about getting the balancing test wrong, while others said that the need to document the outcome of their assessment could slow down important processing activities.
To promote responsible data sharing in relation to a limited number of public interest tasks, the Bill acknowledges the importance of these activities, which include safeguarding, crime prevention and national security, responding to emergencies and democratic engagement, but data controllers should not be required to do a case-by-case balancing test.
On cookies, the Bill will allow the Secretary of State to remove the need for data controllers to seek consent for other purposes in future, when the appropriate technologies to do so are readily available. The aim is to offer the user a clear, meaningful choice that can be made once and respected throughout their use of the internet. However, before any such powers are used, we will consult further to make sure that people are more effectively enabled to use different technology to set their online preferences.
On democratic engagement, extending the exemption allows a limited number of individuals, such as elected representatives and referendum campaigners, to process political opinions data without consent where this is necessary for their political activities. In a healthy democracy, it is not just registered political parties that may need to process political opinions data, and these amendments reflect that reality. This amendment does not remove existing rights. If people do not want their data processed for these purposes, they can ask the controller to stop doing so at any time. Before laying any regulations under this clause, the Government would need to consult the Information Commissioner and other interested parties, as well as gaining parliamentary approval.
I turn now to concerns raised by many about the independence of the regulator, the Information Commissioner. The ICO remains an independent regulator, accountable to Parliament, not the Government, in its delivery of data protection regulation. The Bill ensures it has the powers it needs to remain the guardian of people’s personal data. It can and does produce guidance on what it deems necessary. The Government welcome this and will work closely with it ahead of and throughout the implementation of this legislation.
New powers will also help to ensure that the Information Commissioner is able to access the evidence he needs to inform investigations and has the time needed to discover and respond to representations. This will result in more informed investigations and better outcomes. The commissioner will be able to require individuals to attend interviews only if he suspects that an organisation has failed to comply with or has committed an offence under data protection legislation. This power is based on existing comparable powers for the Financial Conduct Authority and the Competition and Markets Authority. A person is not required to answer a question if it would breach legal professional privilege or reveal evidence of an offence.
As the noble Lord, Lord Clement-Jones, pointed out, EU adequacy was mentioned by almost everybody, and concerns were raised that the Bill would impact our adequacy agreement with the EU. The Government believe that our reforms are compatible with maintaining our data adequacy decisions from the EU. While the Bill removes the more prescriptive elements of the GDPR, the UK will maintain its high standards of data protection and continue to have one of the closest regimes to the EU in the world after our reform. The test for EU adequacy set out by the Court of Justice of the European Union in the cases relating to adequacy decisions requires essential equivalence to the level of protection under the GDPR. It does not require a third country to have exactly the same rules as the EU in order to be considered adequate. Indeed, 14 countries have EU adequacy, including Japan, New Zealand and Canada. All of these nations pursue independent and often divergent approaches to data protection.
Regarding our national security practices, in 2020 and 2021, the European Commission carried out a thorough assessment of the UK’s legislation and regulatory framework for personal data, including access by public authorities for national security purposes. It assessed that the UK provides an adequate level of data protection. We maintain an ongoing dialogue with the EU and have a positive, constructive relationship. We will continue to engage regularly with the EU to ensure our reforms are understood.
A great many noble Lords rightly commented on AI regulation, or the lack of it, in the Bill. Existing data protection legislation—the UK GDPR and the Data Protection Act 2018—regulate the development of AI systems and other technologies to the extent that there is personal data involved. This means that the ICO will continue to play an important role in applying the AI principles as they relate to matters of privacy and data protection. The Government’s view is that it would not be effective to regulate the use of AI in this context solely through the lens of data protection.
Article 22 of the UK GDPR is currently the primary piece of UK law setting out the requirements related to automated decision-making, and this Bill sets out the rights that data subjects have to be informed about significant decisions that are taken about them through solely automated means, to seek human review of those decisions and to have them corrected. This type of activity is, of course, increasingly AI-driven, and so it is important to align these reforms with the UK’s wider approach to AI governance that has been published in the White Paper developed by the Office for Artificial Intelligence. This includes ensuring terms such as “meaningful human involvement” remain up to date and relevant, and the Bill includes regulation-making powers to that effect. The White Paper on the regulation of AI commits to a principles-based approach that supports innovation, and we are considering how the framework will apply to the various actors in the AI development and deployment life cycle, with a particular focus on foundation models. We are analysing the views we heard during the White Paper consultation. We will publish a response imminently, and we do not want to get ahead of that process at this point.
I turn to the protection of children. Once again, I thank noble Lords across the House for their powerful comments on the importance of protecting children’s data, including in particular the noble Baroness, Lady Kidron. On the very serious issue of data preservation orders, the Government continue to make it clear—both in public, at the Dispatch Box, and in private discussions—that we are firmly on the side of the bereaved parents. We consider that we have acted in good faith, and we all want the same outcomes for these families struck by tragedy. We are focused on ensuring that no parent is put through the same ordeal as these families in the future.
I recognise the need to give families the answers they require and to ensure there is no gap in the law. Giving families the answers they need remains the Government’s motivation for the amendment in the other place; it is the reason we will ensure that the amendment is comprehensive and is viewed as such by the families. I reassure the House that the Government have heard and understand the concerns raised on this issue, and that is why the Secretary of State, along with Justice Ministers, will work with noble Lords ahead of Committee and carefully listen to their arguments on potential amendments.
I also hear the concerns of the right reverend Prelate the Bishop of St Albans, the noble Lord, Lord Vaux, and the noble Baroness, Lady Young, on surveillance, police powers and police access to data. Abolishing the Surveillance Camera Commissioner will not reduce data protection. The role overlaps with other oversight bodies, which is inefficient and confusing for police and the public. The Bill addresses the duplication, which means that the ICO will continue to regulate data processing across all sectors, including policing. The aim is to improve effective independent oversight, which is key to public confidence. Simplification through consolidation improves consistency and guidance on oversight, makes the most of the available expertise, improves organisational resilience, and ends confusing and inefficient duplication.
The Government also have a responsibility to safeguard national security. The reports into events such as the Manchester Arena and Fishmongers’ Hall terrorist incidents have clearly noted that better joined-up working between the intelligence services and law enforcement supports that responsibility. This is why the Bill creates the power for designation notices to be issued, enabling joint controllerships between the intelligence services and law enforcement. To grant such a notice, the Secretary of State must consider the processing contained in it to be required for the purpose of safeguarding national security. This mirrors the high threshold for interference with the right to privacy under Article 8 of the Human Rights Act, which requires that such interference be in accordance with the law and necessary in a democratic society.
Concerns were raised by, among others, the noble Baronesses, Lady Young and Lady Bennett, and the noble Lords, Lord Sikka and Lord Bassam, on the proportionality of the measure helping the Government to tackle both fraud and error. Despite taking positive steps to reduce these losses, the DWP remains reliant on powers derived from legislation that is in part over 20 years old. The DWP published the fraud plan in May 2022. It set out clearly a number of new powers that it would seek to secure when parliamentary time allowed. Tackling fraud and error in the DWP is a priority for the Government but parliamentary time is tight. In the time available, the DWP has prioritised our key third-party data-gathering measure which will help to tackle one of the largest causes of fraud and error in the welfare system. We remain committed to delivering all the legislation outlined in the DWP’s fraud plan when parliamentary time allows.
To develop and test these new proposals, the DWP has been working closely with the industry, which recognises the importance of modernising and strengthening these powers to enable us to better detect fraud and error in the benefit system. This includes collaboration on the practical design, implementation and delivery of this measure, including establishing a working group with banks and the financial industry. The DWP has also regularly engaged with UK Finance as well as individual banks, building societies and fintechs during the development of this measure, and continues to do so. It is of course important that where personal data is involved there are appropriate checks and balances. Organisations have a right to appeal against the requirement to comply with a data notice issued by the DWP.
Through our appeal process, the Government would first seek to resolve all disputes by DWP internal review. If this failed, the appeal would be referred to the First-tier Tax Tribunal, as is currently done in similar circumstances by HMRC. The third-party data-gathering powers that the DWP is taking are broad only to the extent needed to ensure that they can be future-proofed. This is because the nature of fraud has changed significantly in recent years and continues to change. The current powers that the DWP has are not sufficient to tackle the new kinds of fraud that we are now seeing in the welfare system. We are including all benefits to ensure that benefits such as the state pension retain low rates of fraud. The DWP will of course want to focus this measure on addressing areas with a significant fraud or error challenge. The DWP has set out in its fraud plan how it plans to focus the new powers, which in the first instance will be on fraud in universal credit.
I thank noble Lords, particularly the noble Lord, Lord Vaux, for the attention paid to the department’s impact assessment, which sets out the details of this measure and all the others in the Bill. As he notes, it is substantive and thorough and was found to be such by the Regulatory Policy Committee, which gave it a green rating.
I hope that I have responded to most of the points raised by noble Lords today. I look forward to continuing to discuss these and other items raised.
I would like some clarification. The Minister in the other place said:
“I agree, to the extent that levels of fraud in state pensions being currently nearly zero, the power is not needed in that case. However, the Government wish to retain an option should the position change in the future”.—[Official Report, Commons, 29/11/23; col. 912.]
Can the noble Viscount explain why the Government still want to focus on recipients of state pension given that there is virtually no fraud? That is about 12.6 million people, so why?
Although fraud in the state pension is proportionately very low, it is still there. That will not be the initial focus, but the purpose is to future-proof the legislation rather than to have to keep coming back to your Lordships’ House.
Let me once again thank all noble Lords for their contributions and engagement. I look forward to further and more detailed debates on these matters and more besides in Committee. I recognise that there are strong views and it is a wide-ranging Bill, so there will be a lot of meat in our sandwich.
I congratulate the noble Lord, Lord de Clifford, on his perfectly judged maiden speech. I thoroughly enjoyed his description of his background and his valuable contributions on the Bill, and I welcome him to this House.
Finally, on a lighter note, I take this opportunity to wish all noble Lords—both those who have spoken in this debate and others—a very happy Christmas and a productive new year, during which I very much look forward to working with them on the Bill.
That the bill be committed to a Grand Committee, and that it be an instruction to the Grand Committee that they consider the bill in the following order:
Clauses 1 to 5, Schedule 1, Clause 6, Schedule 2, Clauses 7 to 14, Schedule 3, Clauses 15 to 24, Schedule 4, Clause 25, Schedules 5 to 7, Clauses 26 to 46, Schedule 8, Clauses 47 to 51, Schedule 9, Clauses 52 to 117, Schedule 10, Clauses 118 to 128, Schedule 11, Clauses 129 to 137, Schedule 12, Clause 138, Schedule 13, Clauses 139 to 142, Schedule 14, Clause 143, Schedule 15, Clauses 144 to 157, Title.
(8 months, 2 weeks ago)
Grand Committee
My Lords, we are beginning rather a long journey—at least, it feels a bit like that. I will speak to Amendments 1, 5 and 288, and the Clause 1 stand part notice.
I will give a little context about Clause 1. In a recent speech, the Secretary of State said something that Julia Lopez repeated this morning at a conference I was at:
“The Data Bill that I am currently steering through Parliament with my wonderful team of ministers”—
I invite the Minister to take a bow—
“is just one step in the making of this a reality—on its own it will add £10 billion to our economy and most crucially—we designed it so that the greatest benefit would be felt by small businesses across our country. Cashing in on a Brexit opportunity that only we were prepared to take, and now those rewards are going to be felt by the next generation of founders and business owners in local communities”.
In contrast, a coalition of 25 civil society organisations wrote to the Secretary of State, calling for the Bill to be dropped. The signatories included trade unions as well as human rights, healthcare, racial justice and other organisations. On these Benches, we share the concerns about the government proposals. They will seriously weaken data protection rights in the UK and will particularly harm people from marginalised communities.
So that I do not have to acknowledge them at every stage of the Bill, I will now thank a number of organisations. I am slightly taking advantage of the fact that our speeches are not limited but will be extremely limited from Monday onwards—the Minister will have 20 minutes; I, the noble Baroness, Lady Jones, and colleagues will have 15; and Back-Benchers will have 10. I suspect we are into a new era of brevity, but I will take advantage today, believe me. I thank Bates Wells, Big Brother Watch, Defend Digital Me, the Public Law Project, Open Rights Group, Justice, medConfidential, Chris Pounder, the Data & Marketing Association, CACI, Preiskel & Co, AWO, Rights and Security International, the Advertising Association, the National AIDS Trust, Connected by Data and the British Retail Consortium. That is a fair range of organisations that see flaws in the Bill. We on these Benches agree with them and believe that it greatly weakens the existing data protection framework. Our preference, as we expressed at Second Reading, is that the Bill is either completely revised on a massive scale or withdrawn in the course of its passage through the Lords.
I will mention one thing; I do not think the Government are making any great secret of it. The noble Baroness, Lady Kidron, drew my attention to the Keeling schedule, which gives the game away on Section 2(2). The Information Commissioner will no longer have to pay regard to certain aspects of the protection of personal data—all the relevant words have been deleted, which is quite extraordinary. It is clear that the Bill will dilute protections around personal data processing, reducing the scope of data protected by the safeguards within the existing law. In fact, the Bill gives more power to data users and takes it away from the people the data is about.
I am particularly concerned about the provisions that change the definition of personal data and the purposes for which it can be processed. There is no need to redraft the definitions of personal data, research or the boundaries of legitimate interests. We have made it very clear over a period of time that guidance from the ICO would have been adequate in these circumstances, rather than a whole piece of primary legislation. The recitals are readily available for guidance, and the Government should have used them. More data will be processed, with fewer safeguards than currently permitted, as it will no longer meet the threshold of personal data, or it will be permitted under the new recognised legitimate interest provision, which we will debate later. That combination is a serious threat to privacy rights in the UK, and that is the context of a couple of our probing amendments to Clause 1—I will come on to the clause stand part notice.
As a result of these government changes, data in one organisation’s hands may be anonymous, while that same information in another organisation’s hands can be personal data. The factor that determines whether personal data can be reidentified is whether the appropriate organisational measures and technical safeguards exist to keep the data in question separate from the identity of specific individuals. That is a very clear decision by the CJEU; the case is SRB v EDPS, if the Minister is interested.
The ability to identify an individual indirectly with the use of additional information is due to the lack of appropriate organisational and technical measures. If the organisation had appropriate measures that separated data into different silos, it would not be able to use the additional information to identify such an individual. The language of technical and organisational measures is used in the definition of pseudonymisation in Clause 1(3)(d), which refers to “indirectly identifiable” information. If such measures existed, the data would be properly pseudonymised, in which case it would no longer be indirectly identifiable.
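To illustrate the distinction being drawn, here is a minimal sketch of keyed pseudonymisation, assuming an invented patient record and a key held by a separate custodian; it reflects no specific NHS or ICO implementation.

```python
import hmac
import hashlib

# Illustrative only: in practice the key would be held by a separate
# custodian (a different silo), so the analytics side cannot reverse
# the mapping on its own.
PSEUDONYMISATION_KEY = b"kept-in-a-separate-silo"  # hypothetical key

def pseudonymise(record: dict) -> dict:
    """Replace the direct identifier with a keyed hash (HMAC-SHA256).

    Whether the output is 'indirectly identifiable' then turns on the
    organisational measures: who holds the key, and whether indirect
    identifiers such as postcode or birth year are also treated.
    """
    token = hmac.new(
        PSEUDONYMISATION_KEY,
        record["patient_id"].encode(),
        hashlib.sha256,
    ).hexdigest()
    pseudonymised = dict(record)
    pseudonymised["patient_id"] = token[:16]  # truncated for readability
    return pseudonymised

# Hypothetical record: note the indirect identifiers that survive.
print(pseudonymise({"patient_id": "NHS-943-476-5919",
                    "postcode": "SW1A 0AA",
                    "birth_year": 1958}))
```

The design point is that the same record is or is not personal data depending on who holds the key, which is precisely the instability in the Clause 1 definition being probed here.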
A lot of this depends on how data-savvy organisations are, so those that are not well organised and do not have the right technology will get a free pass. That cannot be right, so I hope the Minister will respond to that point. We need to make sure that personal data remains personal data, even if some may claim it is not.
Regarding my Amendment 5, can the Government explicitly confirm that personal data that is
“pseudonymised in part, but in which other indirect identifiers remain unaltered”
will remain personal data after this clause is passed? Can the Government also confirm that, if an assessment is made that some data is not personal data but that assessment is later shown to be incorrect, the data will have been personal data at all times and should be treated as such by controllers, processors and the Information Commissioner, about whom we will talk when we come to the relevant clauses?
Amendment 288 simply asks the Government for an impact assessment. If they are so convinced that the definition of personal data will change, they should be prepared to submit to some kind of impact assessment after the Bill comes into effect. Those are probing amendments, and it would be useful to know whether the Government have any intention to assess what the impact of their changes to the Bill would be if they were passed. More importantly, we believe broadly that Clause 1 is not fit for purpose, and that is why we have tabled the clause stand part notice.
As we said, this change will erode people’s privacy en masse. The impacts could include more widespread use of facial recognition and an increase in data processing with minimal safeguards. In the context of facial recognition, the threshold for personal data would be met only if the data subject is on a watchlist and is therefore identified. If an individual is not on a watchlist and their images are deleted after being checked against it, the data may not be considered personal and so would not qualify for data protection obligations.
People’s information could be used to train AI without their knowledge or consent. Personal photos scraped from the internet and stored to train an algorithm would no longer be seen as personal data, as long as the controller does not recognise the individual, is not trying to identify them and will not process the data in such a way that would identify them. The police would have increased access to personal information. Police and security services will no longer have to go to court if they want access to genetic databases; they will be able to access the public’s genetic information as a matter of routine.
Personal data should be defined by what type of data it is, not by how easy it is for a third party to identify an individual from it. That is the bottom line. Replacing a stable, objective definition that grants rights to the individual with an unstable, subjective definition that determines the rights an individual has over their data according to the capabilities of the processor is illogical, complex, bad law-making. It is contrary to the very premise of data protection law, which is founded upon personal data rights. We start on the wrong foot in Clause 1, and it continues. I beg to move.
My Lords, I rise to speak in favour of Amendments 1 and 5 in this group and with sympathy towards Amendment 4. The noble Lord, Lord Clement-Jones, will remember when I was briefly Minister for Health. We had lots of conversations about health data. One of the things we looked at was a digitised NHS. It was essential if we were to solve many problems of the future and have a world-class NHS, but the problem was that we had to make sure that patients were comfortable with the use of their data and the contexts in which it could be used.
When we were looking to train AI, it was important that we made sure that the data was as anonymous as possible. For example, we looked at things such as synthetic and pseudonymised data. There is another point: having done the analysis and looked at the dataset, if you see an identifiable group of people who may well be at risk, how can you reverse-engineer that data so that those patients can be notified and contacted for further medical interventions?
I know that that makes it far too complicated; I just wanted to rise briefly to support the noble Lord, Lord Clement-Jones, on this issue. It is essential that users—patients, and their counterparts in other spheres as well—have absolute confidence that their data is theirs and are given the opportunity to give permission or to opt out as much as possible.
One of the things that I said when I was briefly a Health Minister was that we can have the best digital health system in the world, but it is no good if people choose to opt out or do not have confidence in it. We need to make sure that the Bill gives patients that confidence about how their data is used, in health and in other areas. We need to toughen this bit up. That is why I support Amendments 1 and 5 in the name of the noble Lord, Lord Clement-Jones.
My Lords, anonymisation of data is crucially important in this debate. I want to see, through the Bill, a requirement for personal data, particularly medical data, to be held within trusted research environments. This is a well-developed technique and Britain is the leader. It should be a legal requirement. I am not quite sure that we have got that far in the Bill; maybe we will need to return to the issue on Report.
The extent to which pseudonymisation—I cannot say it—is possible is vastly overrated. There is a sport among data scientists of being able to spot people within generally available datasets. For example, the data available to TfL through people’s use of Oyster cards and so on tells you an immense amount about individuals. Medical data is particularly susceptible to this, although the problem is not restricted to medical data. I will cite a simple example from publicly available data.
My Lords, I, too, support the amendments in the name of the noble Lord, Lord Clement-Jones. As this is the first time I have spoken during the passage of the Bill, I should also declare my interests, but it seems that all the organisations I am involved in process data, so I refer the Committee to all the organisations in my entry in the register of interests.
I want to tell a story about the challenges of distinguishing between personal data and pseudonymised data. I apologise for bringing everyone back to the world of Covid, but that was when I realised how possible it is to track down individuals without any of their personal data. Back in November or December 2020, when the first variant of Covid, the Kent variant, was spreading, one test that was positive for the Kent variant came with no personal details at all. The individual who had conducted that test had not filled in any of the information. I was running NHS Test and Trace and we had to try to find that individual, in a very public way. In the space of three days, with literally no personal information—no name, address or sense of where they lived—the team was able to find that human being. Through extraordinary ingenuity, it tracked them down based on the type of tube the test went into—the packaging that was used—and by narrowing the geography down to a small number of postcodes. We needed to find the person not only because they might have been ill and in need of help but because we needed to identify all their contacts.
I learned that it was possible to find that one human being, out of a population of 60 million, within three days and without any of their personal information. I tell this story because my noble friend Lord Kamall made such an important point: at the heart of data legislation is the question of how you build trust in the population. We have to build on firm foundations if the population are to trust that sharing data is hugely valuable to society. A data Bill that does not have firm foundations in absolutely and concretely defining personal data has a fatal flaw.
Personal data being subjective, as the noble Lord, Lord Clement-Jones, so eloquently set out, immediately starts citizens on a journey of distrusting this world. There is so much in this world that is hard to trust, and I feel strongly that we have to begin with some very firm foundations. They will not be perfect, but we need to go back to a solid definition of “personal data”, which is why I wholeheartedly support the noble Lord’s amendments.
My Lords, I hesitate to make a Second Reading speech, and I know that the noble Lord, Lord Clement-Jones, cannot resist rehearsing these points. However, it is important, at the outset of Committee, to reflect on the Bill in its generality, and the noble Lord did a very good job of precisely that. This is fundamental.
The problem for us with the Bill is not just that it is a collection of subjects—of ideas about how data should be handled, managed and developed—but that it is flawed from the outset. It is a hotchpotch of things that do not really hang together. Several of us have chuntered away in the margins and suggested that it would have been better if the Bill had fallen and there had been a general election—not that the Minister can comment on that. But it would be better, in a way. We need to go back to square one, and many in the Committee are of a like mind.
The noble Baroness, Lady Harding, made a good point about data management, data control and so on. Her example was interesting, because this is about building trust, having confidence in data systems and managing data in the future. Her example was very good, as was that of the noble Lord, Lord Davies, who raised a challenge about how the anonymisation, or pseudonymisation, of data will work and how effective it will be.
We have two amendments in this group. Taken together, they are designed to probe exactly what the practical impacts will be of the proposed changes to Section 3 of the 2018 Act and the insertion of new Section 3A. Amendment 4 calls for the Secretary of State to publish an assessment of the changes within two months of the Bill passing, while Amendment 301 would ensure that the commencement of Clause 1 takes place no earlier than that two-month period. Noble Lords might think this is unduly cautious, but, given our wider concerns about the Bill and its departure from the previously well-understood—
My Lords, a Division having been called, we will adjourn for 10 minutes and resume at 4.48 pm.
As I was saying, it is important for the framework on data protection that we take a precautionary approach. I hope that the Minister will this afternoon be able to provide a plain English explanation of the changes, as well as giving us an assurance that those changes to definitions do not result in watering down the current legislation.
We broadly support Amendments 1 and 5 and the clause stand part notice, in the sense that they provide additional probing of the Government’s intentions in this area. We can see that the noble Lord, Lord Clement-Jones, is trying with Amendment 1 to bring some much-needed clarity to the anonymisation issue and, with Amendment 5, to secure that pseudonymised data remains personal data in any event. I suspect that the Minister will tell us this afternoon that that is already the case, but a significant number of commentators have questioned this, since the definition of “personal data” is seemingly moving away from the EU GDPR standard towards a definition that is more subjective from the perspective of the controller, processor or recipient. We must be confident that the new definition does not narrow the circumstances in which information is protected as personal data. That will be an important test for this Committee to apply.
Amendment 288, tabled by the noble Lord, Lord Clement-Jones, seeks a review and an impact assessment of the anonymisation and identifiability of data subjects. Examining that in the light of the EU GDPR seems to us to be a useful and novel way of making a judgment over which regime better suits and serves data subjects.
We will listen with interest to the Minister’s response. We want to be more than reassured that the previous high standards and fundamental principles of data protection will not be undermined and compromised.
I thank all noble Lords who have spoken in this brief, interrupted but none the less interesting opening debate. I will speak to the amendments tabled by the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Jones; I note that I plan to use that form of words quite a lot in the next eight sessions on this Bill. I thank them for tabling these amendments so that we can debate what are, in the Government’s view, the significant benefits of Clause 1.
In response to the points from the noble Lord, Lord Clement-Jones, on the appetite for the reforms in the Bill, we take very seriously the criticisms of the parties that he mentioned—the civil society groups—but it is important to note that, when the Government consulted on these reforms, we received almost 3,000 responses. At that time, we proposed to clarify when data would be regarded as anonymous and proposed legislating to confirm that the test for whether anonymous data can be reidentified is relative to the means available to the controller to reidentify the data. The majority of respondents agreed that greater clarity in legislation would indeed be beneficial.
As noble Lords will know, the UK’s data protection legislation applies only to personal data, which is data relating to an identified or identifiable living individual. It does not apply to non-personal, anonymous data. This is important because, if organisations can be sure that the data they are handling is anonymous, they may be able to more confidently put it to good use in important activities such as research and product development. The current data protection legislation is already clear that a person can be identified in a number of ways by reference to details such as names, identification numbers, location data and online identifiers, or via information about a person’s physical, genetic, mental, economic or cultural characteristics. The Bill does not change the existing legislation in this respect.
With regard to genetic information, which was raised by my noble friend Lord Kamall and the noble Lord, Lord Davies, any information that includes enough genetic markers to be unique to an individual is personal data and special category genetic data, even if names and other identifiers have been removed. This means that it is subject to the additional protections set out in Article 9 of the UK GDPR. The Bill does not change this position.
However, the existing legislation is unclear about the specific factors that a data controller must consider when assessing whether any of this information relates to an identifiable living person. This uncertainty is leading to inconsistent application of anonymisation and to anonymous data being treated as personal data out of an abundance of caution. This, in turn, reduces the opportunities for anonymous data to be used effectively for projects in the public interest. It is this difficulty that Clause 1 seeks to address by providing a comprehensive statutory test on identifiability. The test will require data controllers and processors to consider the likelihood of people within or outside their organisations reidentifying individuals using reasonable means. It is drawn from recital 26 of the EU GDPR and should therefore not be completely unfamiliar to most organisations.
I turn now to the specific amendments that have been tabled in relation to this clause. Amendment 1 in the name of the noble Lord, Lord Clement-Jones, would reiterate the position currently set out in the UK GDPR and its recitals: where individuals can be identified without the use of additional information because data controllers fail to put in place appropriate organisational measures, such as technical or contractual safeguards prohibiting reidentification, they would be considered directly identifiable. Technical and organisational measures put in place by organisations are factors that should be considered, alongside others, under new Section 3A of the Data Protection Act when assessing whether an individual is identifiable from the data being processed. Clause 1 sets out the threshold at which an individual is identifiable from data—and therefore when data is personal data—and clarifies when data is anonymous.
On the technical capabilities of a given data controller, these are already relevant factors under current law and ICO guidance in determining whether data is personal. This means that the test of identifiability is already a relative one today in respect of the data controller, the data concerned and the purpose of the processing. However, the intention of the data controller is not a relevant factor under current law, nor does Clause 1 make it one. Clause 1 merely clarifies the position under existing law and follows very closely the wording of recital 26. Let me state this clearly: nothing in Clause 1 introduces the subjective intention of the data controller as a relevant factor in determining identifiability, and the position will remain the same as under the current law and as set out in ICO guidance.
In response to the points made by the noble Lord, Lord Clement-Jones, and others on pseudonymised personal data, noble Lords may be aware that the definition of personal data in Article 4(1) of the UK GDPR, when read in conjunction with the definition of pseudonymisation in Article 4(5), makes it clear that pseudonymised data is personal data, not anonymous data, and is thus covered by the UK’s data protection regime. I hope noble Lords are reassured by that. I also hope that, for the time being, the noble Lord, Lord Clement-Jones, will agree to withdraw his amendment and not press the related Amendment 5, which seeks to make it clear that pseudonymised data is personal data.
Amendment 4 would require the Secretary of State, within two months of the Bill passing, to assess the difference in meaning and scope between the current statutory definition of personal data and the new statutory definition that the Bill will introduce. Similarly, Amendment 288 seeks a review of the impact of Clause 1 six months after the enactment of the Bill. The Government feel that neither of these amendments is necessary, as the clause is drawn from recital 26 of the EU GDPR and case law and, as I have already set out, does not seek substantially to change the definition of personal data. Rather, it seeks to provide clarity in legislation.
I follow the argument, but what we are suggesting in our amendment is some sort of impact assessment for the scheme, including how it currently operates and how the Government wish it to operate under the new legislation. Have the Government undertaken a desktop exercise or any sort of review of how the two pieces of legislation might operate? Has any assessment of that been made? If they have done so, what have they found?
Obviously, the Bill has been in preparation for some time. I completely understand the point, which is about how we can be so confident in these claims. I suggest that I work with the Bill team to get an answer to that question and write to Members of the Committee, because it is a perfectly fair question to ask what makes us so sure.
In the future tense, I can assure noble Lords that the Department for Science, Innovation and Technology will monitor and evaluate the impact of this Bill as a whole in the years to come, in line with cross-government evaluation guidance and through continued engagement with stakeholders.
The Government feel that the first limb of Amendment 5 is not necessary given that, as has been noted, pseudonymised data is already considered personal data under this Bill. In relation to the second limb of the amendment, if the data being processed is actually personal data, the ICO already has powers to require organisations to address non-compliance. These include requiring an organisation to apply appropriate protections to the personal data it is processing, and they are backed up by robust enforcement mechanisms.
That said, it would not be appropriate for the processing of data that was correctly assessed as anonymous at the time of processing to retrospectively be treated as processing of personal data and subject to data protection laws, simply because it became personal data at a later point in the processing due to a change in circumstances. That would make it extremely difficult for any organisation to treat any dataset as anonymous and would undermine the aim of the clause, significantly reducing the potential to use anonymous data for important research and development activities.
My Lords, I thank the noble Lords, Lord Kamall, Lord Davies of Brixton and Lord Bassam, and the noble Baroness, Lady Harding, for their support for a number of these amendments. Everybody made a common point about public trust, particularly in the context of health data.
As the noble Lord, Lord Kamall, said, we had a lot of conversations during the passage of the Health and Care Act and the noble Lord and his department increasingly got it: proper communication about the use of personal, patient data is absolutely crucial to public trust. We made quite a bit of progress with NHSE and the department starting to build in safeguards and develop the concept of access to, rather than sharing of, personal data. I heard what the noble Lord, Lord Davies, said about a locked box and I think that having access for research, rather than sharing data around, is a powerful concept.
I found what the Minister said to be helpful. I am afraid that we will have to requisition a lot of wet towels during the passage of the Bill. There are a number of aspects to what he said, but the bottom line is that he is saying that there is no serious divergence from the current definition of personal data. The boot is on the other foot: where is the Brexit dividend? The Minister cannot have it both ways.
I am sure that, as we go through this, the Minister will say, “It’s all in recital 26”, and my response will be that the ICO could easily have developed guidance based on that. That would have been splendid; we would not have had to go through the agony of contending with this data protection Bill, which raises all these issues and creates a great deal of angst. There are 26 organisations, maybe more—42, I think—writing to the Secretary of State about one aspect of it or another. The Government have really created a rod for their own back, when they could have produced an awful lot of guidance, included a bit on digital identity in the Bill and done something on cookies. What would there have been not to like? As I say, the Government have created a rod for their own back.
As regards pseudonymised data, that is also helpful. We will hold the Minister to that as we go through, if the Minister is saying that that is personal data. I am rather disappointed by the response to Amendment 5, but I will take a very close look at it with several wet towels.
We never know quite whether CJEU judgments will be treated as precedent by this Government or where we are under the REUL Act. I could not tell you at this moment. However, it seems that the Minister is again reassuring us that the CJEU’s judgments on personal data are valid and are treated as being part of UK law for this purpose, which is why there is no change to the definition of personal data as far as he is concerned. All he is doing is importing the recitals into Clause 1. I think I need to read the Minister’s speech pretty carefully if I am going to accept that. In the meantime, we move on. I beg leave to withdraw the amendment.
My Lords, I speak to Amendments 2, 3, 9 and 290 in my name. I thank the noble Baronesses, Lady Jones and Lady Harding, and the noble Lord, Lord Clement-Jones, for their support.
This group seeks to secure the principle that children should enjoy the same protections in UK law after this Bill passes into law as they do now. In 2018, this House played a critical role in codifying the principle that children merit special, specific protection in relation to data privacy by introducing the age-appropriate design code into the DPA. Its introduction created a wave of design changes to tech products: Google introduced safe search as its default; Instagram made it harder for adults to contact children via private messaging; Play Store stopped making adult apps available to under-18s; and TikTok stopped sending notifications through the night and hundreds of thousands of underage children were denied access to age-inappropriate services. These are just a handful of the hundreds of changes that have been made, many of them rolled out globally. The AADC served as a blueprint for children’s data privacy, and its provisions have been mirrored around the globe. Many noble Lords will have noticed that, only two weeks ago, Australia announced that it is going to follow the many others who have incorporated or are currently incorporating it into their domestic legislation, saying in the press release that it would align as closely as possible with the UK’s AADC.
As constructed in the Data Protection Act 2018, the AADC sets out the requirements of the UK GDPR as they relate to children. The code is indirectly enforceable; that is to say that the action the ICO can take against those failing to comply is based on the underlying provisions of UK GDPR, which means that any watering down, softening of provisions, unstable definitions—my new favourite—or legal uncertainty created by the Bill automatically waters down, softens and creates legal uncertainty and unstable definitions for children and therefore for child protection. I use the phrase “child protection” deliberately because the most important contribution that the AADC has made at the global level was the understanding that online privacy and safety are interwoven.
Clause 1(2) creates an obligation on the controller or processor to know, or reasonably to know, that an individual is an identifiable living individual. Amendments 2 and 3 would add a further requirement to consider whether that living individual is a child. This would ensure that providers cannot wilfully ignore the presence of children, something that tech companies have a long track record of doing. I want to quote the UK Information Commissioner, who fined TikTok £12.7 million for failing to prevent under-13s accessing that service; he said:
“There are laws in place to make sure our children are as safe in the digital world as they are in the physical world. TikTok did not abide by those laws … TikTok should have known better. TikTok should have done better … They did not do enough to check who was using their platform”.
I underline very clearly that these amendments would not introduce any requirement for age assurance. The ICO’s guidance on age assurance in the AADC and the provisions in the Online Safety Act already detail those requirements. The amendments simply confirm the need to offer a child a high bar of data privacy or, if you do not know which of your users are children, offer all users that same high bar of data privacy.
As we have just heard, it is His Majesty’s Government’s stated position that nothing in the Bill lessens children’s data privacy because nothing in the Bill lessens UK GDPR, and that the Bill is merely an exercise to reduce unnecessary bureaucracy. The noble Lords who spoke on the first group have perhaps put paid to that and I imagine that this position will be sorely tested during Committee. In the light of the alternative view that the protections afforded to children’s personal data will decline as a result of the Bill, Amendment 9 proposes that the status of children’s personal data be elevated to that of “sensitive personal data”, or special category data. The threshold for processing special category data is higher than for general personal data and the specific conditions include, for example, processing with the express consent of the data subject, processing to pursue a vital interest, processing by not-for-profits or processing for legal claims or matters of substantial public interest. Bringing children’s personal data within that definition would elevate the protections by creating an additional threshold for processing.
Finally, Amendment 290 enshrines the principle that nothing in the Bill should lead to a diminution in existing levels of privacy protections that children currently enjoy. It is essentially a codification of the commitment made by the Minister in the other place:
“The Bill maintains the high standards of data protection that our citizens expect and organisations will still have to abide by our age-appropriate design code”.—[Official Report, Commons, 17/4/23; col. 101.]
Before I sit down, I just want to highlight figures from the Harvard Gazette, which looked at ad revenue from the perspective of children. On Instagram, children account for 16% of ad revenue; on YouTube, 27%; on TikTok, 35%; and on Snap, an extraordinary 41.4%. Collectively, YouTube, Instagram and Facebook made nearly $2 billion from children aged nought to 12, and it will not escape many noble Lords that children aged nought to 12 are not supposed to be on those platforms. Instagram, YouTube and TikTok together made more than $7 billion from 13 to 17 year-olds. The amendments in this group give a modicum of protection to a demographic who have no electoral capital, who are not developmentally adult and for whom the lack of care is not an unfortunate by-product of the business model but a feature of it: their data is routinely extracted, sold, shared and scraped as a significant part of the ad market. It is this that determines the features that deliberately spread, polarise and keep children compulsively online, and it is this that the AADC—born in your Lordships’ House—started a global movement to contain.
This House came together on an extraordinary cross-party basis to ensure that the Online Safety Bill delivered for children, so I say to the Minister: I am not wedded to my drafting, nor to the approach that I have taken to maintain, clause by clause, the bar for children, even when that bar is changed for adults, but I am wedded to holding the tech sector accountable for children’s privacy, safety and well-being. It is my hope and—if I dare—expectation that noble Lords will join me in making sure that the DPDI Bill does not leave this House with a single diminution of data protection for children. To allow such a diminution would be, in effect, to give with one hand and take away with the other.
I hope that during Committee the Minister will come to accept that children’s privacy will be undermined by the Bill, and that he will work with me and others to resolve these issues so that the UK maintains its place as a global leader in children’s privacy and safety. I beg to move.
My Lords, in the nearly nine years that I have been in this House, I have often played the role of bag carrier to the noble Baroness, Lady Kidron, on this issue. In many ways, I am rather depressed that once again we need to make the case that children deserve a higher bar of protection than adults in the digital world. As the noble Baroness set out—I will not repeat it—the age-appropriate design code was a major landmark in establishing that you can regulate the digital world just as you can the physical world. What is more, it is rather joyful that when you do, these extraordinarily powerful tech companies change their products in the way that you want them to.
This is extremely hard-fought ground that we must not lose. It takes us to what feels like a familiar refrain from the Online Safety Act and the Digital Markets, Competition and Consumers Bill, which we are all still engaged in: the question of whether you need to write something in the Bill and whether, by doing so, you make it more clear or less clear.
Does my noble friend the Minister agree with the fundamental principle, enshrined in the Data Protection Act 2018, that children deserve a higher bar of protection in the online world and that children’s data needs to be protected at a much higher level? If we can all agree on that principle first, then the question is: how do we make sure that this Bill does not weaken the protection that children have?
I am trying to remember on which side of the “put it in the Bill or not” debate I have been during discussions on each of the digital Bills that we have all been working on over the last couple of years. We have a really vicious problem where, as I understand it, the Government keep insisting that the Bill does not water down data protection and therefore there is no need to write anything into it to protect children’s greater rights. On the other hand, I also hear that it will remove bureaucracy and save businesses a lot of money. I have certainly been in rooms over the last couple of years where business representatives have told me, not realising I was one of the original signatories to the amendment that created the age-appropriate design code, how dreadful it was because it made their lives much more complicated.
I have no doubt that if we create a sense—which is what it is—that companies do not need to do quite as much as they used to for children in this area, that sense will create, if not a wide-open door, then a door ajar that enables businesses to walk through and take the path of least resistance, which is doing less to protect children. That is why, in this case, I come down on the side of wanting to put it explicitly in the Bill, in whatever wording my noble friend the Minister thinks appropriate, that we are really clear that this creates no change at all in the approach for children and children’s data.
That is what this group of amendments is about. I know that we will come back to a whole host of other areas where there is a risk that children’s data could be handled differently from the way envisaged in that hard-fought battle for the age-appropriate design code but, on this group alone, it would be helpful if my noble friend the Minister could help us establish that firm principle and commit to coming back with wording that will firmly establish it in the Bill.
My Lords, I keep getting flashbacks. This one is to the Data Protection Act 2018, although I think it was 2017 when we debated it. It is one of the huge achievements of the noble Baroness, Lady Kidron, to have introduced, and persuaded the Government to introduce, the age-appropriate design code into the Act, and—as she and the noble Baroness, Lady Harding, described—to see it spread around the world and become the gold standard. It is hardly surprising that she is so passionate about wanting to make sure that the Bill does not water down the data rights of children.
I think the most powerful amendment in this group is Amendment 290. For me, it absolutely encapsulates what we need to do in making sure that nothing in the Bill waters down children’s rights. If I were to choose one of the noble Baroness’s amendments in this group, it would be that one: it would absolutely give the assurance and scotch the point about legal uncertainty created by the Bill.
Both noble Baronesses asked: if the Government are not watering down the Bill, why can they not say that they are not? Why can they not, in a sense, repeat the words of Paul Scully when he was debating the Bill? He said:
“We are committed to protecting children and young people online. The Bill maintains the high standards of data protection that our citizens expect and organisations will still have to abide by our age-appropriate design code”.
He uses “our”, so he is taking full ownership of it. He went on:
“Any breach of our data protection laws will result in enforcement action by the Information Commissioner’s Office”.—[Official Report, Commons, 17/4/23; col. 101.]
I would love that enshrined in the Bill. It would give us a huge amount of assurance.
My Lords, we on the Labour Benches have become co-signatories to the amendments tabled by the noble Baroness, Lady Kidron, and supported by the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Harding. The noble Baroness set out very clearly and expertly the overarching purpose of retaining the level of protection currently afforded by the Data Protection Act 2018. Amendments 2 and 3 specifically stipulate that, where data controllers know, or should reasonably know, that a user is a child, they should be given the data protection codified in that Act. Amendment 9 takes it a stage further and includes children’s data in the definition of sensitive personal data, and gives it the benefit of being treated to a heightened level of protection—quite rightly, too. Finally, Amendment 290—the favourite of the noble Lord, Lord Clement-Jones—attempts to hold Ministers to the commitment made by Paul Scully in the Commons to maintain existing standards of data protection carried over from that 2018 Act.
Why is all this necessary? I suspect that the Minister will argue that it is not needed because Clause 5 already provides for the Secretary of State to consider the impact of any changes to the rights and freedoms of individuals and, in particular, of children, who require special protection.
We disagree with that argument. In the interests of brevity and the spirit of the recent Procedure Committee report, which says that we should not repeat each other’s arguments, I do not intend to speak at length, but we have a principal concern: to try to understand why the Government want to depart from the standards of protection set out in the age-appropriate design code—the international gold standard—which they so enthusiastically signed up to just five or six years ago. Given the rising levels of parental concern over harmful online content and well-known cases highlighting the harms that can flow from unregulated material, why do the Government consider it safe to water down the regulatory standards at this precise moment in time? The noble Baroness, Lady Kidron, valuably highlighted the impact of the current regulatory framework on companies’ behaviour. That is exactly what legislation is designed to do: to change how we look at things and how we work. Why change that? As she has argued very persuasively, it is and has been hugely transformative. Why throw away that benefit now?
My attention was drawn to one example of what can happen by a briefing note from the 5Rights Foundation. As it argued, children are uniquely vulnerable to harm and risk online. I thought its set of statistics was really interesting: by the age of 13, some 72 million data points have already been collected about a child. They are often not used in children’s best interests; for example, the data is often used to feed recommender systems and algorithms that are designed to keep attention at all costs and have been found to push harmful content at children.
When this happens repeatedly over time, it can have catastrophic consequences, as we know. The coroner in the Molly Russell inquest found that she had been recommended a stream of depressive content by algorithms, leading the coroner to rule that she
“died from an act of self-harm whilst suffering from depression and the negative effects of online content”.
We do not want more Molly Russell cases. Progress has already been made in this field; we dispense with it at our peril. Can the Minister explain today the thinking and logic behind the changes that the Government have brought forward? Can he estimate the impact that the new lighter-touch regime, as we see it, will have on child protection? Have the Government consulted extensively with those in the sector who are properly concerned about child protection issues, and what sort of responses have they received?
Finally, why have the Government decided to take a risk with the sound framework that was already in place and built on during the course of the Online Safety Act? We need to hear very clearly from the Minister how they intend to engage with groups that are concerned about these child protection issues, given the apparent loosening of the current framework. The noble Baroness, Lady Harding, said that this is hard-fought ground; we intend to continue making it so because these protections are of great value to our society.
I am grateful to the noble Baroness, Lady Kidron, for her Amendments 2, 3, 9 and 290 and to all noble Lords who have spoken, as ever, so clearly on these points.
All these amendments seek to add protections for children to various provisions in the Bill. I absolutely recognise the intent behind them; indeed, let me take this opportunity to say that the Government take child safety deeply seriously and agree with the noble Baroness that all organisations must take great care, both when making decisions about the use of children’s data and throughout the duration of their processing activities. That said, I respectfully submit that these amendments are not necessary for three main reasons; I will talk in more general terms before I come to the specifics of the amendments.
First, the Bill maintains a high standard of data protection for everybody in the UK, including—of course—children. The Government are not removing any of the existing data protection principles in relation to lawfulness, fairness, transparency, purpose limitation, data minimisation, storage limitation, accuracy, data security or accountability; nor are they removing the provisions in the UK GDPR that require organisations to build privacy into the design and development of new processing activities.
The existing legislation acknowledges that children require specific protection for their personal data, as they may be less aware of the risks, consequences and safeguards concerned, and of their rights in relation to the processing of personal data. Organisations will need to make sure that they continue to comply with the data protection principles on children’s data and follow the ICO’s guidance on children and the UK GDPR, following the changes we make in the Bill. Organisations that provide internet services likely to be accessed by children will need to continue to comply with their transparency and fairness obligations and the ICO’s age-appropriate design code. The Government welcome the AADC, as Minister Scully said, and remain fully committed to the high standards of protection that it sets out for children.
Secondly, some of the provisions in the Bill have been designed specifically with the rights and safety of children in mind. For example, one reason that the Government introduced the new lawful ground of recognised legitimate interest in Clause 5, which we will debate later, was that some consultation respondents said that the current legislation can deter organisations, particularly in the voluntary sector, from sharing information that might help to prevent crime or protect children from harm. The same goes for the list of exemptions to the purpose limitation principle introduced by Clause 6.
There could be many instances where personal data collected for one purpose may have to be reused to protect children from crime or safeguarding risks. The Bill will provide greater clarity around this and has been welcomed by stakeholders, including in the voluntary sector.
While some provisions in the Bill do not specifically mention children or children’s rights, data controllers will still need to carefully consider the impact of their processing activities on children. For example, the new obligations on risk assessments, record keeping and the designation of senior responsible individuals will apply whenever an organisation’s processing activities are likely to result in high risks to people, including children.
Thirdly, the changes we are making in the Bill must be viewed in a wider context. Taken together, the UK GDPR, the Data Protection Act 2018 and the Online Safety Act 2023 provide a comprehensive legal framework for keeping children safe online. Although the data protection legislation and the age-appropriate design code make it clear how personal data can be processed, the Online Safety Act makes clear that companies must take steps to make their platforms safe by design. It requires social media companies to protect children from illegal, harmful and age-inappropriate content, to ensure they are more transparent about the risks and dangers posed to children on their sites, and to provide parents and children with clear and accessible ways to report problems online when they do arise.
After those general remarks, I turn to the specific amendments. The noble Baroness’s Amendments 2 and 3 would amend Clause 1 of the Bill, which relates to the test for assessing whether data is personal or anonymous. Her explanatory statement suggests that these amendments are aimed at placing a duty on organisations to determine whether the data they are processing relates to children, thereby creating a system of age verification. However, requiring data controllers to carry out widespread age verification of data subjects could create its own data protection and privacy risks, as it would require them to retain additional personal information such as dates of birth.
The test we have set out for reidentification is intended to apply to adults and children alike. If any person is likely to be identified from the data using reasonable means, the data protection legislation will apply. Introducing one test for adults and one for children is unlikely to be workable in practice and fundamentally undermines the clarity that this clause seeks to bring to organisations. Whether a person is identifiable will depend on a number of objective factors, such as the resources and technology available to organisations, regardless of whether they are an adult or a child. Creating wholly separate tests for adults and children, as set out in the amendment, would add unnecessary complexity to the clause and potentially lead to confusion.
As I understand it, the basis on which we currently operate is that children get a heightened level of protection. Is the Minister saying that that is now unnecessary and is captured by the way in which the legislation has been reframed?
I am saying, specifically on Clause 1, that separating the identifiability of children and the identifiability of adults would be detrimental to both but particularly, in this instance, to children.
Amendment 9 would ensure that children’s data is included in the definition of special category data and is subject to the heightened protections afforded to this category of data by Article 9 of the UK GDPR. This could have unintended consequences, because the legal position would be that processing of children’s data would be banned unless specifically permitted. This could create the need for considerable additional legislation to exempt routine and important processing from the ban; for example, banning a Girl Guides group from keeping a list of members unless specifically exempted would be disproportionate. However, more sensitive data, such as records relating to children’s health or safeguarding concerns, is already subject to the heightened protections in the UK GDPR whenever such data is processed.
I am grateful to the noble Baroness, Lady Kidron, for raising these issues and for the chance to set out why the Government feel that children’s protection is at least maintained, if not enhanced. I hope my answers have, for the time being, persuaded her of the Government’s view that the Bill does not reduce standards of protection for children’s data. On that basis, I ask her also not to move her Amendment 290 on the grounds that a further overarching statement on this is unnecessary and may cause confusion when interpreting the legislation. For all the reasons stated above, I hope that she will now reconsider whether her amendments in this group are necessary and agree not to press them.
Can I press the Minister more on Amendment 290 from the noble Baroness, Lady Kidron? All it does is seek to maintain the existing standards of data protection for children, as carried over from the 2018 Act. If that is all it does, what is the problem with that proposed new clause? In its current formulation, does it not put the intention of the legislation in a place of certainty? I do not quite get why it would be damaging.
I believe it restates what the Government feel is clearly implied or stated throughout the Bill: that children’s safety is paramount. Therefore, putting it there is either duplicative or confusing; it reduces the clarity of the Bill. In no way is this to say that children are not protected—far from it. The Government feel it would diminish the clarity and overall cohesiveness of the Bill to include it.
My Lords, not to put too fine a point on it, the Minister is saying that nothing in the Bill diminishes children’s rights, whether in Clause 1, Clause 6 or the legitimate interest in Clause 5. He is saying that absolutely nothing in the Bill diminishes children’s rights in any way. Is that his position?
Can I add to that question? Is my noble friend the Minister also saying that there is no risk of companies misinterpreting the Bill’s intentions and assuming that this might be some form of diminution of the protections for children?
In answer to both questions, what I am saying is that, first, any risk of misinterpreting the Bill with respect to children’s safety is diminished, rather than increased, by the Bill. Overall, it is the Government’s belief and intention that the Bill in no way diminishes the safety or privacy of children online. Needless to say, if over the course of our deliberations the Committee identifies areas of the Bill where that is not the case, we will absolutely be open to listening on that, but let me state this clearly: the intent is to at least maintain, if not enhance, the safety and privacy of children and their data.
My Lords, that creates another question, does it not? If that is the case, why amend the original wording from the 2018 Act?
Sorry, the 2018 Act? Or is the noble Lord referring to the amendments?
Why change the wording that provides the protection that is there currently?
I assume the noble Lord is referring to Amendment 290.
Okay. The Government feel that, in terms of the efficient and effective drafting of the Bill, such a paragraph would diminish the clarity of the Bill by being duplicative rather than add to it by making a declaration. For the same reason, we have chosen not to make a series of declarations about the other intentions of the Bill, in the belief that the Bill’s intent and outcome are secured without such a statement.
My Lords, before our break, the noble Baroness, Lady Harding, said that this is hard-fought ground; I hope the Minister understands from the number of questions he has just received during his response that it will continue to be hard-fought ground.
I really regret having to say this at such an early stage on the Bill, but I think that some of what the Minister said was quite disingenuous. We will get to it in other parts of the Bill, but the thing that we have all agreed to disagree on at this point is the statement that the Bill maintains data privacy for everyone in the UK. That is a point of contention between noble Lords and the Minister. I absolutely accept and understand that we will come to a collective view on it in Committee. However, the Minister appeared to suggest—I ask him to correct me if I have got this wrong—that the changes on legitimate interest and purpose limitation are child safety measures because some people are saying that they are deterred from sharing data for child protection reasons. I have to tell him that they are not couched or formed like that; they are general-purpose shifts. There is absolutely no question but that the Government could have made specific changes for child protection, put them in the Bill and made them absolutely clear. I find that very worrying.
I also find it worrying, I am afraid—this is perhaps where we are heading and the thing that many organisations are worried about—that bundling the AADC in with the Online Safety Act and saying, “I’ve got it over here so you don’t need it over there” is not the same as maintaining a high level of data protection for children. It is not the same set of things. I specifically said that this was not an age-verification measure and would not require it; whatever response there was on that was therefore unnecessary because I made that quite clear in my remarks. The Committee can understand that, in order to set a high bar of data protection, you must either identify a child or give that high bar to everyone. Those are your choices. You do not have to verify.
I will withdraw the amendment, but I must say that the Government may not have it both ways. The Bill cannot be different or necessary and at the same time do nothing. The piece that I want to leave with the Committee is that it is the underlying provisions that allow the ICO to take action on the age-appropriate design code. It does not matter what is in the code; if the underlying provisions change, so does the code. During Committee, I expect that there will be a report on the changes that have happened all around the world as a result of the code, and we will be able to measure whether the new Bill would be able to create those same changes. With that, I beg leave to withdraw my amendment.
My Lords, I am going to get rather used to introducing a smorgasbord of probing amendments and stand part notices throughout most of the groups of amendments as we go through them. Some of them try to find out the meaning of areas in the Bill and others are rather more serious and object to whole clauses.
I am extremely sympathetic to the use of personal data for research purposes, but Clause 2, which deals with research, is rather deceptive in many ways. That is because “scientific research” and “scientific research purposes” will now be defined to mean
“any research that can reasonably be described as scientific, whether publicly or privately funded and whether carried out as a commercial or non-commercial activity”.
The rub lies in the words “commercial or non-commercial activity”. A loosening of requirements on purpose limitation will assist commercial and non-commercial organisations in research and reusing personal data obtained from third parties but will do nothing to increase protection for individual data subjects in these circumstances. That is the real Pandora’s box that we are opening as regards commercial activity. It opens the door to Meta to use our personal data for its own purposes under the guise of research. That seems very much to be a backward step. That is why I tabled Amendment 6, which would require the public interest to apply to all uses under this clause, not just public health uses.
Then there is the question of consent under Clause 3. How is the lawful and moral right of patients, constituents or data subjects to dissent from medical research, for instance, enshrined in this clause? We have seen enough issues relating to health data, opt-outs and so on to begin to destroy public trust, if we are not careful. We have to be extremely alert to the fact that the communications have to be right; there has to be the opportunity to opt out.
In these circumstances, Amendment 7 would provide that a data subject has been given the opportunity to express dissent or an objection and has not so expressed it. That is then repeated in Clause 26. Again, we are back to public trust: we are not going to gain it. I am very much a glass-half-full person as far as new technology, AI and the opportunities for the use of patient data in the health service are concerned. I am an enthusiast for that, but it has to be done in the right circumstances.
My Lords, I speak to Amendments 8, 21, 23 and 145 in my name and thank the other noble Lords who have added their names to them. In the interests of brevity, and as the noble Lord, Lord Clement-Jones, has done some of the heavy lifting on this, I will talk first to Amendment 8.
The definition of scientific research has been expanded to include commercial and non-commercial activity, so far as it
“can reasonably be described as scientific”,
but “scientific” is not defined. As the noble Lord said, there is no public interest requirement, so a commercial company can, in reality, develop almost any kind of product on the basis that it may have a scientific purpose, even—or maybe especially—if it measures your propensity to impulse buy or other commercial things. The spectrum of scientific inquiry is almost infinite. Amendment 8 would exclude children simply by adding proposed new paragraph (e), which says that
“the data subject is not a child or could or should be known to be a child”,
so that their personal data cannot be used for scientific research purposes to which they have not given their consent.
I want to be clear that I am pro-research and understand the critical role that data plays in enabling us to understand societal challenges and innovate towards solutions. Indeed, I have signed the amendment in the name of the noble Lord, Lord Bethell, which would guarantee access to data for academic researchers working on matters of public interest. Some noble Lords may have been here last night, when the US Surgeon-General, Vice Admiral Dr Murthy, who gave the Lord Speaker’s lecture, made a fierce argument in favour of independent public interest research, not knowing that such a proposal had been laid. I hope that, when we come to group 17, the Government heed his wise words.
In the meantime, Clause 3 simply embeds the inequality of arms between academics and corporates and extends it, making it much easier for commercial companies to use personal data for research while academics continue to be held to much higher ethical and professional standards. Academics are still required to obtain express consent, to undergo DBS checks and to satisfy complex ethical requirements; for them, not doing so and simply using personal data for research would be unethical, yet commercial players can rely on Clause 3 to process data without consent, in pursuit of profit. Like the noble Lord, Lord Clement-Jones, I would prefer an overall solution to this but, in its absence, this amendment would protect children’s data from being commoditised in this way.
Amendments 21 and 23 would specifically protect children from changes to Clause 6. I have spoken on this a little already, but I would like it on the record that I am absolutely in favour of a safeguarding exemption. The additional purposes, which are compatible with but go beyond the original purpose, are not a safeguarding measure. Amendment 21 would amend the list of factors that a data controller must take into account to include the fact that children are entitled to a higher standard of protection.
Amendment 23 would not be necessary if Amendment 22 were agreed. It would commit the Secretary of State to ensuring that, when exercising their power under new Article 8A, as inserted by Clause 6(5), to add, vary or omit provisions of Annex 2, they take the 2018 Act and children’s data protection into account.
Finally, Amendment 145 proposes a code of practice on the use of children’s data in scientific research. This code would, in contrast, ensure that all researchers, commercial or in the public interest, are held to the same high standards by developing detailed guidance on the use of children’s data for research purposes. A burning question for researchers is how to properly research children’s experience, particularly regarding the harms defined by the Online Safety Act.
Proposed new subsection (1) sets out the broad headings that the ICO must cover to promote good practice. Proposed new subsection (2) confirms that the ICO must have regard to children’s rights under the UNCRC, and that they are entitled to a higher standard of protection. It would also ensure that the ICO consulted academics, those who represent the interests of children, and data scientists. There is something of a theme here: if the changes to the UK GDPR did not diminish data subjects’ privacy and rights, there would be no need for the amendments in this group. If there were a code for independent public interest research, as is so sorely needed, the substance of Amendment 145 could usefully form part of it. If commercial companies can claim the banner of scientific research, which has no definition, if the Bill expands the right to further processing and if the Secretary of State can unilaterally change the basis for onward processing, can the Minister explain, when he responds, how he can claim that the Bill maintains protections for children?
My Lords, I will be brief because I associate myself with everything that the noble Baroness, Lady Kidron, just said. This is where the rubber hits the road from our previous group. If we all believe that it is important to maintain children’s protection, I hope that my noble friend the Minister will be able to accept if not the exact wording of the children-specific amendments in this group then the direction of travel—and I hope that he will commit to coming back and working with us to make sure that we can get wording into the Bill.
I am hugely in favour of research in the private sector as well as in universities and the public sector; we should not close our minds to that at all. We need to be realistic that all the meaningful research in AI is currently happening in the private sector, so I do not want to close that door at all, but I am extremely uncomfortable with a Secretary of State having the ability to amend access to personal data for children in this context. It is entirely sensible to have a defined code of conduct for the use of children's data in research. We have real evidence that a code of conduct setting out how to protect children's rights and data in this space works, so I do not understand why we would not adopt one here: we want the research to happen, but we want children's rights to be protected at a much higher level.
It seems to me that this group is self-evidently sensible, in particular Amendments 8, 22, 23 and 145. I put my name to all of them except Amendment 22 but, the more I look at the Bill, the more uncomfortable I get with it; I wish I had put my name to Amendment 22. We have discussed Secretary of State powers in each of the digital Bills that we have looked at and we know about the power that big tech has to lobby. It is not fair on Secretaries of State in future to have this ability to amend—it is extremely dangerous. I express my support for Amendment 22.
I just want to say that I agree with what the previous speakers have said. I particularly support Amendment 133; in effect, I have already made my speech on it. At that stage, I spoke about pseudonymised data but focused my remarks on scientific research. I suspect that the Minister's assurances will not go far enough, although I do not want to pre-empt what he says and I will listen carefully to it. I am sure that we will have to return to this on Report.
I make a small additional point: I am not as content as the noble Baroness, Lady Harding of Winscombe, about commercial research. Different criteria apply; if we look in more detail at ensuring that research data is protected, there may be special factors relating to commercial research that need to be covered in a potential code of practice or more detailed regulations.
My Lords, I am grateful to all noble Lords who have spoken on this group. Amendment 6 to Clause 2, tabled by the noble Lord, Lord Clement-Jones, rightly tests the boundaries on the use of personal data for scientific research and, as he says, begins to ask, “What is the real purpose of this clause? Is it the clarification of existing good practice or is it something new? Do we fully understand what that new proposition is?”
As he said, there is particular public concern about the use of personal health data where it seems that some private companies are stretching the interpretation of “the public good”, for which authorisation for the use of this data was initially freely given, to something much wider. Although the clause seeks to provide some reassurance on this, we question whether it goes far enough and whether there are sufficient protections against the misuse of personal health data in the way the clause is worded.
This raises the question of whether it is only public health research that needs to be in the public interest, which is the way the clause is worded at the moment, because it could equally apply to research using personal data from other public services, such as measuring educational outcomes or accessing social housing. There is a range of uses for personal data. In an earlier debate, we heard about the plethora of data already held on people, much of which individuals do not understand or know about and which could be used for research or to make judgments about them. So we need to be sensitive about the way this might be used. It would be helpful to hear from the Minister why public health research has been singled out for special attention when, arguably, it should be a wider right across the board.
Noble Lords have asked questions about the wider concerns around Clause 2, which could enable private companies to use personal data to develop new products for commercial benefit without needing to inform the data subjects. As noble Lords have said, this is not what people would normally expect to be described as “scientific research”. The noble Baroness, Lady Kidron, was quite right that it has the potential to be unethical, so we need some standards and some clear understanding of what we mean by “scientific research”.
That is particularly important for Amendments 7 and 132 to 134 in the name of the noble Lord, Lord Clement-Jones, which underline the need for data subjects to be empowered and given the opportunity to object to their data being used for a new purpose. Arguably, without these extra guarantees—particularly because there is a lack of trust about how a lot of this information is being used—data subjects will be increasingly reluctant to hand over personal data on a voluntary basis in the first place. It may well be that this is an area where the Information Commissioner needs to provide additional advice and guidance to ensure that we can reap the benefits of good-quality scientific research that is in the public interest and in which the citizens involved can have absolute trust. Noble Lords around the Room have stressed that point.
Finally, we have added our names to the amendments tabled by the noble Baroness, Lady Kidron, on the use of children's data for scientific research. As she rightly points out, the 2018 Act gave children a higher standard of protection over the uses for which their data is collected and processed. It is vital that this Bill, for all its intent to simplify, and its watering down of preceding rights, does not accidentally put at risk the higher protection agreed for children. In the earlier debate, the Minister said that he believed it will not do so. I am not sure that "believe" is a strong enough word here; we need guarantees that go beyond that. I think this is an issue we will come back to again and again in terms of what is in the Bill and what guarantees exist for that protection.
In particular, there is a concern that relaxing the legal basis on which personal data can be processed for scientific research, including privately funded research carried out by commercial entities, could open the door for children’s data to be exploited for commercial purposes. We will consider the use of children’s data collected in schools in our debate on a separate group but we clearly need to ensure that the handling of pupils’ data by the Department for Education and the use of educational apps by private companies do not lead to a generation of exploited children who are vulnerable to direct marketing and manipulative messaging. The noble Baroness’s amendments are really important in this regard.
I also think that the noble Baroness’s Amendment 145 is a useful initiative to establish a code of practice on children’s data and scientific research. It would give us an opportunity to balance the best advantages of children’s research, which is clearly in the public and personal interest, with the maintenance of the highest level of protection from exploitation.
I hope that the Minister can see the sense in these amendments. In particular, I hope that he will take forward the noble Baroness’s proposals and agree to work with us on the code of practice principles and to put something like that in the Bill. I look forward to his response.
I thank the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Jones, for this series of amendments.
I will first address Amendment 6, which seeks to amend Clause 2. As the noble Lord said, the definitions created by Clause 2, including “scientific research purposes”, are based on the current wording in recital 159 to the UK GDPR. We are changing not the scope of these definitions but their legal status. This amendment would require individual researchers to assess whether their research should be considered to be in the public interest, which could create uncertainty in the sector and discourage research. This would be more restrictive than the current position and would undermine the Government’s objectives to facilitate scientific research and empower researchers.
We have maintained a flexible scope as to what is covered by “scientific research” while ensuring that the definition is still sufficiently narrow in that it can cover only what would reasonably be seen as scientific research. This is because the legislation needs to be able to adapt to the emergence of new areas of innovative research. Therefore, the Government feel that it is more appropriate for the regulator to add more nuance and context to the definition. This includes the types of processing that are considered—
I am sorry to interrupt but it may give the Box a chance to give the Minister a note on this. Is the Minister saying that recital 159 includes the word “commercial”?
I am afraid I do not have an eidetic memory of recital 159, but I would be happy to—
That is precisely why I ask this question in the middle of the Minister’s speech to give the Box a chance to respond, I hope.
Researchers must also comply with the required safeguards to protect individuals’ privacy. All organisations conducting scientific research, including those with commercial interests, must also meet all the safeguards for research laid out in the UK GDPR and comply with the legislation’s core principles, such as fairness and transparency. Clause 26 sets out several safeguards that research organisations must comply with when processing personal data for research purposes. The ICO will update its non-statutory guidance to reflect many of the changes introduced by this Bill.
Scientific research currently holds a privileged place in the data protection framework because, by its nature, it is already viewed as generally being in the public interest. As has been observed, the Bill already applies a public interest test to processing for the purpose of public health studies in order to provide greater assurance for research that is particularly sensitive. Again, this reflects recital 159.
In response to the noble Baroness, Lady Jones, on why public health research is being singled out: as she stated, this part of the legislation simply adds an additional safeguard to studies into public health, ensuring that they must be in the public interest. This does not limit the scope for other research unrelated to public health. Studies in the area of public health will usually be in the public interest. For the rare, exceptional cases where a study is not, this requirement provides an additional safeguard to help prevent misuse of the various exemptions and privileges for researchers in the UK GDPR. "Public interest" is not defined in the legislation, so the controller needs to make a case-by-case assessment based on its purposes.
On the point made by the noble Lord, Lord Clement-Jones, about recitals and ICO guidance: although we of course respect and welcome ICO guidance, it does not have legislative effect and does not provide the certainty that legislation does. That is why we have put these provisions on a statutory footing in this Bill.
Amendment 7 to Clause 3 would undermine the broad consent concept for scientific research. Clause 3 places the existing concept of "broad consent", currently found in recital 33 to the UK GDPR, on a statutory footing with the intention of improving awareness and confidence for researchers. This clause applies only to scientific research processing that is reliant on consent. It already contains various safeguards. For example, broad consent can be used only where it is not possible to identify at the outset the full purposes for which personal data might be processed. Additionally, to give individuals greater agency, where possible individuals will have the option to consent to only part of the processing and can withdraw their consent at any time.
Clause 3 clarifies an existing concept of broad consent which outlines how the conditions for consent will be met in certain circumstances when processing for scientific research purposes. This will enable consent to be obtained for an area of scientific research when researchers cannot at the outset identify fully the purposes for which they are collecting the data. For example, the initial aim may be the study of cancer, but it later becomes the study of a particular cancer type.
Furthermore, as part of the reforms around the reuse of personal data, we have further clarified that, when personal data is originally collected on the basis of consent, a controller would need to get fresh consent to reuse that data for a new purpose, unless a public interest exemption applies and it is unreasonable to expect the controller to obtain that consent. A controller cannot generally reuse personal data originally collected on the basis of consent for research purposes.
Turning to Amendments 132 and 133 to Clause 26: the general rule described in Article 13(3) of the UK GDPR is that controllers must inform data subjects about a change of purposes, which provides an opportunity to withdraw consent or object to the proposed processing where relevant. There are existing exceptions to the right to object, such as Article 21(6) of the UK GDPR, where processing is necessary for research in the public interest, and in Schedule 2 to the Data Protection Act 2018, where applying the right would prevent or seriously impair the research. Removing these exemptions could undermine life-saving research and compromise long-term studies, leaving them unable to continue.
Regarding Amendment 134, new Article 84B of the UK GDPR already sets out the requirement that personal data should be anonymised for research, archiving and statistical—RAS—purposes unless doing so would mean the research could not be carried through. Anonymisation is not always possible as personal data can be at the heart of valuable research, archiving and statistical activities, for example, in genetic research for the monitoring of new treatments of diseases. That is why new Article 84C of the UK GDPR also sets out protective measures for personal data that is used for RAS purposes, such as ensuring respect for the principle of data minimisation through pseudonymisation.
The stand part notice in this group seeks to remove Clause 6 and, consequentially, Schedule 2. In the Government’s consultation on data reform, Data: A New Direction, we heard that the current provisions in the UK GDPR on personal data reuse are difficult for controllers and individuals to navigate. This has led to uncertainty about when controllers can reuse personal data, causing delays for researchers and obstructing innovation. Clause 6 and Schedule 2 address the existing uncertainty around reusing personal data by setting out clearly the conditions in which the reuse of personal data for a new purpose is permitted. Clause 6 and Schedule 2 must therefore remain to give controllers legal certainty and individuals greater transparency.
Amendment 22 seeks to remove the power to add to or vary the conditions set out in Schedule 2. These conditions currently constitute a list of specific public interest purposes, such as safeguarding vulnerable individuals, for which an organisation is permitted to reuse data without needing consent or to identify a specific law elsewhere in legislation. Since this list is strictly limited and exhaustive, a power is needed to ensure that it is kept up to date with future developments in how personal data is used for important public interest purposes.
I am interested that the safeguarding requirement is already in the Bill. In terms of children, which I believe the Minister is going to come to, the onward processing is therefore not a question of safeguarding. Is that correct? As the Minister has just indicated, that provision already exists.
Just before we broke, I was on the verge of attempting to answer the question from the noble Baroness, Lady Kidron; I hope my coming words will do that, but she can intervene again if she needs to.
I turn to the amendments that concern the use of children’s data in research and reuse. Amendment 8 would also amend Clause 3; the noble Baroness suggests that the measure should not apply to children’s data, but this would potentially prevent children, or their parents or guardians, from agreeing to participate in broad areas of pioneering research that could have a positive impact on children, such as on the causes of childhood diseases.
On the point about safeguarding, the provisions on recognised legitimate interests and further processing are both required for safeguarding children: they ensure compliance with, respectively, the lawfulness and purpose limitation principles. The purpose limitation provision in this clause is meant for situations where the original processing purpose was not safeguarding and the controller then realises that there is a need to process the data further for safeguarding.
Research organisations are already required to comply with the data protection principles, including on fairness and transparency, so that research participants can make informed decisions about how their data is used; and, where consent is the lawful basis for processing, children, or their parents or guardians, are free to choose not to provide their consent, or, if they do consent, they can withdraw it at any time. In addition, the further safeguards that are set out in Clause 26, which I mentioned earlier, will protect all personal data, whether it relates to children or adults.
Amendment 21 would require data controllers to have specific regard to the fact that children's data requires a higher standard of protection when deciding whether reuse of their data is compatible with the original purpose for which it was collected. This is unnecessary because the situations in which personal data could be reused are limited to public interest purposes designed largely to protect the public and children, in so far as they are relevant to them. Controllers must also consider the possible consequences for data subjects and the relationship between the controller and the data subject. This includes taking into account that the data subject is a child, in addition to the need to consider the interests of children generally.
Amendment 23 seeks to limit use of the purpose limitation exemptions in Schedule 2 in relation to children’s data. This amendment is unnecessary because these provisions permit further processing only in a narrow range of circumstances and can be expanded only to serve important purposes of public interest. Furthermore, it may inadvertently be harmful to children. Current objectives include safeguarding children or vulnerable people, preventing crime or responding to emergencies. In seeking to limit the use of these provisions, there is a risk that the noble Baroness’s amendments might make data controllers more hesitant to reuse or disclose data for public interest purposes and undermine provisions in place to protect children. These amendments could also obstruct important research that could have a demonstrable positive impact on children, such as research into children’s diseases.
Amendment 145 would require the ICO to publish a statutory code on the use of children’s data in scientific research and technology development. Although the Government recognise the value that ICO codes can play in promoting good practice and improving compliance, we do not consider that it would be appropriate to add these provisions to the Bill without further detailed consultation with the ICO and the organisations likely to be affected by the new codes. Clause 33 of the Bill already includes a measure that would allow the Secretary of State to request the ICO to publish a code on any matter that it sees fit, so this is an issue that we could return to in the future if the evidence supports it.
I will read Hansard very carefully, because I am not sure that I absolutely followed the Minister, but we will undoubtedly come back to this. I will ask two questions. Earlier, before we had a break, in response to some of the early amendments in the name of the noble Lord, Lord Clement-Jones, the Minister suggested that several things were being taken out of the recital to give them solidity in the Bill; so I am using this opportunity to suggest that recital 38, which is the special consideration of children’s data, might usefully be treated in a similar way and that we could then have a schedule that is the age-appropriate design code in the Bill. Perhaps I can leave that with the Minister, and perhaps he can undertake to have some further consultation with the ICO on Amendment 145 specifically.
With respect to recital 38, that sounds like a really interesting idea. Yes, let us both have a look and see what the consultation involves and what the timing might look like. I confess to the Committee that I do not know what recital 38 says, off the top of my head. For the reasons I have set out, I am not able to accept these amendments. I hope that noble Lords will therefore not press them.
Returning to the questions from the noble Lord, Lord Clement-Jones, on the contents of recital 159: the current UK GDPR and EU GDPR are silent on the specific definition of scientific research, and neither precludes commercial organisations from performing it; indeed, the ICO's own guidance on research and its interpretation of recital 159 already mention commercial activities. Scientific research can be done by commercial organisations—for example, much of the research done into vaccines, and the research into AI referenced by the noble Baroness, Lady Harding. The recital itself does not mention it but, as the ICO's guidance is already clear on this, the Government feel that it is appropriate to put it on a statutory footing.
My Lords, that was intriguing. I thank the Minister for his response. It sounds as though, again, guidance would have been absolutely fine, but what is there not to like about the ICO bringing clarity? It was quite interesting that the Minister used the phrase "uncertainty in the sector" on numerous occasions; it is becoming a bit of a mantra as the Bill goes on. We cannot create uncertainty in the sector, so the poor old ICO has been labouring in the vineyard for the last few years to no purpose at all. Clearly there has been uncertainty in the sector of a major description, and all its guidance and all the work that it has put in over the years have been wholly fruitless, really. It is only this Government that have grabbed the agenda with this splendid 300-page data protection Bill that will clarify matters for business. I do not know how much businesses will have to pay for new compliance officers or whatever it happens to be, but the one thing that the Bill will absolutely not create is greater clarity.
I am a huge fan of making sure that we understand what the recitals have to say, and it is very interesting that the Minister is saying that the recital is silent but the ICO’s guidance is pretty clear on this. I am hugely attracted by the idea of including recital 38 in the Bill. It is another lightbulb moment from the noble Baroness, Lady Kidron, who has these moments, rather like with the age-appropriate design code, which was a huge one.
We are back to the concern, whether in the ICO guidance, the Bill or wherever, that scientific research needs to be in the public interest to qualify and not have all the consents that are normally required for the use of personal data. The Minister said, “Well, of course we think that scientific research is in the public interest; that is its very definition”. So why does only public health research need that public interest test and not the other aspects? Is it because, for instance, the opt-out was a bit of a disaster and 3 million people opted out of allowing their health data to be shared or accessed by GPs? Yes, it probably is.
Do the Government want a similar kind of disaster to happen, in which people get really excited about Meta or other commercial organisations getting hold of their data, a public outcry ensues and they therefore have to introduce a public interest test on that? What is sauce for the goose is sauce for the gander. I do not think that personal data should be treated in a particularly different way in terms of its public interest, just because it is in healthcare. I very much hope that the Minister will consider that.
My Lords, I hope this is another lightbulb moment, as the noble Lord, Lord Clement-Jones, suggested. As well as Amendment 10, I will speak to Amendments 35, 147 and 148 in my name and the names of the noble Baroness, Lady Jones, and the noble Lord, Lord Clement-Jones. I thank them both. The purpose of these amendments is to move the Bill away from nibbling around the edges of GDPR in pursuit of post-Brexit opportunities and to actually deliver a post-Brexit opportunity.
These amendments would put the UK on an enhanced path of data sophistication while not challenging equivalence, which we will undoubtedly discuss during the Committee. I echo the voice of the noble Lord, Lord Allan, who at Second Reading expressed deep concern that equivalence was not a question of an arrangement between the Government and the EU but would be a question picked up by data activists taking strategic litigation to the courts.
Data protection as conceived by GDPR and in this Bill is primarily seen as an arrangement between an individual and an entity that processes that data—most often a commercial company. But, as evidenced by the last 20 years, the real power lies in holding either vast swathes of general data, such as those used by LLMs, or large groups of specialist data such as medical scans. In short, the value—in all forms, not simply financial—lies in big data.
As the value of data became clear, ideas such as "data is the new oil" and data as currency emerged, alongside the notion of data fiduciaries or data trusts, where you can place your data collectively. One early proponent of such ideas was Jaron Lanier, a pioneer of virtual reality; I remember discussing it with him more than a decade ago. However, these ideas have not found widespread practical application, possibly because they are normally based around ideas of micropayments as the primary value—and very probably because they rely on data subjects gathering their own data, so they are for the boffins.
During the passage of the DPA 2018, one noble Lord counted the number of times the Minister said the words “complex” and “complicated” while referring to the Bill. Data law is complex, and the complicated waterfall of its concepts and provisions eludes most non-experts. That is why I propose the four amendments in this group, which would give UK citizens access to data experts for matters that concern them deeply.
Amendment 10 would define the term “data community”, and Amendment 35 would give a data subject the power to assign their data rights to a data community for specific purposes and for a specific time period. Amendment 147 would require the ICO to set out a code of conduct for data communities, including guidance on establishing, operating and joining a data community, as well as guidance for data controllers and data processors on responding to requests made by data communities. Amendment 148 would require the ICO to keep a register of data communities, to make it publicly available and to ensure proper oversight. Together, they would provide a mechanism for non-experts—that is, any UK citizen—to assign their data rights to a community run by representatives that would benefit the entire group.
Data communities diverge from previous attempts to create big data for the benefit of users in that they are not predicated on financial payments, nor does each data subject need to access their own data through the complex rules and often obstructive interactions with individual companies. They put rights holders together with experts who do the work on their behalf, by allowing data subjects to assign their rights so that an expert can gather the data and crunch it.
This concept is based on a piece of work done by a colleague of mine at the University of Oxford, Dr Reuben Binns, an associate professor in human-centred computing, in association with the Worker Info Exchange. Since 2016, individual Uber drivers, with help from their trade unions and the WIE, have asked Uber for the data that shows their jobs, earnings, movements, waiting times and so on. It took many months of negotiation, conducted via data protection lawyers, as each driver individually asked for successive pieces of information that Uber at first resisted giving them and then, after litigation, provided.
After a period of time, a new cohort of drivers was recruited, and it was only when several hundred drivers were poised to ask the same set of questions that a formal arrangement was made between Uber and WIE so that they could be treated as a single group and all the data would be provided about all the drivers. This practical decision allowed Dr Binns to look at the data en masse. While an individual driver knew what they earned and where they were, what became visible when looking across several hundred drivers was how the algorithm reacted to those who refused a poorly paid job, who was assigned the lucrative airport runs, whether where you started impacted on your daily earnings, whether those who worked short hours were given less lucrative jobs, and so on.
This research project continues after several years and benefits from a bespoke arrangement that could, by means of these amendments, be strengthened and made an industry-wide standard with the involvement of the ICO. If it were routine, it would provide opportunity equally for challenger businesses, community groups and research projects. Imagine if a group of elderly people who spend a lot of time at home were able to use a data community to negotiate cheap group insurance, or imagine a research project where I might assign my data rights for the sole purpose of looking at gender inequality. A data community would allow any group of people to assign their rights, rights that are more powerful together than apart. This is doable—I have explained how it has been done. With these amendments, it would be routinely available, contractual, time-limited and subject to a code of conduct.
As it stands, the Bill is regressive for personal data rights and does not deliver the promised Brexit dividends. But there are great possibilities, without threatening adequacy, that could open markets, support innovation in the UK and make data more available to groups in society that rarely benefit from data law. I beg to move.
My Lords, I think this is a lightbulb moment—it is inspired, and this suite of amendments fits together really well. I entirely agree with the noble Baroness, Lady Kidron, that this is a positive aspect. If the Bill contained these four amendments, I might have to alter my opinion of it—how about that for an incentive?
This is an important subject. It is a positive aspect of data rights. We have not got this right yet in this country. We still have great suspicion about sharing and access to personal data. There is almost a conspiracy theory around the use of data, the use of external contractors in the health service and so on, which is extremely unhelpful. If individuals were able to share their data with a trusted hub—a trusted community—that would make all the difference.
Like the noble Baroness, Lady Kidron, I have come across a number of influences over the years. I think the first time many of us came across the idea of data trusts or data institutions was in the Hall-Pesenti review carried out by Dame Wendy Hall and Jérôme Pesenti in 2017. They made a strong recommendation to the Government that they should start thinking about how to operationalise data trusts. Subsequently, organisations such as the Open Data Institute did some valuable research into how data trusts and data institutions could be used in a variety of ways, including in local government. Then the Ada Lovelace Institute did some very good work on the possible legal basis for data trusts and data institutions. Professor Irene Ng was heavily engaged in setting up what was called the “hub of all things”. I was not quite convinced by how it was going to work legally in terms of data sharing and so on, but in a sense we have now got to that point. I give all credit to the academic whom the noble Baroness mentioned. If he has helped us to get to this point, that is helpful. It is not that complicated, but we need full government backing for the ICO and the instruments that the noble Baroness put in her amendments, including regulatory oversight, because it will not be enough simply to have codes that apply. We have to have regulatory oversight.
My Lords, I am also pleased to support these amendments in the name of the noble Baroness, Lady Kidron, to which I have added my name. I am hugely enthusiastic about them, too, and think that this has been a lightbulb moment from the noble Baroness. I very much thank her for doing all of this background work because she has identified the current weakness in the data protection landscape: it is currently predicated on an arrangement between an individual and the organisation that holds their data.
That is an inherently unbalanced power construct. As the noble Baroness said, as tech companies become larger and more powerful, it is not surprising that many individuals feel overwhelmed by the task of questioning or challenging those that are processing their personal information. It assumes a degree of knowledge about their rights and a degree of digital literacy, which we know many people do not possess.
In the very good debate that we had on digital exclusion a few weeks ago, it was highlighted that around 2.4 million people are unable to complete a single basic task to get online, such as opening an internet browser, and that more than 5 million employed adults cannot complete essential digital work tasks. These individuals cannot be expected to access their digital data on their own; they need the safety of a larger group to do so. We need to protect the interests of an entire group that would otherwise be locked out of the system.
The noble Baroness referred to the example of Uber drivers who were helped by their trade union to access their data, sharing patterns of exploitation and subsequently strengthening their employment package, but this does not have to be about just union membership; it could be about the interests of a group of public sector service users who want to make sure that they are not being discriminated against, a community group that wants its bid for a local grant to be treated fairly, and so on. We can all imagine examples of where this would work in a group’s interest. As the noble Baroness said, these proposals would allow any group of people to assign their rights—rights that are more powerful together than apart.
There could be other benefits: if data controllers are concerned about the number of individual requests for data that they are receiving—and a lot of this Bill is supposed to address that extra work—group requests, on behalf of a data community, could provide economies of scale and make the whole system more efficient.
Like the noble Baroness, I can see great advantages from this proposal; it could lay the foundation for other forms of data innovation and help to build trust with many citizens who currently see digitalisation as something to fear—this could allay those fears. Like the noble Lord, Lord Clement-Jones, I hope the Minister can provide some reassurance that the Government welcome this proposal, take it seriously and will be prepared to work with the noble Baroness and others to make it a reality, because there is the essence of a very good initiative here.
I thank the noble Baroness, Lady Kidron, for raising this interesting and compelling set of ideas. I turn first to Amendments 10 and 35 relating to data communities. The Government recognise that individuals need to have the appropriate tools and mechanisms to easily exercise their rights under the data protection legislation. It is worth pointing out that current legislation does not prevent data subjects authorising third parties to exercise certain rights. Article 80 of the UK GDPR also explicitly gives data subjects the right to appoint not-for-profit bodies to exercise certain rights, including their right to bring a complaint to the ICO, to appeal against a decision of the ICO or to bring legal proceedings against a controller or processor and the right to receive compensation.
The concept of data communities exercising certain data subject rights is closely linked with the wider concept of data intermediaries. The Government recognise the existing and potential benefits of data intermediaries and are committed to supporting them. However, given that data intermediaries are new, we need to be careful not to distort the sector at such an early stage of development. As in many areas of the economy, officials are in regular contact with businesses, and the data intermediary sector is no different. One such engagement is the DBT’s Smart Data Council, which includes a number of intermediary businesses that advise the Government on the direction of smart data policy. The Government would welcome further and continued engagement with intermediary businesses to inform how data policy is developed.
I am sorry, but the Minister used a pretty pejorative word: “distort” the sector. What does he have in mind?
I did not mean to be pejorative; I merely point out that before embarking on quite a far-reaching policy—as noble Lords have pointed out—we would not want to jump the gun prior to consultation and researching the area properly. I certainly do not wish to paint a negative portrait.
It is a moment at which I cannot set a firm date for a firm set of actions, but on the other hand I am not attempting to punt it into the long grass either. The Government do not want to introduce a prescriptive framework without assessing potential risks, strengthening the evidence base and assessing the appropriate regulatory response. For these reasons, I hope that for the time being the noble Baroness will not press these amendments.
The noble Baroness has also proposed Amendments 147 and 148 relating to the role of the Information Commissioner’s Office. Given my response just now to the wider proposals, these amendments are no longer necessary and would complicate the statute book. We note that Clause 35 already includes a measure that will allow the Secretary of State to request the Information Commissioner’s Office to publish a code on any matter that she or he sees fit, so this is an issue we could return to in future if such a code were deemed necessary.
My Lords, I am sorry to keep interrupting the Minister. Can he give us a bit of a picture of what he has in mind? He said that he did not want to distort things at the moment, that there were intermediaries out there and so on. That is all very well, but is he assuming that a market will be developed or is developing? What overview of this does he have? In a sense, we have a very clear proposition here, which the Government should respond to. I am assuming that this is not a question just of letting a thousand flowers bloom. What is the government policy towards this? If you look at the Hall-Pesenti review and read pretty much every government response—including to our AI Select Committee, where we talked about data trusts and picked up the Hall-Pesenti review recommendations—you see that the Government have been pretty positive over time when they have talked about data trusts. The trouble is that they have not done anything.
Overall, as I say and as many have said in this brief debate, this is a potentially far-reaching and powerful idea with an enormous number of benefits. But the fact that it is far-reaching implies that we need to look at it further. I am afraid that I am not briefed on long-standing—
May I suggest that the Minister writes? On the one hand, he is saying that we will be distorting something—that something is happening out there—but, on the other hand, he is saying that he is not briefed on what is out there or what the intentions are. A letter unpacking all that would be enormously helpful.
I am very happy to write on this. I will just say that I am not briefed on previous government policy towards it, dating back many years before my time in the role.
It was even further. Yes, I am very happy to write on that. For the reasons I have set out, I am not able to accept these amendments for now. I therefore hope that the noble Baroness will withdraw her amendment.
My Lords, I thank the co-signatories of my amendments for their enthusiasm. I will make three very quick points. First, the "certain rights" that the Minister referred to are rights of complaint after the event, when something has gone wrong, not positive rights. My second point of contention is whether these amendments really are so far-reaching. We are talking about people's existing rights, and the amendments introduce no new right other than the ability to put those rights together. It is very worrying that the Government would see it as a threat when data subjects put their rights together but not when commercial companies put their data together.
Finally, what is the Bill for? If it is not for creating a new and vibrant data protection system for the UK, I am concerned that it undermines a lot of existing rights and will not allow for a flourishing of uses of data. This is the new world: the world of data and AI. We have to have something to offer UK citizens. I would like the Minister to say that he will discuss this further, because it is not quite adequate to nay-say it. I beg leave to withdraw.
(8 months, 1 week ago)
Grand Committee
My Lords, I rise to speak to my Amendment 11 and to Amendments 14, 16, 17 and 18, and to the Clause 5 and Clause 7 stand part notices. I will attempt to be as brief as I can, but Clause 5 involves rather a large number of issues.
Processing personal data is currently lawful only if it is performed on at least one lawful basis, one of which is that the processing is for legitimate interests pursued by the controller or a third party, except where those interests are overridden by the interests or fundamental rights of the data subject. As such, if a data controller relies on their legitimate interest as a legal basis for processing data, they must conduct a balancing test between their interests and those of the data subject.
Clause 5 amends the UK GDPR’s legitimate interest provisions by introducing the concept of recognised legitimate interest, which allows data to be processed without a legitimate interest balancing test. This provides businesses and other organisations with a broader scope of justification for data processing. Clause 5 would amend Article 6 of the UK GDPR to equip the Secretary of State with a power to determine these new recognised legitimate interests. Under the proposed amendment, the Secretary of State must have regard to,
“among other things … the interests and fundamental rights and freedoms of data subjects”.
The usual legitimate interest test is much stronger: rather than merely a topic to have regard to, a legitimate interest basis cannot lawfully apply if the data subject’s interests override those of the data controller.
Annex 1, as inserted by the Bill, provides a list of exemptions but is overly broad and vague. It includes national security, public security and defence, and emergencies and crime as legitimate interests for data processing without an assessment. The Conservative MP Marcus Fysh said on Third Reading:
“Before companies share data or use data, they should have to think about what the balance is between a legitimate interest and the data rights, privacy rights and all the other rights that people may have in relation to their data. We do not want to give them a loophole or a way out of having to think about that.” —[Official Report, Commons, 29/11/23; col. 896.]
I entirely agree with that.
The amendment in Clause 5 also provides examples of processing that may be considered legitimate interests under the existing legitimate interest purpose, under Article 6(1)(f), rather than under the new recognised legitimate interest purpose. These include direct marketing, intra-group transmission of personal data for internal administrative purposes, and processing necessary to ensure the security of a network.
The Bill also creates a much more litigious data environment. Currently, an organisation's assessment of its lawful basis for processing data can be challenged through correspondence or an ICO complaint, whereas, under the proposed system, an individual may be forced to legally challenge a statutory instrument in order to contest the basis on which their data is processed.
As I will explain later, our preference is that the clause not stand part, but I accept that there are some areas that need clarification and Amendment 11 is designed to do this. The UK GDPR sets out conditions in which processing of data is lawful. The Bill inserts in Article 6(1) a provision specifying that processing shall be lawful for the purposes of a recognised legitimate interest, as I referred to earlier, an example of which may be for the purposes of direct marketing.
Many companies obtain data from the open electoral register. The register is maintained by local authorities, which have the right to sell this data to businesses. Amendment 11 would insert new Article 6(1)(aa) and (ab), which provide that data processing shall be lawful where individuals have consented for their data
“to enter the public domain via a public body”,
or where processing is carried out by public bodies pursuant to their duties and rights, which may include making such data available to the public. Individuals are free to opt out of the open electoral register if they so wish, and it would be disproportionate—in fact, irritating—to notify consumers who have already consented to their data entering the public domain that their data is being processed.
On Amendment 14, as mentioned, the Bill would give the Secretary of State the power to determine recognised legitimate interests through secondary legislation, which is subject to minimal levels of parliamentary scrutiny. Although the affirmative procedure is required, this does not entail much scrutiny or much of a debate. The last time MPs did not approve a statutory instrument under the affirmative procedure was in 1978. In practice, interests could be added to this list at any time and for any reason, facilitating the flow and use of personal data for limitless potential purposes. Businesses could be obligated to share the public’s personal data with government or law enforcement agencies beyond what they are currently required to do, all based on the Secretary of State’s inclination at the time.
We are concerned that this Henry VIII power is unjustified and undermines the very purpose of data protection legislation, which is to protect the privacy of individuals in a democratic data environment, as it vests undue power over personal data rights in the Executive. This amendment is designed to prevent the Secretary of State from having the ability to pre-authorise data processing outside the usual legally defined route. It is important to avoid a two-tier data protection framework in which the Secretary of State can decide that certain processing is effectively above the law.
On Amendment 17, some of the most common settings in which data protection law is broken involve the sharing of the HIV status of a person living with HIV: in their personal life, in employment, in healthcare services and with the police. The sharing of an individual's HIV status can lead to further discrimination being experienced by people living with HIV and can increase their risk of harassment or even violence. The National AIDS Trust is concerned that the Bill as drafted does not go far enough to prevent individuals' HIV status from being shared with others without their consent. They and we believe that the Bill must clarify what an "administrative purpose" is for organisations processing employees' personal data. Amendment 17 would add wording to clarify that, in paragraph 9(b) of Article 6,
“intra-group transmission of personal data”
in the workplace, within an organisation or in a group of organisations should be permitted only for individuals who need to access an employee’s personal data as part of their work.
As far as Amendment 18 is concerned, as it stands Clause 5 gives an advantage to large undertakings with numerous companies, which can transmit data intra-group purely because they are affiliated to one central body. However, this contradicts both the ICO's and the CMA's repeated position that the distinction between first party and third party is not a meaningful way to capture privacy risk: what matters is what data is processed, not the corporate ownership of the systems doing the processing. The amendment reflects the organisational measures that undertakings should have in place as safeguards: groups of undertakings transmitting data should be required to have organisational measures, set out in contract, in order to take advantage of this transmission of data.
Then we come to the question of Clause 5 standing part of the Bill. This clause is unnecessary and creates risks. It is unnecessary because the legitimate interest balancing test is, in fact, flexible and practical; it already allows processing for emergencies, safeguarding and so on. It is risky because creating lists of specified legitimate interests inevitably narrows the concept and may make controllers less certain about whether a legitimate interest that is not a recognised legitimate interest can be characterised as such. In the age of AI, where change is exponential, we need principles-based, outcome-focused legislation that is flexible and can be supplemented with guidance from an independent regulator, rather than a system that requires the Government to legislate more and faster in order to catch up.
There is also a risk that the drafting of this provision does not dispense with the need to conduct a legitimate interest balancing test, because all the recognised legitimate interests contain a test of necessity. Established case law interprets the concept of necessity under data protection law as requiring a human rights balancing test to be carried out. This rather points to the smoke-and-mirrors effect of the drafting, which does nothing to improve legal certainty for organisations or protections for individuals.
I now come to Clause 7 standing part. This clause creates a presumption that processing will always be in the public interest or substantial public interest if done in reliance on a condition listed in proposed new Schedule A1 to the Data Protection Act 2018. The schedule will list international treaties that have been ratified by the UK. At present, the Bill lists only the UK-US data-sharing agreement as constituting relevant international law. Clause 7 seeks to remove the requirement for a controller to consider whether the legal basis on which they rely is in the public interest or substantial public interest, has appropriate safeguards and respects data subjects’ fundamental rights and freedoms. But the conditions in proposed new Schedule A1 in respect of the UK-US agreement also state that the processing must be necessary, as assessed by the controller, to respond to a request made under the agreement.
It is likely that a court would interpret “necessity” in the light of the ECHR. The court may therefore consider that the inclusion of a necessity test means that a controller would have to consider whether the UK-US agreement, or any other treaty added to the schedule, is proportionate to a legitimate aim pursued. Not only is it unreasonable to expect a controller to do such an assessment; it is also highly unusual. International treaties are drafted on a state-to-state basis and not in a way that necessarily corresponds clearly with domestic law. Further, domestic courts would normally consider the rights under the domestic law implementing a treaty, rather than having to interpret an international instrument without reference to a domestic implementing scheme. Being required to do so may make it more difficult for courts to enforce data subjects’ rights.
The Government have not really explained why it is necessary to amend the law in this way rather than simply implementing the UK-US agreement domestically. That would be the normal approach; it would remove the need to add this new legal basis and would enable controllers to use the existing framework to identify a legal basis in domestic law for processing data. Instead, this approach makes it more difficult to understand how the law operates, which could in turn deter data sharing in important situations. Perhaps the Minister could explain why Clause 7 is there.
I beg to move.
My Lords, I rise to speak to Amendments 13 and 15. Before I do, let me say that I strongly support the comments of the noble Lord, Lord Clement-Jones, about HIV and the related vulnerability, and his assertion—almost—that Clause 5 is a solution in search of a problem. “Legitimate interest” is a flexible concept and I am somewhat bewildered as to why the Government are seeking to create change where none is needed. In this context, it follows that, were the noble Lord successful in his argument that Clause 5 should not stand part, Amendments 13 and 15 would be unnecessary.
On the first day in Committee, we debated a smaller group of amendments that sought to establish the principle that nothing in the Bill should lessen the privacy protections of children. In his response, the Minister said:
“if over the course of our deliberations the Committee identifies areas of the Bill where that is not the case, we will absolutely be open to listening on that, but let me state this clearly: the intent is to at least maintain, if not enhance, the safety and privacy of children and their data”.—[Official Report, 20/3/24; col. GC 75.]
I am glad the Minister is open to listening and that the Government’s intention is to protect children, but, as discussed previously, widening the definition of “research” in Clause 3 and watering down purpose limitation protections in Clause 6 negatively impacts children’s data rights. Again, in Clause 5, lowering the protections for all data subjects has consequences for children.
My Lords, I support the noble Baroness, Lady Kidron, in Amendments 13 and 15, to which I have added my name. Rather than repeat her arguments—as we are now all trying not to do—I want to build on them and point to the debate we had on the first group in Committee, when my noble friend the Minister insisted that the Government had no desire to water down the protections for children in the Bill. In Clause 5, in proposed new paragraph (7) of Article 6, the Government have felt it necessary to be explicit, in that paragraph only, that children might need extra protection. This, on its own, makes me worried that the whole Bill is reducing the protection children have, because the Government felt it necessary to insert new paragraph (7)(b). Interestingly, it refers to,
“where relevant, the need to provide children”
with additional support. But where is that not relevant?
Amendment 13 simply looks to strengthen this—to accept the premise on which the Bill is currently drafted that we need to be explicit where children deserve the right to a higher level of protection, and to get the wording right. Will my noble friend the Minister reconsider? There are two choices here: to state right at the beginning of the Bill that there is a principle that there will be no reduction in children’s right to a higher level of protection, or to do as the Bill currently does and make sure that we get the wording right at every stage as we work through.
My Lords, I thank noble Lords who have spoken to this group. As ever, I am grateful to the Delegated Powers and Regulatory Reform Committee for the care it has taken in scrutinising the Bill. In its 10th report it made a number of recommendations addressing the Henry VIII powers in the Bill, which are reflected in a number of amendments that we have tabled.
In this group, we have Amendment 12 to Clause 5, which addresses the committee’s concerns about the new powers for the Secretary of State to amend new Annexe 1 of Article 6. This sets out the grounds for treating data processing as a recognised legitimate interest. This issue was raised by the noble Lord, Lord Clement-Jones, in his introduction. The Government argue that they are starting with a limited number of grounds and that the list might need to be changed swiftly, hence the need for the Secretary of State’s power to make changes by affirmative regulations.
However, the Delegated Powers and Regulatory Reform Committee argues:
“The grounds for lawful processing of personal data go to the heart of the data protection legislation, and therefore in our view should not be capable of being changed by subordinate legislation”.
It also argues that the Government have not provided strong reasons for needing this power. It recommends that the delegated power in Clause 5(4) should be removed from the Bill, which is what our Amendment 12 seeks to do.
These concerns were echoed by the Constitution Committee, which went one stage further by arguing:
“Data protection is a matter of great importance in maintaining a relationship of trust between the state and the individual”.
It is important to maintain these fundamental individual rights. On that basis, the Constitution Committee asks us to consider whether the breadth of the Secretary of State’s powers in Clauses 5 and 6 is such that those powers should be subject to primary rather than secondary legislation.
I make this point about the seriousness of these issues as they underline the points made by other noble Lords in their amendments in this group. In particular, the noble Lord, Lord Clement-Jones, asked whether any regulations made by the Secretary of State should be the subject of the super-affirmative procedure. We will be interested to hear the Minister’s response, given the concerns raised by the Constitution Committee.
Will the Minister also explain why it was necessary to remove the balancing test, which would require organisations to show why their interest in processing data outweighs the rights of data subjects? Again, this point was made by the noble Lord, Lord Clement-Jones. It would also be helpful if the Minister could clarify whether the new powers for the Secretary of State to amend the recognised legitimate interest could have consequences for data adequacy and whether this has been checked and tested with the EU.
Finally, we also welcome a number of other amendments tabled by the noble Lord, Lord Clement-Jones, in particular those to ensure that direct marketing should be considered a legitimate interest only if there is proper consent. This was one of the themes of the noble Baroness, Lady Kidron, who made, as ever, a very powerful case for ensuring that children specifically should not be subject to direct marketing as a matter of routine and that there should be clear consent.
The noble Baronesses, Lady Kidron and Lady Harding, have once again, quite rightly, brought us back to the Bill needing to state explicitly that children’s rights are not being watered down by it, otherwise we will come back to this again and again in all the clauses. The noble Baroness, Lady Kidron, said that this will be decided on the Floor of the House, or the Minister could give in now and come back with some government amendments. I heartily recommend to the Minister that he considers doing that because it might save us some time. I look forward to the Minister’s response on that and on the Delegated Powers and Regulatory Reform Committee’s recommendations about removing the Secretary of State’s right to amend the legitimate interest test.
My Lords, I rise to speak to Amendments 11, 12, 13, 14, 15, 16, 17 and 18 and to whether Clauses 5 and 7 should stand part of the Bill. In doing so, I thank the noble Lord, Lord Clement-Jones, and the noble Baronesses, Lady Jones and Lady Kidron, for their amendments. The amendments in the group, as we have heard, relate to Clauses 5 and 7, which make some important changes to Article 6 of the UK GDPR on the lawfulness of processing.
The first amendment in the group, Amendment 11, would create a new lawful ground, under Article 6(1) of UK GDPR, to enable the use of personal data published by public bodies with a person’s consent and to enable processing by public bodies for the benefit of the wider public. The Government do not believe it would be necessary to create additional lawful grounds for processing in these circumstances. The collection and publication of information on public databases, such as the list of company directors published by Companies House, should already be permitted by existing lawful grounds under either Article 6(1)(c), in the case of a legal requirement to publish information, or Article 6(1)(e) in the case of a power.
Personal data published by public bodies can already be processed by other non-public body controllers where their legitimate interests outweigh the rights and interests of data subjects. However, they must comply with the requirements in relation to that personal data, including requirements to process personal data fairly and transparently. I am grateful to the noble Lord, Lord Clement-Jones, for setting out where he thinks the gaps are, but I hope he will accept my reassurance that this should already be possible under the existing legislation and will agree to withdraw the amendment.
Turning to Clause 5, its main objective is to introduce a new lawful ground under Article 6(1) of the UK GDPR, known as “recognised legitimate interests”. It also introduces a new annexe to the UK GDPR, in Schedule 1 to the Bill, that sets out an exhaustive list of processing activities that may be undertaken by data controllers under this new lawful ground. If an activity appears on the list, processing may take place without a person’s consent and without balancing the controller’s interests against the rights and interests of the individual: the so-called legitimate interests balancing test.
The activities in the annexe are all of a public interest nature: for example, processing necessary to prevent crime, safeguard national security, protect children, respond to emergencies or promote democratic engagement. They also include situations where a public body requests a non-public body to share personal data with it to help deliver a public task sanctioned by law.
The clause was introduced as a result of stakeholders’ concerns raised in response to the public consultation Data: A New Direction in 2021. Some informed us that they were worried about the legal consequences of getting the balancing test in Article 6(1)(f) wrong. Others said that undertaking the balancing test can lead to delays in some important processing activities taking place.
As noble Lords will be aware, many data controllers have important roles in supporting activities that have a public interest nature. It is vital that data is shared without delay where necessary in areas such as safeguarding, prevention of crime and responding to emergencies. Of course, controllers who share data while relying on this new lawful ground would still have to comply with wider requirements of data protection legislation where relevant, such as data protection principles which ensure that the data is used fairly, lawfully and transparently, and is collected and used for specific purposes.
In addition to creating a new lawful ground of recognised legitimate interests, Clause 5 also clarifies the types of processing activities that may be permitted under the existing legitimate interests lawful ground under Article 6(1)(f) of the UK GDPR. Even if a processing activity does not appear on the new list of recognised legitimate interests, data controllers may still have grounds for processing people’s data without consent if their interests in processing the data are not outweighed by the rights and freedoms that people have in relation to privacy. New paragraphs (9) and (10), inserted into Article 6 by Clause 5, make it clear that this might be the case in relation to many common commercial activities, such as intragroup transfers.
My Lords, may I just revisit that with the Minister? I fear that he is going to move on to another subject. The Delegated Powers Committee said that it thought that the Government had not provided strong enough reasons for needing this power. The public interest list being proposed, which the Minister outlined, is quite broad, so it is hard to imagine the Government wanting something not already listed. I therefore return to what the committee said. Normally, noble Lords like to listen to recommendations from such committees. There is no strong reason for needing that extra power, so, to push back a little on the Minister, why, specifically, is it felt necessary? If it were a public safety interest, or one of the other examples he gave, it seems to me that that would come under the existing list of public interests.
Indeed. Needless to say, we take the recommendations of the DPRRC very seriously, as they deserve. However, because this is an exhaustive list, and because the technologies and practices around data are likely to evolve very rapidly in ways we are unable currently to predict, it is important to retain as a safety measure the ability to update that list. That is the position the Government are coming from. We will obviously continue to consider the DPRRC’s recommendations, but that has to come with a certain amount of adaptiveness as we go. Any addition to the list would of course be subject to parliamentary debate, via the affirmative resolution procedure, as well as the safeguards listed in the provision itself.
Clause 50 ensures that the ICO and any other interested persons are consulted before such regulations are made.
Amendments 15, 16, 17 and 18 would amend the part of Clause 5 that is concerned with the types of activities that might be carried out under the current legitimate interest lawful ground, under Article 6(1)(f). Amendment 15 would prevent direct marketing organisations relying on the legitimate interest lawful ground under Article 6(1)(f) if the personal data being processed related to children. However, the age and general vulnerability of data subjects are already important factors for direct marketing organisations when considering whether the processing is justified. The ICO already provides specific guidance for controllers carrying out this balancing test in relation to children’s data. The fact that a data subject is a child, and the age of the child in question, will still be relevant factors to take into account in this process. For these reasons, the Government consider this amendment unnecessary.
My Lords, am I to take it from that that none of the changes currently in the Bill will expose children on a routine basis to direct marketing?
As is the case today and will be going forward, direct marketing organisations will be required to perform the balancing test; and as in the ICO guidance today and, no doubt, going forward—
I am sorry if I am a little confused—I may well be—but the balancing test that is no longer going to be there allows a certain level of processing, which was the subject of the first amendment. The suggestion now is that children will be protected by a balancing test. I would love to know where that balancing test exists.
The balancing test remains there for legitimate interests, under Article 6(1)(f).
Amendment 16 seeks to prevent organisations that undertake third-party marketing relying on the legitimate interest lawful ground under Article 6(1)(f) of the UK GDPR. As I have set out, organisations can rely on that ground for processing personal data without consent when they are satisfied that they have a legitimate interest to do so and that their commercial interests are not outweighed by the rights and interests of data subjects.
Clause 5(4) inserts in Article 6 new paragraph (9), which provides some illustrative examples of activities that may constitute legitimate interests, including direct marketing activities, but it does not mean that they will necessarily be able to process personal data for that purpose. Organisations will need to assess on a case-by-case basis where the balance of interest lies. If the impact on the individual’s privacy is too great, they will not be able to rely on the legitimate interest lawful ground. I should emphasise that this is not a new concept created by this Bill. Indeed, the provisions inserted by Clause 5(4) are drawn directly from the recitals to the UK GDPR, as incorporated from the EU GDPR.
I recognise that direct marketing can be a sensitive—indeed, disagreeable—issue for some, but direct marketing information can be very important for businesses as well as individuals and can be dealt with in a way that respects people’s privacy. The provisions in this Bill do not change the fact that direct marketing activities must be compliant with the data protection and privacy legislation and continue to respect the data subject’s absolute right to opt out of receiving direct marketing communications.
Amendment 17 would make sure that the processing of employee data for “internal administrative purposes” is subject to heightened safeguards, particularly when it relates to health. I understand that this amendment relates to representations made by the National AIDS Trust concerning the level of protection afforded to employees’ health data. We agree that the protection of people’s HIV status is vital and that it is right that it is subject to extra protection, as is the case for all health data and special category data. We have committed to further engagement and to working with the National AIDS Trust to explore solutions in order to prevent data breaches of people’s HIV status, which we feel is best achieved through non-legislative means given the continued high data protection standards afforded by our existing legislation. As such, I hope that the noble Lord, Lord Clement-Jones, will agree not to press this amendment.
Amendment 18 seeks to allow businesses more confidently to rely on the existing legitimate interest lawful ground for the transmission of personal data within a group of businesses affiliated by contract for internal administrative purposes. In Clause 5, the lists of activities in proposed new paragraphs (9) and (10) are intended to be illustrative of the types of activities that may be legitimate interests for the purposes of Article 6(1)(f). They are focused on processing activities that are currently listed in the recitals to the EU GDPR but are simply examples. Many other processing activities may be legitimate interests for the purposes of Article 6(1)(f) of the UK GDPR. It is possible that the transmission of personal data for internal administrative purposes within a group affiliated by contract may constitute a legitimate interest, as may many other commercial activities. It would be for the controller to determine this on a case-by-case basis after carrying out a balancing test to assess the impact on the individual.
Finally, I turn to the clause stand part debate that seeks to remove Clause 7 from the Bill. I am grateful to the noble Lord, Lord Clement-Jones, for this amendment because it allows me to explain why this clause is important to the success of the UK-US data access agreement. As noble Lords will know, that agreement helps the law enforcement agencies in both countries tackle crime. Under the UK GDPR, data controllers can process personal data without consent on public interest grounds if the basis for the processing is set out in domestic law. Clause 7 makes it clear that the processing of personal data can also be carried out on public interest grounds if the basis for the processing is set out in a relevant international treaty such as the UK-US data access agreement.
The agreement permits telecommunications operators in the UK to disclose data about serious crimes to law enforcement agencies in the US, and vice versa. The DAA has been operational since October 2022, and disclosures made by UK organisations under it are already lawful under the UK GDPR. Recent ICO guidance confirms this, but the Government want to remove any doubt in the minds of UK data controllers that disclosures under the DAA are permitted by the UK GDPR. Clause 7 makes it absolutely clear to telecoms operators in the UK that disclosures under the DAA can be made in reliance on the UK GDPR’s public task processing grounds; the clause therefore contributes to the continued, effective functioning of the agreement and to keeping the public in both the UK and the US safe.
For these reasons, I hope that the noble Lord, Lord Clement-Jones, will agree to withdraw his amendment.
My first reaction is “Phew”, my Lords. We are all having to keep to time limits now. The Minister did an admirable job within his limit.
I wholeheartedly support what the noble Baronesses, Lady Kidron and Lady Harding, said about Amendments 13 and 15 and what the noble Baroness, Lady Jones, said about her Amendment 12. I do not believe that we have yet got to the bottom of children’s data protection; there is still quite some way to go. It would be really helpful if the Minister could bring together the elements of children’s data about which he is trying to reassure us and write to us saying exactly what needs to be done, particularly in terms of direct marketing directed towards children. That is a real concern.
My Lords, Amendment 19 is consequential on my more substantive Clauses 114 and 115 stand part notices, which are also in this group. I am grateful to the noble Lord, Lord Clement-Jones, for his support.
These amendments all relate to the 150 or so pages of late amendments tabled in the Commons on Report and therefore not given adequate scrutiny before now. No real explanation has been given for why the Government felt it necessary to table the amendments in this way, and this group of amendments comes under the heading of so-called “democratic engagement”. Clause 113 extends a soft opt-in for direct mail marketing to further charitable or political objectives, while Clause 114 goes further and allows the Secretary of State to change the direct marketing rules through secondary legislation for the purpose of democratic engagement. This would allow the Government, in the run-up to an election, to switch off the direct mailing rules that apply to political parties.
Like many others, we are highly suspicious of the Government’s motives in introducing these amendments in the run-up to this election. Although we do not have a problem with a softer opt-in for direct mailing for charities, the application of Clause 114 to political parties gives politicians carte blanche to mine voters’ data given in good faith for completely different purposes. It would allow voters to be bombarded with calls, texts and personalised social media without their explicit consent.
When you consider these proposals in the context of other recent moves by the Government to make it harder for some people to vote and to vastly increase the amount of money that can be spent on campaigning in the run-up to an election, you have to wonder what the Government are up to, because these measures have certainly not been requested by Labour. In fact, these measures were not supported by the majority of respondents to the Government’s initial consultation, who wanted the existing rules upheld.
The Advertising Association has told us that it is concerned that switching off the rules could result in an increase in poor practice, such as political lobbying under the guise of research. This is apparently a practice known as “plugging”. It referred us to a report from the previous Information Commissioner on how political parties manage data protection, which provided key recommendations for how political parties could improve. These included providing clearer information about how data will be used and being more transparent about how voters are profiled and targeted via social media platforms. This is the direction our democratic engagement should be going in, with stronger and more honest rules that treat the electorate with respect, not watering down the rules that already exist.
When these proposals were challenged in the Commons on Report, the Minister, John Whittingdale, said:
“We have no immediate plans to use the regulation powers”.—[Official Report, Commons, 29/11/23; col. 912.]
If that is the case, why do the Government not take the proposals off the table, go back to the drawing board by conducting a proper consultation and test whether there is any appetite for these changes? They should also involve the Information Commissioner at an early stage, as he has already gone on record to say that this is
“an area in which there are significant potential risks to people if any future policy is not implemented very carefully”.
Finally, if there are to be any changes, they should be subject to full parliamentary scrutiny and approval.
We believe that Clauses 114 and 115 are taking us in fundamentally the wrong direction, against the interests of the electorate. I look forward to the Minister’s response, but I give notice now that, unless the Government adopt a very different strategy on this issue, we will return to this on Report. I beg to move.
My Lords, I follow the noble Baroness, Lady Jones of Whitchurch, with pleasure, as I agree with everything that she just said. I apologise for having failed to notice this in time to attach my name; I certainly would have done, if I had had the chance.
As the noble Baroness said, we are in an area of great concern for the level of democracy that we already have in our country. Downgrading it further is the last thing that we should be looking at doing. Last week, I was in the Chamber looking at the statutory instrument that massively increased the spending limits for the London mayoral and assembly elections and other mayoral elections—six weeks before they are held. This is a chance to spend an enormous amount of money; in reality, it is the chance for one party that has the money, from donations from interesting and dubious sources such as the £10 million, to bombard voters in deeply dubious and concerning ways.
We see a great deal of concern about issues such as deepfakes, what might happen in the next general election, and malicious actors and foreign actors potentially interfering in our elections. We have to make sure, however, that the main actors conduct elections fairly on the ground. As the noble Baroness, Lady Jones, just set out, this potentially drives a coach and horses through that. As she said, these clauses did not get proper scrutiny in the Commons—as much as that ever happens. As I understand it, there is the potential for us to remove them entirely later, but I should like to ask the Minister some direct questions, to understand what the Government’s intentions are and how they understand the meaning of the clauses.
Perhaps no one would have any problems with these clauses if they were for campaigns to encourage people to register to vote, given that we do not have automatic voter registration, as so many other countries do. Would that be covered by these clauses? If someone were conducting a “get out the vote” campaign in a non-partisan way, simply saying, “Please go out and vote. The election is on this day. You will need to bring along your voter ID”, would it be covered by these clauses? What about an NGO campaigning to stop a proposed new nuclear power station, or a group campaigning for stronger regulations on pesticides or for the Government to take stronger action against ultra-processed food? How do those kinds of politics fit with Clauses 114 and 115? As they are currently written, I am not sure that it is clear what is covered.
There is cause for deep concern, because no justification has been made for these two clauses. I look forward to hearing the Minister’s responses.
My Lords, this weekend, as I was preparing for the amendments to which I have put my name, I made the huge mistake of looking at the other amendments being discussed. As a result, I had a look at this group. I probably should declare an interest as the wife of a Conservative MP; therefore, our household is directly affected by this amendment and these clause stand part notices. I wholeheartedly agree with everything said by the noble Baronesses, Lady Jones and Lady Bennett of Manor Castle.
I have two additional points to make, because I am horrified by these clauses. First, did I miss something, in that we are now defining an adult as being 14-plus? At what point did that happen? I thought that you had the right to vote at 18, so I do not understand why electoral direct marketing should be free to bombard our 14 year-olds. That was my first additional point.
Secondly, I come back to what I said on the first day of Committee: this is all about trust. I really worry that Clauses 114 and 115 risk undermining two important areas where trust really matters. The first is our electoral system and the second is the data that we give our elected representatives, when we go to them not as party representatives but as our representatives elected to help us.
My Lords, it is a pleasure to follow the noble Baronesses, Lady Harding and Lady Bennett, after the excellent introduction to the amendments in this group by the noble Baroness, Lady Jones. The noble Baroness, Lady Harding, used the word “trust”, and this is another example of a potential hidden agenda in the Bill. Again, it is destructive of any public trust in the way their data is curated. This is a particularly egregious example, without, fundamentally, any explanation. Sir John Whittingdale said that a future Government
“may want to encourage democratic engagement in the run up to an election by temporarily ‘switching off’ some of the direct marketing rules”.—[Official Report, Commons, 29/11/2023; col. 885.]
Nothing to see here—all very innocuous; but, as we know, in the past the ICO has been concerned about even the current rules on the use of data by political parties. It seems to me that, without being too Pollyannaish about this, we should be setting an example in the way we use the public’s data for campaigning. The ICO, understandably, is quoted as saying during the public consultation on the Bill that this is
“an area in which there are significant potential risks to people if any future policy is not implemented very carefully”.
That seems an understatement, but that is how regulators talk. It is entirely right to be concerned about these provisions.
Of course, they are hugely problematic, but they are particularly problematic given that it is envisaged that young people aged 14 and older should be able to be targeted by political parties when they cannot even vote, as we have heard. This would appear to contravene one of the basic principles of data protection law: that you should not process more personal data than you need for your purposes. If an individual cannot vote, it is hard to see how targeting them with material relating to an election is a proportionate interference with their privacy rights, particularly when they are a child. The question is, should we be soliciting support from 14 to 17 year-olds during elections when they do not have votes? Why do the rules need changing so that people can be targeted online without having consented? One of the consequences of these changes would be to allow a Government to switch off—the words used by Sir John Whittingdale—direct marketing rules in the run-up to an election, allowing candidates and parties to rely on “soft” opt-in to process data and make other changes without scrutiny.
Exactly as the noble Baroness, Lady Jones, said, respondents to the original consultation on the Bill wanted political communications to be covered by existing rules on direct marketing. Responses were very mixed on the soft opt-in, and there were worries that people might be encouraged to part with more of their personal data. More broadly, why are the Government changing the rules on democratic engagement if they say they will not use these powers? What assessment have they made of the impact of the use of the powers? Why are the powers not being overseen by the Electoral Commission? If anybody is going to have the power to introduce the ability to market directly to voters, it should be the Electoral Commission.
All this smacks of taking advantage of financial asymmetry. We talked about competition asymmetry with big tech when we debated the digital markets Bill; similarly, this seems a rather sneaky way of taking advantage of the financial resources one party might have versus others. It would allow it to do things other parties cannot, because it has granted itself permission to do that. The provisions should not be in the hands of any Secretary of State or governing party; if anything, they should be in entirely independent hands; but, even then, they are undesirable.
My Lords, I thank the noble Baroness, Lady Jones, for tabling her amendments. Amendment 19 would remove processing which is necessary for the purposes of democratic engagement from the list of recognised legitimate interests. It is essential in a healthy democracy that registered political parties, elected representatives and permitted participants in referendums can engage freely with the electorate without being impeded unnecessarily by data protection legislation.
The provisions in the Bill will mean that these individuals and organisations do not have to carry out legitimate interest assessments or look for a separate legal basis. They will, however, still need to comply with other requirements of data protection legislation, such as the data protection principles and the requirement for processing to be necessary.
On the question posed by the noble Baroness about the term “democratic engagement”, it is intended to cover a wide range of political activities inside and outside election periods. These include but are not limited to democratic representation; communicating with electors and interested parties; surveying and opinion gathering; campaigning activities; activities to increase voter turnout; supporting the work of elected representatives, prospective candidates and official candidates; and fundraising to support any of these activities. This is reflected in the drafting, which incorporates these concepts in the definition of democratic engagement and democratic engagement activities.
The ICO already has guidance on the use of personal data by political parties for campaigning purposes, which the Government anticipate it will update to reflect the changes in the Bill. We will of course work with the ICO to make sure that it is familiar with our plans for commencement and that commencement does not benefit any party over another.
On the point made about the appropriate age for the provisions, in some parts of the UK the voting age is 16 for some elections, and children can join the electoral register at 14 as attainers. An attainer is someone who is registered to vote in advance of being able to do so, to allow them to be on the electoral roll as soon as they turn the required age; in some parts of the UK, such as Scotland, a person can register as an attainer at 14. The age of 14 therefore reflects the variations in voting age across the nations. Children aged 14 and over are often politically engaged and are approaching voting age. The Government consider it important that political parties and elected representatives can engage freely with this age group—
I am interested in what the Minister says about the age of attainers. Surely it would be possible to remove attainers from those who could be subject to direct marketing. Given how young attainers could be, it would protect them from the unwarranted attentions of campaigning parties and so on. I do not see that as a great difficulty.
Indeed. It is certainly worth looking at, but I remind noble Lords that such communications have to be necessary, and the test of their being necessary for someone of that age is obviously more stringent.
But what is the test of necessity at that age?
The processor has to determine whether it is necessary to the desired democratic engagement outcome to communicate with someone at that age. But I take the point: for the vast majority of democratic engagement communications, 14 would be far too young to make that a worthwhile or necessary activity.
As I recall, the ages are on the electoral register.
I am not aware one way or the other, but I will happily look into that to see what further safeguards we can add so that we are not bombarding people who are too young with this material.
May I make a suggestion to my noble friend the Minister? It might be worth asking the legal people to get the right wording, but if there are different ages at which people can vote in different parts of the United Kingdom, surely it would be easier just to relate it to the age at which they are able to vote in those elections. That would address a lot of the concerns that many noble Lords are expressing here today.
My Lords, this whole area of democratic engagement is one that the Minister will need to explain in some detail. This is an Alice in Wonderland schedule: “These words mean what I want them to mean”. If, for instance, you are engaging with the children of a voter—at 14, they are children—is that democratic engagement? You could drive a coach and horses through Schedule 1. The Minister used the word “necessary”, but he must give us rather more than that. It was not very reassuring.
The Minister mentioned a presumption that the ICO will update its guidance. Is there a timeframe for that? Will the guidance be updated before this comes into effect? How does the age of 14 relate to the AADC, which sets the age of adulthood at 18?
Before the Minister replies, we may as well do the full round. I agree with him, in that I very much believe in votes at 16 and possibly younger. I have been on many a climate demonstration with young people of 14 and under, so they can be involved, but the issue here is bigger than age. The main issue is not age but whether anybody should be subjected to a potential barrage of material in which they have not in any way expressed an interest. I am keen to make sure that this debate is not diverted to the age question and that we do not lose the bigger issue. I wanted to say that I sort of agree with the Minister on one element.
I agree with the noble Baroness, but with one rider. We will keep coming back to the need for children to have a higher level of data protection than adults, and this is but one of many examples we will debate. However, I agree with her underlying point. The reason why I support removing both these clauses is the hubris of believing that you will engage the electorate by bombarding them with things they did not ask to receive.
A fair number of points were made there. I will look at ages under 16 and see what further steps, in addition to being necessary and proportionate, we can think about to provide some reassurance. Guidance would need to be in effect before any of this is acted on by any of the political parties. I and my fellow Ministers will continue to work with the ICO—
I am sorry to press the Minister, but does the Bill state that guidance will be in place before this comes into effect?
I am not sure whether it is written in the Bill. I will check, but the Bill would not function without the existence of the guidance.
I am sorry to drag this out but, on the guidance, can we be assured that the Minister will involve the Electoral Commission? It has a great deal of experience here; in fact, it has opined in the past on votes for younger cohorts of the population. It seems highly relevant to seek out its experience and the benefits of that.
I would of course be very happy to continue to engage with the Electoral Commission.
We will continue to work with the ICO to make sure that it is familiar with the plans for commencement and that its plans for guidance fit into that. In parts of the UK where the voting age is 18 and the age of attainment is 16, it would be more difficult for candidates and parties to show that it was necessary or proportionate to process the personal data of 14 and 15 year-olds in reliance on the new lawful ground. In this context, creating an arbitrary distinction between children at or approaching voting age and adults may not be appropriate; in particular, many teenagers approaching voting age may be more politically engaged than some adults. These measures will give parties and candidates a clear lawful ground for engaging them in the process. Accepting this amendment would remove the benefits of greater ease of identification of a lawful ground for processing by elected representatives, candidates and registered political parties, which is designed to improve engagement with the electorate. I therefore hope that the noble Baroness, Lady Jones, will withdraw her amendment.
I now come to the clause stand part notice that would remove Clause 114, which gives the Secretary of State a power to make exceptions to the direct marketing rules for communications sent for the purposes of democratic engagement. As Clause 115 defines terms for the purposes of Clause 114, the noble Baroness, Lady Jones, is also seeking for that clause to be removed. Under the current law, many of the rules applying to electronic communications sent for commercial marketing apply to messages sent by registered political parties, elected representatives and others for the purposes of democratic engagement. It is conceivable that, after considering the risks and benefits, a future Government might want to treat communications sent for the purposes of democratic engagement differently from commercial marketing. For example, in areas where voter turnout is particularly low or there is a need to increase engagement with the electoral process, a future Government might decide that the direct marketing rules should be modified. This clause stand part notice would remove that option.
We have incorporated several safeguards that must be met prior to regulations being laid under this clause. They include the Secretary of State having specific regard to the effect the exceptions could have on an individual’s privacy; a requirement to consult the Information Commissioner and other interested parties, as the Secretary of State considers appropriate; and the regulations being subject to parliamentary approval via the affirmative procedure.
For these reasons, I hope that the noble Baroness will agree to withdraw or not press her amendments.
My Lords, I am pleased that I have sparked such a lively debate. When I tabled these amendments, it was only me and the noble Lord, Lord Clement-Jones, so I thought, “This could be a bit sad, really”, but it has not been. Actually, it has been an excellent debate and we have identified some really good issues.
As a number of noble Lords said, the expression “democratic engagement” is weasel words: what is not to like about democratic engagement? We all like it. Only when you drill down into the proposals do you realise the traps that could befall us. As noble Lords and the noble Baroness, Lady Bennett, rightly said, we have to see this in the context of some of the other moves the Government are pursuing in trying to skew the electoral rules in their favour. I am not convinced that this is as saintly as the Government are trying to pretend.
The noble Baroness, Lady Harding, is absolutely right: this is about trust. It is about us setting an example. Of all the things we can do on data protection that we have control over, we could at least show the electorate how things could be done, so that they realise that we, as politicians, understand how precious their data is and that we do not want to misuse it.
I hope we have all knocked on doors, and I must say that I have never had a problem engaging with the electorate, and actually they have never had a problem engaging with us. This is not filling a gap that anybody has identified. We are all out there and finding ways of communicating that, by and large, I would say the electorate finds perfectly acceptable. People talk to us, and they get the briefings through the door. That is what they expect an election campaign to be about. They do not expect, as the noble Baroness, Lady Harding, said, to go to see their MP about one thing and then suddenly find that they are being sent information about something completely different or that assumptions are being made about them which were never the intention when they gave the information in the first place. I just feel that there is something slightly seedy about all this. I am sorry that the Minister did not pick up a little more on our concerns about all this.
There are some practical things that I think it was helpful for us to have talked about, such as the Electoral Commission. I do not think that it has been involved up to now. I would like to know in more detail what its views are on all this. It is also important that we come back to the Information Commissioner and check in more detail what his view is on all this. It would be nice to have guidance, but I do not think that that will be enough to satisfy us in terms of how we proceed with these amendments.
The Minister ultimately has not explained why this has been introduced at this late stage. He is talking about this as though conceivably, in the future, a Government might want to adopt these rules. If that is the case, I respectfully say that we should come back at that time with a proper set of proposals that go right through the democratic process that we have here in Parliament, scrutinise it properly and make a decision then, rather than being bounced into something at a very late stage.
I have to say that I am deeply unhappy at what the Minister has said. I will obviously look at Hansard, but I may well want to return to this.
My Lords, I rise to speak to a series of minor and technical, yet necessary, government amendments which, overall, improve the functionality of the Bill. I hope the Committee will be content if I address them together. Amendments 20, 42, 61 and 63 are minor technical amendments to references to special category data in Clauses 6 and 14. All are intended to clarify that references to special category data mean data within the scope of Article 9(1) of the UK GDPR. They are simply designed to improve the clarity of the drafting.
I turn now to the series of amendments that clarify how time periods within the data protection legal framework are calculated. For the record, these are Amendments 136, 139, 141, 149, 151, 152, 176, 198, 206 to 208, 212 to 214, 216, 217, 253 and 285. Noble Lords will be aware that the data protection legislation sets a number of time periods or deadlines for certain things to happen, such as responding to subject access requests; in other words, at what day, minute or hour the clock starts and stops ticking in relation to a particular procedure. The Data Protection Act 2018 expressly applies the EU-derived rules on how these time periods should be calculated, except in a few instances where it is more appropriate for the UK domestic approach to apply, for example time periods related to parliamentary procedures. I shall refer to these EU-derived rules as the time periods regulation.
In response to the Retained EU Law (Revocation and Reform) Act 2023, we are making it clear that the time periods regulation continues to apply to the UK GDPR and other regulations that form part of the UK’s data protection and privacy framework, for example, the Privacy and Electronic Communications (EC Directive) Regulations 2003. By making such express provision, our aim is to ensure consistency and continuity and to provide certainty for organisations, individuals and the regulator. We have also made some minor changes to existing clauses in the Bill to ensure that application of the time periods regulation achieves the correct effect.
Secondly, Amendment 197 clarifies that the requirement to consult before making regulations that introduce smart data schemes may be satisfied by a consultation before the Bill comes into force. The regulations must also be subject to affirmative parliamentary scrutiny to allow Members of both Houses to scrutinise legislation. This will facilitate the rapid implementation of smart data schemes, so that consumers and businesses can start benefiting as soon as possible. The Government are committed to working closely with business and wider stakeholders in the development of smart data.
Furthermore, Clause 96(3) protects data holders from the levy that may be imposed to meet the expenses of persons and bodies performing functions under smart data regulations. This levy cannot be imposed on data holders that do not appear capable of being directly affected by the exercise of those functions.
Amendment 196 extends that protection to authorised persons and third-party recipients on whom the levy may also be imposed. Customers will not have to pay to access their data, only for the innovative services offered by third parties. We expect that smart data schemes will deliver significant time and cost savings for customers.
The Government are committed to balancing the incentives for businesses to innovate and provide smart data services with ensuring that all customers are empowered through their data use and do not face undue financial barriers or digital exclusion. Any regulations providing for payment of the levy or fees will be subject to consultation and to the affirmative resolution procedure in Parliament.
Amendments 283 and 285 to Schedule 15 confer a general incidental power on the information commission. It will have the implied power to do things incidental to or consequential upon the exercise of its functions, for example, to hold land and enter into agreements. This amendment makes those implicit powers explicit for the avoidance of doubt and in line with standard practice. It does not give the commission substantive new powers. I beg to move.
My Lords, I know that these amendments were said to be technical amendments, so I thought I would just accept them, but when I saw the wording of Amendment 283 some alarm bells started ringing. It says:
“The Commission may do anything it thinks appropriate for the purposes of, or in connection with, its functions”.
I know that the Minister said that this is stating what the commission is already able to do, but I am concerned whenever I see those words anywhere. They give a blank cheque to any authority or organisation.
Many noble Lords will know that I have previously spoken about the principal-agent theory in politics, in which certain powers are delegated to an agency or regulator, but what accountability does it have? I worry when I see that it “may do anything … appropriate” to fulfil its tasks. I would like some assurance from the Minister that there is a limit to what the information commission can do and some accountability. At a time when many of us are asking who regulates the regulators and when we are looking at some of the arm’s-length bodies—need I mention the Post Office?—there is some real concern about accountability.
I understand the reason for wanting to clarify or formalise what the Minister believes the information commission is doing already, but I worry about this form of words. I would like some reassurance that it is not wide-ranging and that there is some limit and accountability to future Governments. I have seen this sentiment across the House; people are asking who regulates the regulators and to whom are they accountable.
My Lords, I must congratulate the noble Lord, Lord Kamall. Amid a blizzard of technical and minor amendments from the Minister, he forensically spotted one to raise in that way. He is absolutely right. The Industry and Regulators Committee has certainly been examining the accountability and scrutiny devoted to regulators, so we need to be careful in the language that we use. I think we have to take a lot on trust from the Minister, particularly in Grand Committee.
I apparently failed to declare an interest at Second Reading. I forgot to state that I am a consultant to DLA Piper and the Whips have reminded me today that I failed to do so on the first day in Committee, so I apologise to the Committee for that. I am not quite sure why my consultancy with DLA Piper is relevant to the data protection Bill, but there it is. I declare it.
I should also declare an interest. I apologise that I did not do so earlier. I worked with a think tank and wrote a series of papers on who regulates the regulators. I still have a relationship with that think tank.
My Lords, I have been through this large group and, apart from my natural suspicion that there might be something dastardly hidden away in it, I am broadly content, but I have a few questions.
On Amendment 20, can the Minister confirm that the new words “further processing” have the same meaning as the reuse of personal data? Can he confirm that Article 5(1)(b) will prohibit this further processing when it is not in line with the original purpose for which the data was collected? How will the data subject know that is the case?
On Amendment 196, to my untutored eye it looks like the regulation-making power is being extended away from the data holder to include authorised persons and third-party recipients. My questions are simple enough: was this an oversight on the part of the original drafters of that clause? Is the amendment an extension of those captured by the effect of the clause? Is it designed to achieve consistency across the Bill? Finally, can I assume that an authorised person or third party would usually be someone acting on behalf of, or as an agent of, the data holder?
I presume that Amendments 198, 212 and 213 are needed because of a glitch in the drafting—similarly with Amendment 206. I can see that Amendments 208, 216 and 217 clarify when time periods begin, but why are the Government seeking to disapply time periods in Amendment 253 when surely some consistency is required?
Finally—I am sure the Minister will be happy about this—I am all in favour of flexibility, but Amendment 283 states that the Information Commissioner has the power to do things to facilitate the exercise of his functions. The noble Lord, Lord Kamall, picked up on this. We need to understand what those limits are. On the face of it, one might say that the amendment is sensible, but it seems rather general and broad in its application. As the noble Lord, Lord Kamall, rightly said, we need to see what the limits of accountability are. This is one of those occasions.
I thank the noble Lords, Lord Kamall and Lord Bassam, for their engagement with this group. On the questions from the noble Lord, Lord Kamall, these are powers that the ICO would already have at common law. As I am given to understand is now standard practice with all Bills, they are put on a statutory footing in the Bill; the purpose is simply to align with that practice. This does not confer substantive new powers but clarifies the powers that the regulator already has. I can also confirm that the ICO was and remains accountable to Parliament.
I am sorry to intervene as I know that noble Lords want to move on to other groups, but the Minister said that the ICO remains accountable to Parliament. Will he clarify how it is accountable to Parliament for the record?
The Information Commissioner is directly accountable to Parliament in that he makes regular appearances in front of Select Committees that scrutinise the regulator’s work, including progress against objectives.
The noble Lord, Lord Bassam, made multiple important and interesting points. I hope he will forgive me if I undertake to write to him about those; there is quite a range of topics to cover. If there are any on which he requires answers right away, he is welcome to intervene.
I want to be helpful to the Minister. I appreciate that these questions are probably irritating but I carefully read through the amendments and aligned them with the Explanatory Notes. I just wanted some clarification to make sure that we are clear on exactly what the Government are trying to do. “Minor and technical” covers a multitude of sins; I know that from my own time as a Minister.
Indeed. I will make absolutely sure that we provide a full answer. By the way, I sincerely thank the noble Lord for taking the time to go through what is perhaps not the most rewarding of reads but is useful none the less.
On the question of the ICO being responsible to Parliament, in the then Online Safety Bill and the digital markets Bill we consistently asked for regulators to be directly responsible to Parliament. If that is something the Government believe they are, we would like to see an expression of it.
I would be happy to provide such an expression. I will be astonished if that is not the subject of a later group of amendments. I have not yet prepared for that group, I am afraid, but yes, that is the intention.
My Lords, in moving Amendment 24, I will speak also to Amendment 26. I welcome the amendments in the name of the noble Lord, Lord Clement-Jones.
Together, these amendments go to the heart of questioning why the Government have found it necessary to change the grounds for the refusal of a subject access request from “manifestly unfounded” to “vexatious or excessive”. At the moment, Article 15 of the UK GDPR gives data subjects a right of access to find out what personal information an organisation holds on them, how it is using it and whether it is sharing it. This right of access is key to transparency and often underpins people’s ability to exercise other data rights and human rights; for example, it bears on an individual’s right to privacy under Article 8 of the ECHR and their right to non-discrimination under Article 14 of the same.
The Equality and Human Rights Commission has raised specific concerns about these proposals, arguing that subject access requests
“are a vital mechanism for data subjects to exercise their fundamental rights to privacy and freedom from discrimination”.
It argues that these rights will be even more vital as AI systems are rolled out, using personal information
“in ways that may be less than transparent to data subjects”.
So we must be suspicious as to why these changes are being made and whether they are likely to reduce the legitimate opportunities for data subjects to access their personal information.
This comes back to the mantra of the noble Lord, Lord Clement-Jones, regarding a number of the clauses we have dealt with and, I am sure, ones we have yet to deal with: why are these changes necessary? That is the question we pose as well. Is it simply to give greater clarity, as the Minister in the Commons claimed; or is it to lighten the burden on business—the so-called Brexit dividend—which would result in fewer applications being processed by data controllers? Perhaps the Minister could clarify whether data subject rights will be weakened by these changes.
In the Commons, the Minister, John Whittingdale, also argued that some data search requests are disproportionate when the information is of low importance or low relevance to the data subject. However, who has the right to make that decision? How is a data controller in a position to judge how important the information is to an individual? Can the Minister clarify whether the data controller would have the right to ask the data subject their reasons for requesting the information? This is not permitted under the current regime.
A number of stakeholders have argued that the new wording is too subjective and is open to abuse by data controllers who find responding to such requests, by their very nature, vexatious or excessive. For a busy data operator, any extra work could be seen as excessive. Although the Information Commissioner has said that he is clear how these words should be applied, he has also said that they are open to numerous interpretations. Therefore, there is a rather urgent need for the Information Commissioner to provide clear statutory guidance on the application of the terms, so that only truly disruptive requests can be rejected. Perhaps the Minister can clarify whether this is the intention.
In the meantime, our Amendment 24 aims to remove the easy get-out clause for refusing a request by making it clear that the resources available to the controller should not, by themselves, be a reason for rejecting an application for information. There is an inevitable cost involved in processing requests, and we need to ensure that it does not become the standard excuse for denying data subjects their rights. Our Amendment 26 would require the data controller to produce evidence of why a request is considered vexatious or excessive if it is being denied. It should not be possible to assert this as a reason without providing the data subject with a clear and justifiable explanation. Amendment 25, from the noble Lord, Lord Clement-Jones, has a similar intent.
We remain concerned about the changes and the impact they will have on established data and human rights. As a number of stakeholders have argued, access to personal data and its uses underpins so many other rights that can be enforced by law. We should not give these rights away easily or without proper justification. I look forward to hearing what the Minister has to say, but without further clarification in the Bill, I doubt whether our concerns will be assuaged. I beg to move.
My Lords, I will say a little bit about my intention to delete this clause altogether. Clause 9 significantly changes the data and privacy landscape, and for the worse. The Constitution Committee’s report on the Bill, published on 25 January, noted:
“Clause 9 amends Article 12 of the UK GDPR to broaden the basis for refusal”—
not for enhancing, but for refusal—
“of a data access request by providing more leeway to ‘data controllers’”.
In the world we live in, there is a huge imbalance of power between corporations, governments, public bodies and individuals. People must have a right to know what information is held about them, and how and when it is used. It is vital in order to check abuses and hold powerful elites to account.
The request for information can, at the moment, be wholly or partly denied, depending on the circumstances. It can be refused if it is considered to be manifestly unfounded or manifestly excessive. These phrases, “manifestly unfounded” and “manifestly excessive”, are fairly well understood. There is already a lot of case law on that. Clause 9, however, lowers the threshold for refusing information from “manifestly unfounded or excessive” to “vexatious or excessive”.
My Lords, it is a pleasure to follow the noble Lord, Lord Sikka. He raised even more questions about Clause 9 than I ever dreamed of. He has illustrated the real issues behind the clause and why it is so important to debate whether it should stand part, because, in our view, the clause should certainly be removed from the Bill. It would seriously limit people’s ability to access information about how their personal data is collected and used. We are back to the dilution of data subject rights, within which the rights of data subject access are, of course, vital. This includes limiting access to information about automated decision-making processes to which people are subject.
A data subject is someone who can be identified directly or indirectly by personal data, such as a name, an ID number, location data, or information relating to their physical, economic, cultural or social identity. Under existing law, data subjects have a right to request confirmation of whether their personal data is being processed by a controller, to access that personal data and to obtain information about how it is being processed. The noble Lord, Lord Sikka, pointed out that there is ample precedent for how the controller can refuse a request from a data subject only if it is manifestly unfounded or excessive. The meaning of that phrase is well established.
There are three main ways in which Clause 9 limits people’s ability to access information about how their personal data is being collected and used. First, it would lower the threshold for refusing a request from “manifestly unfounded or excessive” to “vexatious or excessive”. This is an inappropriately low threshold, given the nature of a data subject access request—namely, a request by an individual for their own data.
Secondly, Clause 9 would insert a new mandatory list of considerations for deciding whether the request is vexatious or excessive. This includes vague considerations, such as
“the relationship between the person making the request (the ‘sender’) and the person receiving it (the ‘recipient’)”.
The very fact that the recipient holds data relating to the sender means that there is already some form of relationship between them.
Thirdly, the weakening of an individual’s right to obtain information about how their data is being collected, used or shared is particularly troubling given the simultaneous effect of the provisions in Clause 10, which means that data subjects are less likely to be informed about how their data is being used for additional purposes other than those for which it was originally collected, in cases where the additional purposes are for scientific or historical research, archiving in the public interest or statistical purposes. Together, the two clauses mean that an individual is less likely to be proactively told how their data is being used, while it is harder to access information about their data when requested.
In the Public Bill Committee in the House of Commons, the Minister, Sir John Whittingdale, claimed that:
“The new parameters are not intended to be reasons for refusal”,
but rather to give
“greater clarity than there has previously been”.—[Official Report, Commons, Data Protection and Digital Information Bill Committee, 16/5/23; cols. 113-14.]
But it was pointed out by Dr Jeni Tennison of Connected by Data in her oral evidence to the committee that the impact assessment for the Bill indicates that a significant proportion of the savings predicted would come from lighter burdens on organisations dealing with subject access requests as a result of this clause. This suggests that, while the Government claim that this clause is a clarification, it is intended to weaken obligations on controllers and, correspondingly, the rights of data subjects. Is that where the Secretary of State’s £10 billion of benefit from this Bill comes from? On these grounds alone, Clause 9 should be removed from the Bill.
We also oppose the question that Clause 12 stand part of the Bill. Clause 12 provides that, in responding to subject access requests, controllers are required only to undertake a
“reasonable and proportionate search for the personal data and other information”.
This clause also appears designed to weaken the right of subject access and will lead to confusion for organisations about what constitutes a reasonable and proportionate search in particular circumstances. The right of subject access is central to individuals’ fundamental rights and freedoms because it is a gateway to exercising other rights, either within the data subject rights regime or in relation to other legal rights, such as the rights to equality and non-discrimination. Again, the lowering of rights compared with the EU creates obvious risks; data adequacy is a continuing theme of our debates.
Clause 12 does not provide a definition for reasonable and proportionate searches, but when introducing the amendment, Sir John Whittingdale suggested that a search for information may become unreasonable or disproportionate
“when the information is of low importance or of low relevance to the data subject”.—[Official Report, Commons, 29/11/23; col. 873.]
Those considerations diverge from those provided in the Information Commissioner’s guidance on the right of access, which states that, when determining whether searches may be unreasonable or disproportionate, the data controller must consider the circumstances of the request, any difficulties involved in finding the information and the fundamental nature of the right of access.
We also continue to be concerned about the impact assessment for the Bill and the Government’s claims that the new provisions in relation to subject access requests are for clarification only. Again, Clause 12 appears to have the same impact as Clause 9 in the kinds of savings that the Government seem to imagine will emerge from the lowering of subject access rights. This is a clear dilution of subject access rights, and this clause should also be removed from the Bill.
We always allow for belt and braces; if our urging does not lead to the Minister agreeing to remove Clauses 9 and 12, at the very least we should have the new provisions set out either in Amendment 26, in the name of the noble Baroness, Lady Jones of Whitchurch, or in Amendment 25, which proposes that a data controller who refuses a subject access request must give reasons for the refusal and tell the data subject about their right to seek a remedy. That is absolutely the bare minimum, but I would far prefer to see the deletion of Clauses 9 and 12 from the Bill.
As ever, I thank noble Lords for raising and speaking to these amendments. I start with the stand part notices on Clauses 9 and 36, introduced by the noble Lord, Lord Clement-Jones. Clauses 9 and 36 clarify the new threshold to refuse or charge a reasonable fee for a request that is “vexatious or excessive”. Clause 36 also clarifies that the Information Commissioner may charge a fee for dealing with, or refuse to deal with, a vexatious or excessive request made by any persons and not just data subjects, providing necessary certainty.
From looking at the wording of the Members’ explanatory statements for wishing to leave out Clauses 9 and 36, I do not think that the Minister has addressed this, but does he accept that the Bill now provides a more lax approach? Is this a reduction of the standard expected? To me, “vexatious or excessive” sounds very different from “manifestly unfounded or excessive”. Does he accept that basic premise? That is really the core of the debate; if it is not, we have to look again at the issue of resources, which seems to be the argument to make this change.
If that is the case and this is a dilution, is this where the Government think they will get the savings identified in the impact assessment? It was alleged in the Public Bill Committee that this is where a lot of the savings would come from—we all have rather different views. My first information was that every SME might save about £80 a year; then, suddenly, the Secretary of State started talking about £10 billion of benefit from the Bill. Clarification of that would be extremely helpful. There seems to be a dichotomy between the noble Lord, Lord Bassam, saying that this is a way to reduce the burdens on business and the Minister saying that it is all about confident refusal and confidence. He has used that word twice, which is worrying.
I apologise for intervening, but the Minister referred to resources. By that, he means the resources for the controller but, as I said earlier, there is no consideration of what the social cost may be. If this Bill had already become law, how would the victims of the Post Office scandal have been able to secure any information? Under this Bill, the threshold for providing information will be much lower than it is under the current legislation. Can the Minister say something about how the controllers will take social cost into account or how the Government have taken that into account?
First, on the point made by the noble Lord, Lord Bassam—not to be argumentative; I am sure that there is much discussion to be had—the intention is absolutely not to lower the standard for a well-intended request.
Sadly, a number of requests are made that are not well intended but cynical, with the aim of disrupting an organisation. I can give a few examples. For instance, some requests are deliberately made with minimal time between them. Some are made to circumvent the process of legal disclosure in a trial. Some are made for other reasons designed to disrupt an organisation. The intent of using “vexatious” is not in any way to reduce well-founded, or even partially well-founded, attempts to secure information; it is to filter out these less desirable, more cynical attempts.
But the two terms have a different legal meaning, surely.
The actual application of the terms will be set out in guidance by the ICO but the intention is to filter out the more disruptive and cynical ones. Designing these words is never an easy thing but there has been considerable consultation on this in order to achieve that intention.
My Lords—sorry; it may be that the Minister was just about to answer my question. I will let him do so.
I will have to go back to the impact assessment but I would be astonished if that was a significant part of the savings promised. By the way, the £10.6 billion—or whatever it is—in savings was given a green rating by the body that assesses these things; its name eludes me. It is a robust calculation. I will check and write to the noble Lord, but I do not believe that a significant part of that calculation leans on the difference between “vexatious” and “manifestly unfounded”.
It would be very useful to have the Minister respond on that but, of course, as far as the impact assessment is concerned, a lot of this depends on the Government’s own estimates of what this Bill will produce—some of which are somewhat optimistic.
My Lords, can we join in with the request to see that information in a letter? We would like to see where these savings will be made and how much will, as noble Lords have said, be affected by the clauses that we are debating today.
The noble Baroness, Lady Jones, has given me an idea: if an impact assessment has been made, clause by clause, it would be extremely interesting to know just where the Government believe the golden goose is.
I am not quite sure what is being requested because the impact assessment has been not only made but published.
I see—so noble Lords would like an analysis of the different components of the impact assessment. It has been green-rated by the independent Regulatory Policy Committee. I have just been informed by the Box that the savings from these reforms to the wording of SARs are valued at less than 1% of the benefit of more than £10 billion that this Bill will bring.
That begs the question of where on earth the rest is coming from.
Which I will be delighted to answer. With this interesting exchange, I have lost in my mind the specific questions that the noble Lord, Lord Sikka, asked but I am coming on to some of his other ones; if I do not give satisfactory answers, no doubt he will intervene and ask again.
I appreciate the further comments made by the noble Lord, Lord Sikka, about the Freedom of Information Act. I hope he will be relieved to know that this Bill does nothing to amend that Act. On his accounting questions, he will be aware that most SARs are made by private individuals to private companies. The Government are therefore not involved in that process and do not collect the kind of information that he described.
Following the DPDI Bill, the Government will work with the ICO to update guidance on subject access requests. Guidance plays an important role in clarifying what a controller should consider when relying on the new “vexatious or excessive” provision. The Government are also exploring whether a code of practice on subject access requests can best address the needs of controllers and data subjects.
On whether Clause 12 should stand part of the Bill, Clause 12 is only putting on a statutory footing what has already been established—
My apologies. The Minister just said that the Government do not collect the data. Therefore, what is the basis for changing the threshold? No data, no reasonable case.
The Government do not collect details of private interactions between those raising SARs and the companies they raise them with. The business case is based on extensive consultation—
I hope that the Government have some data about government departments and the public bodies over which they have influence. Can he provide us with a glimpse of how many requests are received, how many are rejected at the outset, how many go to the Information Commissioner, what the cost is and how the cost is computed? At the moment, it sounds like the Government want to lower the threshold without any justification.
As I say, I do not accept that the threshold is being lowered. On the other hand, I will undertake to find out what information can reasonably be provided. Again, as I said, the independent Regulatory Policy Committee gave the business case a green rating; that is a high standard and gives credibility to the business case calculations, which I will share.
The reforms keep reasonable requests free of charge and instead seek to ensure that controllers can refuse or charge a reasonable fee for requests that are “vexatious or excessive”, which can consume a significant amount of time and resources. However, the scope of the current provision is unclear and, as I said, there are a variety of circumstances where controllers would benefit from being able confidently to refuse or charge the fee.
The Minister used the phrase “reasonable fee”. Can he provide some clues on that, especially for the people who may request information? We have around 17.8 million individuals living on less than £12,570 a year. So, from what perspective is the fee reasonable, and how is it determined?
“Reasonable” would be set out in the guidance to be created by the ICO but it would need to reflect the costs and affordability. The right of access remains of paramount importance in the data protection framework.
Lastly, as I said before on EU data adequacy, the Government maintain an ongoing dialogue with the EU and believe that our reforms are compatible with maintaining our data adequacy decisions.
For the reasons I have set out, I am not able to accept these amendments. I hope that noble Lords will therefore agree to withdraw or not press them.
I thank all noble Lords who have spoken in this debate. I am grateful to my noble friend Lord Sikka for rightly sharing the Constitution Committee’s concerns that, on the face of it, this broadens the basis for the refusal of data access requests. He made an important point about the costs needing to be balanced against the social costs of refusing requests and the social impact that there may be, particularly where employment or access to public services is concerned.
At the heart of this is that we need to ensure that data controllers are not making subjective judgments about whether a request is reasonable. The Minister says that the Information Commissioner will produce guidance. This is important, as that guidance will be absolutely crucial to making a judgment about whether we think this new regime will be credible. The Minister introduced a new phrase: that the intention is to support “well-intended” requests. Well, then we need to start defining “well intended”. I think we will chase these phrases round and round before we get some proper clarification; it would have helped if it had been in the Bill.
We have also gone round and round a bit on whether the changes in the wording weaken the rights of data subjects and whether they save money. The Minister talked about the 1% saving. I am fascinated by that because it does not seem very much; if it is not very much, why are we doing it? We come back to all of this again. I do not quite know what we are hoping to achieve here.
I will need to look at what the Minister said but we need a lot more clarification on this to be reassured that data subjects will not be refused more and more access to the information they want. I was disappointed to hear the Minister say that the controller can consider resources because that seems to me to be the ultimate get-out clause: if a controller can say that they cannot afford to do the data search, does not that mean that individual rights can be ignored just on that basis? That seems too easy; if somebody does not want to do the piece of work, that is an obvious get-out clause, so I remain concerned about the Minister’s response to that amendment as well.
We have explored a lot of this in a lot of different ways and we have had a good debate. I will look again at Hansard but, for the moment, I beg leave to withdraw my amendment.
My Lords, in moving Amendment 27 in my name, I will also express my support for Amendments 28 to 34. I thank my noble friend Lord Black, the noble Baroness, Lady Jones, and the noble Lord, Lord Clement-Jones, for supporting and signing a number of these amendments.
This is quite a specific issue compared to the matters of high policy that we have been debating this afternoon. There is a specific threat to the continuing ability of companies to use the open electoral register for marketing purposes without undue burdens. Some 37% of registered voters choose not to opt out of their data being used for direct marketing via the open electoral register, so quite a significant proportion of the population openly agrees that that data can be used for direct marketing. It is an essential resource for accurate postal addresses and for organisations such as CACI—I suspect that a number of us speaking have been briefed by it; I thank it for its briefing—and it has been used for more than 40 years without detriment to consumers and with citizens’ full knowledge. The very fact that 63% of people on the electoral register have opted out tells you that this is a conscious choice that people have knowingly made.
Why is it in doubt? A recent First-tier Tribunal ruling stated, by implication, that every company using open electoral register data must, by 20 May 2024, notify individuals at their postal addresses whenever their data on the electoral register is used, and that cost cannot be considered “disproportionate effort”. That means that organisations using the electoral roll would need to contact 24.2 million individuals between now and the middle of May, making it practically and financially unviable to use the electoral register at scale.
This group of amendments to Clause 11 aims to address this issue. I fully acknowledge that we have tried to hit the target with a number of shots in this group. I encourage the Minister, first, to acknowledge that this is a real problem that the Bill should be able to address and, secondly, if the wording of individual amendments is not effective or has unintended consequences that we have missed, to respond appropriately.
To be clear, the amendments provide legal certainty about the use of the open electoral register without compromising any aspect of the data privacy of UK citizens or risking data adequacy. The amendments specify that companies are exempt from the requirement to provide individuals with information, in cases where their personal data has not been obtained from them directly, if that data was obtained from the open electoral register. They provide further clarification of what constitutes “disproportionate effort” under new paragraph (e) in Article 14(5) of the GDPR. These additional criteria include the effort and cost of compliance, the damage and distress caused to data subjects, and the reasonable expectations of data subjects—expectations evidenced by the percentage of people who choose not to opt out.
Why is this a problem that we need to fix? First, if we do not fix it, we might create in the physical world the very problem that parts of the Bill are trying to address in the digital world: the bombarding of people with lots of information that they do not want to receive—lots of letters telling us that a company is using the electoral roll that we gave it permission to use in the first place. It will also inadvertently give more power to social media companies, because it will make targeted physical direct marketing much harder, so SMEs will be forced into a pretty oligopolistic market for social media targeting. Finally, it will mean that we lose jobs and reduce productivity at a time when we are trying to do the opposite.
This is quite a simple issue and there is cross-party support. It is not an issue of great philosophical import, but for the companies in this space it is very real, and for the people working in this industry it is about their jobs. Inch by inch, we need to look at things that improve productivity rather than actively destroy it—especially where people have already agreed to the use of their data. With that, I note the hour and I beg to move.
My Lords, I support Amendments 27 to 34, tabled variously by my noble friend Lady Harding, and the noble Lord, Lord Clement-Jones, to which I have added my name. As this is the first time I have spoken in Committee, I declare my interests as deputy chairman of the Telegraph Media Group and president of the Institute of Promotional Marketing and note my other declarations in the register.
The direct marketing industry is right at the heart of the data-driven economy, which is crucial not just to the future of the media and communications industries but to the whole basis of the creative economy, which will power economic growth into the future. The industry has quite rightly welcomed the Bill, which provides a long-term framework for economic growth as well as protecting customers.
However, there is one area of great significance, as my noble friend Lady Harding has just eloquently set out, on which this Bill needs to provide clarity and certainty going forward, namely, the use of the open electoral register. That register is an essential resource for a huge number of businesses and brands, as well as many public services, as they try to build new audiences. As we have heard, it is now in doubt because of a recent legal ruling that could, as my noble friend said, lead to people being bombarded with letters telling them that their data on the OER has been used. That is wholly disproportionate and is not in the interests of the marketing and communications industry or customers.
These sensible amendments would simply confirm the status quo that has worked well for so long. They address the issue by providing legal certainty around the use of the OER. I believe they do so in a proportionate manner that does not in any way compromise any aspect of the data privacy of UK citizens. I urge the Minister carefully to consider these amendments. As my noble friend said, there are considerable consequences of not acting for the creative economy, jobs in direct marketing, consumers, the environment and small businesses.
My Lords, I am extremely grateful to the noble Baroness, Lady Harding, and the noble Lord, Lord Black, for doing all the heavy lifting on these amendments. I of course support them, having put forward my own amendments; it is just the luck of the draw that the noble Baroness, Lady Harding, put forward her amendment along with all the others. I have very little to say in this case, and just echo what the noble Lord, Lord Black, said about the fact that the open electoral register has played an important part in the direct marketing, data-driven economy, as it is described. It is particularly interesting that he mentioned the creative industries as well.
The First-tier Tribunal precedent could impact on other public sources of data, including the register of companies, the register of judgments, orders and fines, the land register and the Food Standards Agency register. It could have quite far-reaching implications unless we manage to resolve the issue. There is a very tight timescale: the First-tier Tribunal’s ruling means that companies must notify those on the electoral register by 20 May or be at risk of breaching the law. This Bill is really the best route for trying to resolve the issue. Secondly, the First-tier Tribunal’s ruling states that costs cannot be considered as disproportionate effort; that is why these amendments explicitly refer to cost. This is no trivial matter. It is a serious area that needs curing by this Bill, which is a good opportunity to do so.
I shall speak briefly to the question of whether Clause 11 as a whole should stand part. That may seem a bit paradoxical, but my objection is designed to address issues arising under Article 13, not Article 14. Article 13 of the UK GDPR—as opposed to Article 14, whose obligations apply to personal data not obtained from the data subject—requires controllers who intend to process data that was collected directly from data subjects for a new purpose to inform those data subjects of various matters to the extent necessary,
“to ensure fair and transparent processing”.
Clause 11(1) removes this obligation for certain purposes where it would require disproportionate effort. The obligation is already qualified to what is necessary to make processing fair and transparent—the fundamental requirements of the GDPR. If, in these circumstances, processing cannot be made fair and transparent without disproportionate effort, then it should not take place. Clause 11(1) would sidestep the requirement and allow unfair and untransparent processing to go ahead for personal data that the data controllers had themselves collected. Perhaps I should have tabled a rather more targeted amendment, but I hope that noble Lords get the point of the difference between Article 13 and Article 14 in this respect.
My Lords, I rise briefly to support the amendments in the name of my noble friend Lady Harding and the others in this group. She has comprehensively explained their importance; they may not be philosophical, as she says, but they have practical importance. One of the most compelling reasons for us to act is as she so precisely described: if we do not, we create a situation in the real world that the Bill seeks to address in the digital world.
Although this is about direct marketing, allied to it are pressures on advertising revenues and the greater control that is being taken by the larger platforms in this area all the time. The effect that has on revenues means that this is an important issue that deserves a proper response from the Government. I hope that my noble friend the Minister acts in the way that we want by, if not accepting one of these amendments, coming forward with something from the Government.
My Lords, I can also be relatively brief. I thank all noble Lords who have spoken and the noble Baroness, Lady Harding, and the noble Lord, Lord Clement-Jones, for their amendments, to many of which I have added my name.
At the heart of this debate is what constitutes a disproportionate effort or impossibility exemption for providing information to individuals when their data is not collected directly from them. Amendments 29 to 33 provide further clarity on how exemptions on the grounds of disproportionate effort should be interpreted—for example, by taking into account whether there would be a limited impact on individuals, whether they would be caused any distress, what their reasonable expectations were in the first place and whether the information had been made publicly available by a public body. All these provide some helpful context, which I hope the Minister will take on board.
I have also added my name to Amendments 27 and 28 from the noble Baroness, Lady Harding. They address the particular concerns about those using the open electoral register for direct marketing purposes. As the noble Baroness explained, the need for these amendments arises from the legal ruling that companies using the OER must first notify individuals at their postal addresses whenever their data is being used. As has been said, given that individuals already have an opt-out when they register on the electoral roll, it would seem unnecessary and impractical for companies using the register to follow up with individuals each time they want to access their data. These amendments seek to close that loophole and return the arrangements to the previous incarnation, which seemed to work well.
All the amendments provide useful forms of words but, as the noble Baroness, Lady Harding, said, if the wording is not quite right, we hope that the Minister will help us to craft something that is right and that solves the problem. I hope that he agrees that there is a useful job of work to be done on this and that he provides some guidance on how to go about it.
I thank my noble friend Lady Harding for moving this important amendment. I also thank the cosignatories—the noble Lords, Lord Clement-Jones and Lord Black, and the noble Baroness, Lady Jones. As per my noble friend’s request, I acknowledge the importance of this measure and the difficulty of getting the judgment quite right. It is a difficult balance, and I will do my best to provide some reassurance, but I welcomed hearing the wise words of all those who spoke.
I turn first to the clarifying Amendments 27 and 32. I reassure my noble friend Lady Harding that, in my view, neither is necessary. Clause 11 amends the drafting of the list of cases when the exemption under Article 14(5) applies but the list closes with “or”, which makes it clear that you need to meet only one of the criteria listed in paragraph (5) to be exempt from the transparency requirements.
I turn now to Amendments 28 to 34, which collectively aim to expand the grounds of disproportionate effort to exempt controllers from providing certain information to individuals. The Government support the use of public data sources, such as the OER, which may be helpful for innovation and may have economic benefits. Sometimes, providing this information is simply not possible or is disproportionate. Existing exemptions apply when the data subject already has the information or in cases where personal data has been obtained from someone other than the data subject and it would be impossible to provide the information or disproportionate effort would be required to do so.
We must strike the right balance between supporting the use of these datasets and ensuring transparency for data subjects. We also want to be careful about protecting the integrity of the electoral register, open or closed, to ensure that it is used within the data subject’s reasonable expectations. The exemptions that apply when the data subject already has the information or when there would be a disproportionate effort in providing the information must be assessed on a case-by-case basis, particularly if personal data from public registers is to be combined with other sources of personal data to build a profile for direct marketing.
These amendments may infringe on transparency—a key principle in the data protection framework. The right to receive information about what is happening to your data is important for exercising other rights, such as the right to object. This could be seen as going beyond what individuals might expect to happen to their data.
The Government are not currently convinced that these amendments would be sufficient to prevent negative consequences to data subject rights and confidence in the open electoral register and other public registers, given the combination of data from various sources to build a profile—that was the subject of the tribunal case being referenced. Furthermore, the Government’s view is that there is no need to amend Article 14(6) explicitly to include the “reasonable expectation of the data subjects” as the drafting already includes reference to “appropriate safeguards”. This, in conjunction with the fairness principle, means that data controllers are already required to take this into account when applying the disproportionate effort exemption.
The above notwithstanding, the Government understand that the ICO may explore this question as part of its work on guidance in the future. That seems a better way of addressing this issue in the first instance, ensuring the right balance between the use of the open electoral register and the rights of data subjects. We will continue to work closely with the relevant stakeholders involved and monitor the situation.
I wonder whether I heard my noble friend correctly. He said “may”, “could” and “not currently convinced” several times, but, for the companies concerned, there is a very real, near and present deadline. How is my noble friend the Minister suggesting that deadline should be considered?
On the first point, I used the words carefully because the Government cannot instruct the ICO specifically on how to act in any of these cases. The question about the May deadline is important. With the best will in the world, none of the provisions in the Bill are likely to be in effect by the time of that deadline in any case. That being the case, I would feel slightly uneasy about advising the ICO on how to act.
My Lords, I am not quite getting from the Minister whether he has an understanding of and sympathy with the case that is being made or whether he is standing on ceremony on its legalities. Is he saying, “No, we think that would be going too far”, or that there is a good case and that guidance or some action by the ICO would be more appropriate? I do not get the feeling that somebody has made a decision about the policy on this. It may be that conversations with the Minister between Committee and Report would be useful, and it may be early days yet until he hears the arguments made in Committee; I do not know, but it would be useful to get an indication from him.
Yes. I repeat that I very much recognise the seriousness of the case. There is a balance to be drawn here. In my view, the best way to identify the most appropriate balancing point is to continue to work closely with the ICO, because I strongly suspect that, at least at this stage, it may be very difficult to draw a legislative dividing line that balances the conflicting needs. That said, I am happy to continue to engage with noble Lords on this really important issue between Committee and Report, and I commit to doing so.
On the question of whether Clause 11 should stand part of the Bill, Clause 11 extends the existing disproportionate effort exemption to cases where the controller collected the personal data directly from the data subject and intends to carry out further processing for research purposes, subject to the research safeguards outlined in Clause 26. This exemption is important to ensure that life-saving research can continue unimpeded.
Research holds a privileged position in the data protection framework because, by its nature, it is viewed as generally being in the public interest. The framework has various exemptions in place to facilitate and encourage research in the UK. During the consultation, we were informed of various longitudinal studies, such as those into degenerative neurological conditions, where it is impossible or nearly impossible to recontact data subjects. To ensure that this vital research can continue unimpeded, Clause 11 provides a limited exemption that applies only to researchers who are complying with the safeguards set out in Clause 26.
The noble Lord, Lord Clement-Jones, raised concerns that Clause 11 would allow unfair processing. I assure him that this is not the case, as any processing that uses the disproportionate effort exemption in Article 13 must comply with the overarching data protection principles, including lawfulness, fairness and transparency, so that even if data controllers rely on this exemption they should consider other ways to make the processing they undertake as fair and transparent as possible.
Finally, returning to EU data adequacy, the Government recognise its importance and, as I said earlier, are confident that the proposals in Clause 11 are complemented by robust safeguards, which reinforces our view that they are compatible with EU adequacy. For the reasons that I have set out, I am unable to accept these amendments, and I hope that noble Lords will not press them.
My Lords, I am not quite sure that I understand where my noble friend the Minister is on this issue. The noble Lord, Lord Clement-Jones, summed it up well in his recent intervention. I will try to take at face value my noble friend’s assurances that he is happy to continue to engage with us on these issues, but I worry that he sees this as two sides of an issue—I hear from him that there may be some issues and there could be some problems—whereas we on all sides of the Committee have set out a clear black and white problem. I do not think they are the same thing.
I appreciate that the wording might create some unintended consequences, but I have not really understood what my noble friend’s real concerns are, so we will need to come back to this on Report. If anything, this debate has made it even clearer to me that it is worth pushing for clarity on this. I look forward to ongoing discussions with a cross-section of noble Lords, my noble friend and the ICO to see if we can find a way through to resolve the very real issues that we have identified today. With that, and with thanks to all who have spoken in this debate, I beg leave to withdraw my amendment.
My Lords, this is the first group of amendments covering issues relating to automated decision-making—one of the most interesting areas of data use but also one of the most contested and, for the public at large, one of the most controversial and difficult to navigate. The development of AI and data systems that easily enable automated decisions could offer huge efficiencies for consumers of public services. Equally, the use of such systems can, if used and regulated in the wrong way, have a devastating impact on people’s lives. If we have learned one thing from the Horizon scandal, it is simply that, in the wrong hands and with the wrong system in place, the misuse of data can destroy lives and livelihoods.
Our country has a massive social security system, which includes everything from pension payments to disability income support and, of course, the universal credit system, which covers people entitled to in-work and out-of-work benefits. Over 22 million people receive DWP benefits of one sort or another. If automated decisions make errors in this field the potential to damage lives is enormous, as I am sure the Minister will appreciate.
I turn to the four amendments in the group in the name of my noble friend Lady Jones. Amendments 36 and 37 seek to amend new Article 22A of the UK GDPR and make it clear that protection is provided for profiling operations that lead to decisions. This is important, not least because the clause further reduces the scope for the human review of automated decision-making. Profiling is used as part of this process, and these amendments seek to protect individual data subjects from its effect. We take the view that it is essential that humans are involved in making significant decisions about data subjects.
Amendment 40 also makes it clear that, in the context of the new Article 22A, for human involvement to be considered meaningful, the review of the decision must be completed by a competent person. One of the positive changes made by the Bill is the introduction of the concept of “meaningful human involvement” in a decision. Meaningful human review is a key component for achieving an appropriate level of oversight over automated decision-making, for protecting individuals from unfair treatment and for offering an avenue for redress. The aim of the amendment is to bring more clarity to what “meaningful human involvement” should consist of. It would require that a review be performed by a person with the necessary competence, training and understanding of the data and, of course, the authority to alter the decision.
Our Amendment 109 is not so much about building protections as introducing something new and adding to the strength of what is already there. Users have never been able to get personalised explanations of automated decisions but, given the impact that these can have, we feel that systems should be in place for people to understand why a computer has simply said yes or no.
As it stands, the Bill deletes Section 14 of the Data Protection Act 2018 in its entirety. Our amendment would undo that and then add personalisation in. The amendment would retain Section 14 of that Act, which is where most automated decision-making safeguards are currently detailed in law. It would introduce an entitlement for data subjects to receive a personalised explanation of an automated decision made about them. This is based on public attitudes research conducted by the Ada Lovelace Institute, which shows a clear demand for greater transparency over these sorts of decisions.
The amendment also draws on independent legal analysis commissioned by the Ada Lovelace Institute, which found that the generic nature of explanations provided under current law is insufficient for individuals to understand how they have been affected by automated decision-making. This was considered to be a major barrier to meaningful protection from, and redress for, harms caused by AI. As many noble Lords have made clear in these debates, we have put building trust at the heart of how we get the most from AI and, more particularly, ADM systems.
I turn to the amendments in the name of the noble Lord, Lord Clement-Jones. In essence, they are about—as the noble Lord will, I am sure, explain better than I possibly could—the level of engagement of individuals, as data subjects, in automated decision-making processes. The common thread through the amendments is that they raise the bar in terms of the safeguards for data subjects’ rights and freedoms. We have joined the noble Lord, Lord Clement-Jones, on Amendment 47, and might equally have added our names to the other amendments in the group, as we broadly support those too.
Amendment 38A, in the name of the noble Baroness, Lady Bennett, would place an additional requirement under new Article 22A to ensure human engagement in the automated decision-making processes.
I am sure the Committee will want more than warm words from the Minister when he comes to wind up the debate. For all of us, ADM is the here and now; it shapes how we use and consume public services and defines what and who we are. Reducing our protections from its downsides is not to be done lightly, and we cannot easily see how that can be justified. I want to hear from the Minister how the Government came to conclude that this was acceptable, not least because, as we will hear in later debates on the Bill, the Government are seeking powers that provide for invasive bulk access to potentially every citizen’s bank account. I beg to move the amendment in the name of the noble Baroness, Lady Jones.
My Lords, it is a pleasure to follow the noble Lord, Lord Bassam, who has already set out very clearly what the group is about. I will chiefly confine myself to speaking to my Amendment 38A, which seeks to put in the Bill a clear idea of what having a human in the loop actually means: a human must interpret and assess the decision and, perhaps most crucially, be able to intervene in it and in any information on which it is based.
Noble Lords will be aware of many situations that have already arisen in which artificial intelligence is used—although I would say that what we currently describe as artificial intelligence is, in real terms, not truly that at all. What we have is very large-scale use of big data and, as the noble Lord, Lord Bassam, said, big data can be a very useful and powerful tool for many positive purposes. However, we know that the quality of decision-making often depends on the quality of the data going in. A human is able to see whether something looks astray or wrong; there is a kind of intelligence that humans apply to this which machines simply do not have the capacity for.
I pay tribute to Justice, the law reform and human rights organisation, which produced an excellent briefing on the issues around Clause 14. It asserts that the clause, as currently written, inadequately protects individuals from automated harm.
The noble Lord, Lord Bassam, referred to the Horizon case in the UK; that is the obvious example, but the Robodebt case in Australia—which likewise hit some of the most vulnerable people—is another where crunching big data, and then crunching down on individuals, had truly awful outcomes. We know that there is a real risk of unfairness and discrimination in the use of these kinds of tools. I note that the UK has signed the Bletchley declaration, which says that
“AI should be designed, developed, deployed, and used, in a manner that is … human-centric, trustworthy and responsible”.
I focus particularly on “human-centric”: human beings can sympathise with and understand other human beings in a way that big data simply does not.
I draw a parallel with something covered by a special Select Committee of your Lordships’ House last year: lethal autonomous weapon systems, or so-called killer robots. This is an obvious example of where there is a very strong argument for having a human in the loop, as the terminology goes. From what I last heard about this, I am afraid that the UK Government are not fully committed to a human in the loop in the case of killer robots, but I hope that we get to that point.
When we talk about how humans’ data is used and managed, we are also talking about situations that are—almost equally—life and death: whether people get a benefit, whether they are fairly treated and whether they do not suddenly disappear off the system. Only this morning, I was reading a case study of a woman aged over 80, highlighting how she had been through multiple government departments, but could not get her national insurance number. Without a national insurance number, she could not get the pension to which she was entitled. If there is no human in the loop to cut through those kinds of situations, there is a real risk that people will find themselves just going around and around machines—a circumstance with which we are personally all too familiar, I am sure. My amendment is an attempt to put a real explanation in the Bill for having that human in the loop.
My Lords, the number of amendments proposed to Clause 14 reflects the Committee’s very real concern about the impact of automated decision-making on the privacy, safety and prospects of UK data subjects. I have specific amendments in groups 7 and 8, so I will speak to the impact of Clause 14 on children later. I will again be making arguments about the vulnerability of these systems in relation to the Government’s proposals on the DWP.
Without repeating the arguments made, I associate myself with most of the proposals and the intention behind them—the need to safeguard the prospects of a fair outcome when algorithms hold sway over a person’s future. It seems entirely logical that, if the definition of solely automated decision-making requires “no meaningful human involvement”, we should be clear, as Amendment 40 proposes, about what is considered “meaningful”, so that the system cannot be gamed by providing human involvement that is an ineffective safeguard and therefore not meaningful.
I have sympathy with many of these amendments—Amendments 38A, 39, 47, 62, 64 and 109—and ultimately believe, as was suggested by the noble Lord, Lord Bassam, that it is a matter of trust. I refer briefly to the parliamentary briefing from the BMA, which says boldly:
“Clause 14 risks eroding trust in AI”.
That would be a very sad outcome.
My Lords, we have heard some powerful concerns on this group already. This clause sits in one of the most significant parts of the Bill for the future. The Government’s AI policy is of long standing: they started it many years ago, then had a National AI Strategy in 2021, followed by a road map, a White Paper and a consultation response to the White Paper. Yet that thinking does not seem to be woven into this part of the Bill, which is overtly about artificial intelligence and automated decision-making, at all.
As ever, I thank the noble Baroness, Lady Jones, and the noble Lord, Lord Clement-Jones, for their detailed consideration of Clause 14, and all other noble Lords who spoke so well. I carefully note the references to the DWP’s measure on fraud and error. For now, I reassure noble Lords that a human will always be involved in all decision-making relating to that measure, but I note that this Committee will have a further debate specifically on it later.
The Government recognise the importance of solely automated decision-making to the UK’s future success and productivity. These reforms ensure that it can be responsibly implemented, while any such decisions with legal or similarly significant effects have the appropriate safeguards in place, including the right to request a review of the decision by a human. These reforms clarify and simplify the rules related to solely automated decision-making without watering down any of the protections for data subjects or the fundamental data protection principles. In doing so, they will provide confidence to organisations looking to use these technologies in a responsible way, while driving economic growth and innovation.
The Government also recognise that AI presents huge opportunities for the public sector. It is important that AI is used responsibly and transparently in the public sector; we are already taking steps to build trust and transparency. Following a successful pilot, we are making the Algorithmic Transparency Reporting Standard—the ATRS—a requirement for all government departments, with plans to expand this across the broader public sector over time. This will ensure that there is a standardised way for government departments proactively to publish information about how and why they are using algorithms in their decision-making. In addition, the Central Digital and Data Office—the CDDO—has already published guidance on the procurement and use of generative AI for the UK Government and, later this year, DSIT will launch the AI management essentials scheme, setting a minimum good practice standard for companies selling AI products and services.
My Lords, could I just interrupt the Minister? It may be that he can get an answer from the Box to my question. One intriguing aspect is that, as the Minister said, the pledge is to bring the Algorithmic Transparency Reporting Standard into each government department, and there will be an obligation to use that standard. However, what compliance mechanism will there be to ensure that that is happening? Does the accountable Permanent Secretary have a duty to make sure that it is embedded in the department? Who has responsibility for that?
That is a fair question. I must confess that I do not know the answer. There will be mechanisms in place, department by department, I imagine, but one would also need to report on it across government. Either it will magically appear in my answer or I will write to the Committee.
The CDDO has already published guidance on the procurement and use of generative AI for the Government. We will consult on introducing this as a mandatory requirement for public sector procurement, using purchasing power to drive responsible innovation in the broader economy.
I turn to the amendments in relation to meaningful involvement. I will first take together Amendments 36 and 37, which aim to clarify that the safeguards mentioned under Clause 14 are applicable to profiling operations. New Article 22A(2) already clearly sets out that, in cases where profiling activity has formed part of the decision-making process, controllers have to consider the extent to which a decision about an individual has been taken by means of profiling when establishing whether human involvement has been meaningful. Clause 14 makes clear that a solely automated significant decision is one without meaningful human involvement and that, in these cases, controllers are required to provide the safeguards in new Article 22C. As such, we do not believe that these amendments are necessary; I therefore ask the noble Baroness, Lady Jones, not to press them.
Turning to Amendment 38, the Government are confident that the existing reference to “data subject” already captures the intent of this amendment. The existing definition of “personal data” makes it clear that a data subject is a person who can be identified, directly or indirectly. As such, we do not believe that this amendment is necessary; I ask the noble Lord, Lord Clement-Jones, whether he would be willing not to press it.
Amendments 38A and 40 seek to clarify that, for human involvement to be considered meaningful, the review must be carried out by a competent person. We feel that these amendments are unnecessary as meaningful human involvement may vary depending on the use case and context. The reformed clause already introduces a power for the Secretary of State to provide legal clarity on what is or is not to be taken as meaningful human involvement. This power is subject to the affirmative procedure in Parliament and allows the provision to be future-proofed in the wake of technological advances. As such, I ask the noble Baronesses, Lady Jones and Lady Bennett, not to press their amendments.
I am not sure I agree with that characterisation. The ATRS is a relatively new development. It needs time to bed in and needs to be bedded in on an agile basis in order to ensure not only quality but speed of implementation. That said, I ask the noble Lord to withdraw his amendment.
The Minister has taken us through what Clause 14 does and rebutted the need for anything other than “solely”. He has gone through the sensitive data and the special category data aspects, and so on, but is he reiterating his view that this clause is purely for clarification; or is he saying that it allows greater use of automated decision-making, in particular in public services, so that greater efficiencies can be found and therefore it is freeing up the public sector at the expense of the rights of the individual? Where does he sit in all this?
As I said, the intent of the Government is: yes to more automated data processing to take advantage of emerging technologies, but also yes to maintaining appropriate safeguards. The safeguards in the present system consist—if I may characterise it in a slightly blunt way—of providing quite a lot of uncertainty, so that people do not take the decision to positively embrace the technology in a safe way. By bringing in this clarity, we will see an increase not only in the safety of their applications but in their use, driving up productivity in both the public and private sectors.
My Lords, I said at the outset that I thought this was the beginning of a particular debate, and I was right, looking at the amendments coming along. The theme of the debate was touched on by the noble Baroness, Lady Bennett, when she talked about these amendments, in essence, being about keeping humans in the loop and the need for them to be able to review decisions. Support for that came from the noble Baroness, Lady Kidron, who made some important points. The point the BMA made about risking eroding trust cut to what we have been talking about all afternoon: trust in these processes.
The noble Lord, Lord Clement-Jones, talked about this effectively being the watering down of Article 22A, and the need for some core ethical principles in AI use and for the Government to ensure a right to human review. Clause 14 reverses the presumption of that human reviewing process, other than where solely automated decision-making exists, where it will be more widely allowed, as the Minister argued.
However, I am not satisfied by the responses, and I do not think other Members of your Lordships’ Committee will be either. We need more safeguards. We have moved from one clear position to another, which can be described as watering down or shifting the goalposts; I do not mind which, but that is how it seems to me. Of course, we accept that there are huge opportunities for AI in the delivery of public services, particularly in healthcare and the operation of the welfare system, but we need to ensure that citizens in this country have a higher level of protection than the Bill currently affords them.
At one point I thought the Minister said that a solely automated decision was a rubber-stamped decision. To me, that gave the game away. I will have to read carefully what he said in Hansard, but that is how it sounded, and it really gets our alarm bells ringing. I am happy to withdraw my amendment, but we will come back to this subject from time to time and throughout our debates on the rest of the Bill.
My Lords, this group, in which we have Amendments 41, 44, 45, 49, 50, 98A and 104A and have co-signed Amendments 46 and 48, aims to further the protections that we discussed in the previous group. We are delighted that the noble Lord, Lord Clement-Jones, and others joined us in signing various of these amendments.
The first amendment, Amendment 41, is a straight prohibition of any data processing that would contravene the Equality Act 2010. All legislation should conform to the terms of the Equality Act, so I expect the Minister to confirm that he is happy to accept that amendment. If he is not, I think the Committee will want to understand better why that is the case.
Amendment 44 to new Article 22B of the UK GDPR is, as it says, designed,
“to prevent data subjects from becoming trapped in unfair agreements and being unable to exercise their data rights”,
because of the contract terms. One might envisage some sensitive areas where the exercise of these rights might come into play, but there is nothing that I could see, particularly in the Explanatory Notes, which seeks to argue that point. We have no knowledge of when this might occur, and I see no reason why the legislation should be changed to that effect. Special category data can be used for automated decision-making only if certain conditions are met. It involves high-risk processing and, in our view, requires explicit consent.
The amendments remove performance of a contract as one of the requirements that allows the processing of special category data for reaching significant decisions based on automated processing. It is difficult to envisage a situation where it would be acceptable to permit special category data to be processed in high-risk decisions on a purely automated basis, simply pursuant to a contract where there is no explicit consent.
Furthermore, relying on performance of a contract for processing special category data removes the possibility for data subjects to exercise their data rights, for example the right to object and the ability to withdraw consent, and could trap individuals in unfair agreements. There is an implicit power imbalance between data subjects and data controllers when entering a contract, and people are often not given meaningful choices or options to negotiate the terms. It is usually a take-it-or-leave-it approach. Thus, removing the performance-of-a-contract criterion reduces the risks associated with ADM and creates a tighter framework for protection. This also aligns with the current wording of Article 9 of the UK GDPR.
Amendment 45 changes the second condition to include only decisions that are required or authorised by law, with appropriate safeguards, and that are necessary for reasons of substantial public interest. The safeguards are retained from Section 14 of the DPA 2018, with amendments to strengthen transparency provisions.
Amendment 49 seeks to ensure that the protections conferred by Article 22C of the UK GDPR would apply to decisions “solely or partly” based on ADM rather than just “solely”. This would help to maximise the protections that data subjects currently enjoy.
Amendment 50 is another strengthening measure, which would make sure that safeguards in the new Article 22C are alongside rather than instead of those contained in Articles 12 to 15.
Our Amendment 104A would insert a new Section into the 2018 Act, requiring data controllers who undertake high-risk processing in relation to work-related decisions or activities to carry out an additional algorithmic impact assessment and make reasonable mitigations in response to the outcome of that assessment.
I ought to have said earlier that Amendment 98A is a minor consequential amendment.
An improved workplace-specific algorithmic impact assessment is the best way to remedy clear deficiencies in Clause 20 as drafted, and it signals Labour’s international leadership and alignment with international regulatory and AI ethics initiatives. These are moving towards the pre-emptive evaluation of significant social and workplace impacts by responsible actors, combined with a procedure for ongoing monitoring, which is not always possible. It also moves towards our commitment to algorithmic assurance and will help to ensure that UK businesses are not caught up in what is sometimes described as the “Brussels effect”.
My Lords, the amendments in this group highlight that Clause 14 lacks the necessary checks and balances to uphold equality legislation, individual rights and freedoms, data protection rights, access to services, fairness in the exercise of public functions and workers’ rights. I add my voice to that of the noble Lord, Lord Clement-Jones, in his attempt to make Clause 14 not stand part, which he will speak to in the next group.
I note, as the noble Lord, Lord Bassam, has, that all the current frameworks have fundamental rights at their heart, whether it is the White House blueprint, the UN Secretary-General’s advisory body on AI, with which I am currently involved, or the EU’s AI Act. I am concerned that the UK does not want to work within this consensus.
With that in mind, I particularly note the importance of Amendment 41. As the noble Lord said, we are all supposed to adhere to the Equality Act 2010. I support Amendments 48 and 49, which are virtually interchangeable in wanting to ensure that the standard of decisions being “solely” based on automated decision-making cannot be gamed by adding a trivial human element to avoid that designation.
Again, I suggest that the Government cannot have it both ways—with nothing diminished but everything liberated and changed—so I find myself in agreement with Amendment 52A and Amendment 59A, which is in the next group, from the noble Lord, Lord Holmes, who is not in his place. These seek clarity from the Information Commissioner.
I turn to my Amendment 46. My sole concern is to minimise the impact of Clause 14 on children’s safety, privacy and life chances. The amendment provides that a significant decision about a data subject must not be based solely on automated processing if
“the data subject is a child or may be a child unless the provider is satisfied that the decision is in, and compatible with, the best interests of a child”,
taking into account the full gamut of their rights and development stage. Children have enhanced rights under the UNCRC, to which the UK is a signatory. Due to their evolving capacities as they make the journey from infancy to adulthood, they need special protections. If their rights are diminished in the digital world, their rights are diminished full stop. Algorithms determine almost every aspect of a child’s digital experience, from the videos they watch to their social network and from the sums they are asked to do in their maths homework to the team they are assigned when gaming. We have seen young boys wrongly profiled as criminal and girls wrongly associated with gangs.
In a later group, I will speak to a proposal for a code of practice on children and AI, which would codify standards and expectations for the use of AI in all aspects of children’s lives, but for now, I hope the Minister will see that, without these amendments to automated decision-making, children’s data protection will be clearly weakened. I hope he will agree to act to make true his earlier assertion that nothing in the Bill will undermine child protection. The Minister is the Minister for AI. He knows the impact this will have. I understand that, right now, he will probably stick to the brief, but I ask him to go away, consider this from the perspective of children and parents, and ask, “Is it okay for children’s life chances to be automated in this fashion?”
My Lords, I will speak to my Amendment 48. By some quirk of fate, I failed to sign up to the amendments that the noble Lord, Lord Bassam, so cogently introduced. I would have signed up if I had realised that I had not, so to speak.
It is a pleasure to follow the noble Baroness, Lady Kidron. She has a track record of being extremely persuasive, so I hope the Minister pays heed in what happens between Committee and Report. I very much hope that there will be some room for manoeuvre and that there is not just permanent push-back, with the Minister saying that everything is about clarifying and us saying that everything is about dilution. There comes a point when we have to find some accommodation on some of these areas.
Amendments 48 and 49 are very similar—I was going to say, “Great minds think alike”, but I am not sure that my brain feels like much of a great mind at the moment. “Partly” or “predominantly” rather than “solely”, if you look at it the other way round, is really the crux of what I think many of us are concerned about. It is easy to avoid the terms of Article 22 just by slipping in some sort of token human involvement. Defining “meaningful” is so difficult in these circumstances. I am concerned that we are opening the door to something that could be avoided. Even then, the terms of the new clause—we will have a clause stand part debate on Wednesday, obviously—put all the onus on the data subject, whereas that was not the case previously under Article 22. The Minister has not really explained why that change has been made.
I conclude by saying that I very much support Amendment 41. This whole suite of amendments is well drafted. The point about the Equality Act is extremely well made. The noble Lord, Lord Holmes, also has a very good amendment here. It seems to me that involving the ICO right in the middle of this will be absolutely crucial—and we are back to public trust again. If nothing else, I would like explicitly to include that under Clause 14 in relation to Article 22 by the time this Bill goes through.
I thank noble Lords and the noble Baroness for their further detailed consideration of Clause 14.
Let me take first the amendments that deal with restrictions on and safeguards for ADM and the degree of ADM. Amendment 41 aims to make clear that solely automated decisions that contravene any part of the Equality Act 2010 are prohibited. We feel that this amendment is unnecessary for two reasons. First, this is already the case under the Equality Act, which is reinforced by the lawfulness principle under the present data protection framework, meaning that controllers are already required to adhere to the Equality Act 2010. Secondly, explicitly stating in the legislation that contravening one type of legislation is prohibited—in this case, the Equality Act 2010—while not referring to other legislation whose contravention is equally prohibited will lead to an inconsistent approach. As such, we do not believe that this amendment is necessary; I ask the noble Baroness, Lady Jones, to withdraw it.
Amendment 44 seeks to limit the conditions for special category data processing for this type of automated decision-making. Again, we feel that this is not needed given that a set of conditions already provides enhanced levels of protection for the processing of special category data, as set out in Article 9 of the UK GDPR. In order to lawfully process special category data, you must identify both a lawful basis under Article 6 of the UK GDPR and a separate condition for processing under Article 9. Furthermore, where an organisation seeks to process special category data under solely automated decision-making on the basis that it is necessary for contract, in addition to the Articles 6 and 9 lawful bases, they would also have to demonstrate that the processing was necessary for substantial public interest.
Similarly, Amendment 45 seeks to apply safeguards when processing special category data; however, these are not needed as the safeguards in new Article 22C already apply to all forms of processing, including the processing of special category data, by providing sufficient safeguards for data subjects’ rights, freedoms and legitimate interests. As such, we do not believe that these amendments are necessary; I ask the noble Baroness, Lady Jones, not to press them.
Can the Minister give me an indication of the level at which that kicks in? For example, say there is a child in a classroom and a decision has been made about their ability in a particular subject. Is it automatic that the parent and the child get some sort of read-out on that? I would be curious to know where the Government feel that possibility starts.
In that example, where a child was subject to a solely ADM decision, the school would be required to inform the child of the decision and the reasons behind it. The child and their parent would have the right to seek a human review of the decision.
We may come on to this when we get to edtech but a lot of those decisions are happening automatically right now, without any kind of review. I am curious as to why it is on the school whereas the person actually doing the processing may well be a technology company.
It may be either the controller or the processor but for any legal or similarly significant decision right now—today—there is a requirement before the Bill comes into effect. That requirement is retained by the Bill.
In line with ICO guidance, children need particular protection when organisations collect and process their personal data because they may be less aware of the risks involved. If organisations process children’s personal data they should think about the need to protect them from the outset and should design their systems and processes with this in mind. This is the case for organisations processing children’s data during solely automated decision-making, just as it is for all processing of children’s data.
Building on this, the Government’s view is that automated decision-making has an important role to play in protecting children online, for example with online content moderation. The current provisions in the Bill will help online service providers understand how they can use these technologies and strike the right balance between enabling the best use of automated decision-making technology and continuing to protect the rights of data subjects, including children. As such, we do not believe that the amendment is necessary; I ask the noble Baroness if she would be willing not to press it.
Amendments 48 and 49 seek to extend the Article 22 provisions to “predominantly” and “partly” automated decision-making. These types of processing already involve meaningful human involvement. In such instances, other data protection requirements, including transparency and fairness, continue to apply and offer relevant protections. As such, we do not believe that these amendments are necessary; I ask the noble Baroness, Lady Jones, and the noble Lord, Lord Clement-Jones, if they would be willing not to press them.
Amendment 50 seeks to ensure that the Article 22C safeguards will apply alongside, rather than instead of, the transparency obligations in the UK GDPR. I assure the noble Baroness, Lady Jones, that the general transparency obligations in Articles 12 to 15 will continue to apply and thus will operate alongside the safeguards in the reformed Article 22. As such, we do not believe that this amendment is necessary; I ask the noble Baroness if she would be willing not to press it.
The changes proposed by Amendment 52A are unnecessary as Clause 50 already provides for an overarching requirement for the Secretary of State to consult the ICO and other persons that the Secretary of State considers appropriate before making regulations under the UK GDPR, including for the measures within Article 22. Also, any changes to the regulations are subject to the affirmative procedure so must be approved by both Houses of Parliament. As with other provisions of the Bill, the ICO will seek to provide organisations with timely guidance and support to assist them in interpreting and applying the legislation. As such, we do not believe that this amendment is necessary and, if he were here, I would ask my noble friend Lord Holmes if he would be willing not to press it.
Amendments 98A and 104A are related to workplace rights. Existing data protection legislation and our proposed reforms provide sufficient safeguards for automated decision-making where personal data is being processed, including in workplaces. The UK’s human rights law, and existing employment and equality laws, also ensure that employees are informed and consulted about any workplace developments, which means that surveillance of employees is regulated. As such, we do not believe that these amendments are necessary and I ask the noble Baroness not to move them.
I hear what the Minister said about the workplace algorithmic assessment. However, if the Government believe it is right to have something like an algorithmic recording standard in the public sector, why is it not appropriate to have something equivalent in the private sector?
I would not say it is not right, but if we want to make the ATRS a standard, we should make it a standard in the public sector first and then allow it to be adopted as a means for all private organisations using ADM and AI to meet the transparency principles that they are required to adopt.
So would the Minister not be averse to it? It is merely so that the public sector is ahead of the game, allowing it to show the way and then there may be a little bit of regulation for the private sector.
I am not philosophically averse to such regulation. As to implementing it in the immediate future, however, I have my doubts about that possibility.
My Lords, this has been an interesting and challenging session. I hope that we have given the Minister and his team plenty to think about—I am sure we have. A lot of questions remain unanswered, and although the Committee Room is not full this afternoon, I am sure that colleagues reading the debate will be studying the responses that we have received very carefully.
I am grateful to the noble Baroness, Lady Kidron, for her persuasive support. I am also grateful to the noble Lord, Lord Clement-Jones, for his support for our amendments. It is a shame the noble Lord, Lord Holmes, was not here this afternoon, but I am sure we will hear persuasively from him on his amendment later in Committee.
The Minister is to be congratulated on his consistency. I think I heard the phrase “not needed” or “not necessary” pretty constantly this afternoon, but particularly with this group of amendments. He probably topped the lot with his response to Amendment 41 on the Equality Act.
I want to go away with my colleagues to study the responses to the amendments very carefully. That being said, however, I am happy to withdraw Amendment 41 at this stage.
(8 months, 1 week ago)
Grand Committee
My Lords, once more into the trenches we go before Easter. In moving Amendment 53, I will also speak to Amendments 54, 55, 57, 69, 70, 71 and 72 and the Clause 14 stand part notice.
The Bill contains a number of wide delegated powers, giving the Secretary of State the power to amend the UK GDPR via statutory instrument. The Government have said that the UK GDPR’s key elements remain sound and that they want to continue to offer a high level of protection for the public’s data, but that is no guarantee against significant reforms being brought in through a process that eludes full parliamentary scrutiny through primary legislation. Proposed changes to the UK GDPR should be contained in the Bill, where they can be debated and scrutinised properly via the primary legislation process. As it stands, key provisions of the UK GDPR can subsequently be amended via statutory instrument, which, in this case, is an inappropriate legislative process that affords much less scrutiny and debate, if debates are held at all.
The UK GDPR treats a solely automated decision as one without “meaningful human involvement”. The public are protected from being subject to solely automated decision-making where the decision has a legal or “similarly significant effect”. Clause 14(1) inserts new Article 22D(1) into the UK GDPR, which allows the Secretary of State to make regulations that deem a decision to have involved “meaningful human involvement”, even if there was no active review by a human decision-maker. New Article 22D(2) similarly allows the Secretary of State to make regulations to determine whether a decision had a “similarly significant effect” to a legal effect. For example, in summer 2020 there was the A-level algorithm grading scandal. If something like that were to reoccur, under this new power a Minister could lay regulations stating that the decision to use an algorithm in grading A-levels was not a decision with a “similarly significant effect”.
New Article 22D(4) also allows the Secretary of State to add or remove, via regulations, any of the listed safeguards for automated decision-making. If the Government wish to amend or remove safeguards on automated decision-making, that should also be specified in the Bill and not left to delegated legislation. Amendments 53 to 55 and 69 to 72 would limit the Secretary of State’s power, so that they may add safeguards but cannot vary or remove those in the new Article 22D, as they stand, when the legislation comes into force.
If the clause is to be retained, we support Amendment 59A in the name of the noble Lord, Lord Holmes, which requires the Information Commissioner’s Office to develop guidance on the interpretation of the safeguards in new Article 22C and on important terms such as “similarly significant effect” and “meaningful human involvement”. It is within the Information Commissioner’s Office’s duties to issue guidance and to harmonise the interpretation of the law. As the dedicated regulator, the ICO is best placed and equipped to publish guidance and ensure consistency of application.
As a way to increase protections and incorporate more participation from those affected, Amendment 59A would add a new paragraph (7) to new Article 22D, which specifies that the Secretary of State needs to consult with the Information Commissioner’s Office if developing regulations. It also includes an obligation for the Secretary of State to consult with data subjects or their representatives, such as trade union or civil society organisations, at least every two years from the commencement of the Bill.
Our preference is for Clause 14 not to stand part of the Bill. The deployment of automated decision-making under Clause 14 risks automating harm, including discrimination, without adequate safeguards. Clause 14 creates a new starting point for all ADM using personal, but not special category, data. It is allowed, including for profiling, provided that certain safeguards are in place. The Minister said those safeguards are “appropriate” and “robust” and provide “certainty”, but I preferred what the noble Lord, Lord Bassam, said about the clause:
“We need more safeguards. We have moved from one clear position to another, which can be described as watering down or shifting the goalposts”.—[Official Report, 25/3/24; col. GC 150.]
That is very much my feeling about the clause as well.
I refer back to the impact assessment, which we discussed during our debates on Clause 9. It is very interesting that, in table 15 of the impact assessment, the savings on compliance costs are something like £7.3 million as regards AI and machine learning, which does not seem a very big number compared with the total savings on compliance costs, which the Government have put rather optimistically at £295 million.
In passing, I should say that, when I look at the savings regarding subject access requests, I see that the figure is £153 million, which is half of those so-called savings on compliance costs. I do not square that at all with what the Minister says about the total savings on compliance costs for subject access requests being 1%. I do not know quite where those figures come from, but it is a far more significant percentage: it is 50% of what the Government believe that the savings on compliance costs will be. I know that it is not part of this group, but I would be very grateful if the Minister could write to clarify that issue in due course.
Although the Minister has called these adequate, we believe that they are inadequate for three reasons. First, they shift the burden to the individual. Secondly, there is no obligation to provide any safeguards before the decision is made. Neither the Bill nor any of the material associated with it indicates what the content of this information is expected to be, nor the timescales in which that information is to be given. There is nothing to say when representations or contest may be heard, when human intervention may be sought or the level of that intervention. Thirdly, the Secretary of State has delegated powers to vary the safeguards by regulations.
Article 22 is currently one of the strongest prohibitions in the GDPR. As we know, the current starting point is that using solely automated decision-making is prohibited unless certain exemptions apply. The exemptions are limited. Now, as a result of the Government’s changes, you can use solely automated decision-making in an employment context in the UK, which you cannot do in the EU. That is a clear watering down of the restriction. The Minister keeps returning to the safeguards, but I have referred to those. We know that they are not being applied in practice even now and that hiring and firing is taking place without any kind of human review.
There is therefore an entirely inadequate basis on which we can be satisfied that the Bill will safeguard individuals from harmful automated decision-making before it is too late. In fact, the effect of the Bill will be to do the opposite: to permit unfair and unsafe ADM to occur, including discriminatory profiling ADM, which causes harm to individuals. It then places the burden on the individual to complain, without providing for any adequate safeguards to guarantee their ability to do so before the harm is already incurred. While I beg to move Amendment 53, our preference would be that Clause 14 is deleted from the Bill entirely.
My Lords, I will speak to Amendment 57 in my name, Amendment 59 in the name of the noble Baroness, Lady Jones, and the Clause 14 stand part notice from the noble Lord, Lord Clement-Jones. In doing so, I register my support for Amendment 59A in the name of the noble Lord, Lord Holmes.
The Government assert that there is no diminution of rights in the Bill, yet Clause 14 removes the right not to be subject to an automated decision and replaces that right with inadequate safeguards, as the noble Lord, Lord Clement-Jones, said. On the previous day in Committee, the Minister made the argument that:
“These reforms clarify and simplify the rules related to solely automated decision-making without watering down any of the protections for data subjects or the fundamental data protection principles”,—[Official Report, 25/3/24; col. GC 146.]
but I hope he will at least accept that safeguards do not constitute a right. The fact that the Secretary of State has delegated powers to change the safeguards at will undermines his argument that UK citizens have lost nothing at all; they have lost the right not to be subject to an automated decision.
The fact that the Government have left some guard-rails for special category data is in itself an indication that they know they are downgrading UK data rights, because the safeguards in place are not adequate. If they were adequate, it would be unnecessary to separate out SPC data in this way. I hammer the point home by asking the Minister to explain how the protections will work in practice in an era of AI when risks can come from inference and data analytics that do not use special category data but will still have a profound impact on the work lives, health, finances and opportunities of data subjects. If it is the case that data about your neighbourhood, shopping habits, search results, steps or entertainment choices is used to infer an important decision, how would a data subject activate their rights in that case?
As an illustration of this point, the daughter of a colleague of mine, who, as it happens, has a deep expertise in data law, this year undertook a video-based interview for a Russell group university with no human contact. It was not yet an ADM system, but we are inching ever closer to it. Removing the right, as the Government propose, would place the onus on students to complain or intervene—in a non-vexatious manner, of course. Will the Minister set out how UK citizens will be protected from life-changing decisions after government changes to Article 22, particularly as, in conjunction with other changes such as subject access requests and data impact assessments, UK citizens are about to have fewer routes to justice and less transparency of what is happening to their data?
I would also be grateful if the Minister could speak to whether he believes that the granularity and precision of current profiling deployed by AI and machine learning is sufficiently guaranteed to take this fundamental right away. Similarly, I hope that the known concerns about bias and fairness in ADM will be resolved over time, but we are not there yet, so why is it that the Government have a wait-and-see policy on regulation but are not offering the same “wait and see” in relation to data rights?
On Amendment 59 in the name of the noble Baroness, Lady Jones, the number of workers anticipated to be impacted by AI is simply eye-watering. In last Friday’s debate on AI, it was said to be 300 million worldwide, and one in four across Europe. But how workers work with AI is not simply a scary vision of the near future; it is here now.
I have a family member who last year left an otherwise well-paid and socially useful job when they introduced surveillance on to his computer during his working from home. At the time, he said that the way in which it impacted on both his self-esteem and autonomy was so devastating that he felt like
“a cog in a machine or an Amazon worker with no agency or creativity”.
He was an exemplary employee: top of the bonus list and in all measurable ways the right person in the right job. Efficiency in work has a vital role but it is not the whole picture. We know that, if able and skilled workers lose their will to work, it comes at a considerable cost to the well-being of the nation and the public purse. Most jobs in future will involve working with or even collaborating with technology; ensuring that work is dignified and fair to the human components of this arrangement is not a drag on productivity but a necessity if society is to benefit from changes to technology.
My Lords, as is so often the case on these issues, it is daunting to follow the noble Baroness as she has addressed the issues so comprehensively. I speak in support of Amendment 57, to which I have added my name, and register my support for my noble friend Lord Holmes’s Amendment 59A, but I will begin by talking about the Clause 14 stand part notice.
Unfortunately, I was not able to stay for the end of our previous Committee session so I missed the last group on automated decision-making; I apologise if I cover ground that the Committee has already covered. It is important to start by saying clearly that I am in favour of automated decision-making and the benefits that it will bring to society in the round. I see from all the nodding heads that we are all in the same place—interestingly, my Whip is shaking his head. We are trying to make sure that automated decision-making is a force for good and to recognise that anything involving human beings—even automated decision-making does, because human beings create it—has the potential for harm as well. Creating the right guard-rails is really important.
Like the noble Baroness, Lady Kidron, until I understood the Bill a bit better, I mistakenly thought that the Government’s position was not to regulate AI. But that is exactly what we are doing in the Bill, in the sense that we are loosening the regulation of automated decision-making and widening the ability to make use of it. While that may be the right answer, I do not think we have thought about it in enough depth or scrutinised it in enough detail. There are so few of us here; I do not think we quite realise the scale of the impact of this Bill and this clause.
I too feel that the clause should be removed from the Bill—not because it might not ultimately be the right answer but because this is something that society needs to debate fully and comprehensively, rather than it sneaking into a Bill that not enough people, either in this House or the other place, have really scrutinised.
I assume I am going to lose that argument, so I will briefly talk about Amendment 57. Even if the Government remain firm that there is “nothing to see here” in Clause 14, we know that automated decision-making can do irreparable harm to children. Any of us who has worked on child internet safety—most of us have worked on it for at least a decade—regrets that we failed to get in greater protections earlier. We know of the harm done to children because there have not been the right guard-rails in the digital world. We must have debated together for hours and hours why the harms in the algorithms of social media were not expressly set out in the Online Safety Act. This is the same debate.
It is really clear to me that it should not be possible to amend the use of automated decision-making to in any way reduce protections for children. Those protections have been hard fought and ensure a higher bar for children’s data. This is a classic example of where the Bill reduces that, unless we are absolutely explicit. If we are unable to persuade the Government to remove Clause 14, it is essential that the Bill is explicit that the Secretary of State does not have the power to reduce data protection for children.
My Lords, I speak in favour of the clause stand part notice in my name and that of the noble Lord, Lord Clement-Jones.
I apologise and thank the noble Lord for his collegiate approach.
My Lords, I thank all noble Lords who have contributed to this debate. We have had a major common theme, which is that any powers exercised by the Secretary of State in Clause 14 should be to enhance, rather than diminish, the protections for a data subject affected by automated decision-making. We have heard some stark and painful examples of the way in which this can go wrong if it is not properly regulated. As noble Lords have said, this seems to be regulation on automated decision-making by the backdoor, but with none of the protections and promises that have been made on this subject.
Our Amendment 59 goes back to our earlier debate about rights at work when automated decision-making is solely or partly in operation. It provides an essential underpinning of the Secretary of State’s powers. The Minister has argued that ADM is a new development and that it would be wrong to be too explicit about the rules that should apply as it becomes more commonplace, but our amendment cuts through those concerns by putting key principles in the Bill. They are timeless principles that should apply regardless of advances in the adoption of these new technologies. They address the many concerns raised by workers and their representatives, about how they might be disfranchised or exploited by machines, and put human contact at the heart of any new processes being developed. I hope that the Minister sees the sense of this amendment, which will provide considerable reassurance for the many people who fear the impact of ADM in their working lives.
I draw attention to my Amendments 58 and 73, which implement the recommendations of the Delegated Powers and Regulatory Reform Committee. In the Bill, the new Articles 22A to 22D enable the Secretary of State to make further provisions about safeguards when automated decision-making is in place. The current wording of new Article 22D makes it clear that regulations can be amended
“by adding or varying safeguards”.
The Delegated Powers Committee quotes the department saying that
“it does not include a power to remove safeguards provided in new Article 22C and therefore cannot be exercised to weaken the protections”
afforded to data subjects. The committee is not convinced that the department is right about this, and we agree with its analysis. Surely “vary” means that the safeguards can move in either direction—to improve or reduce protection.
The committee also flags up concerns that the Bill’s amendments to Sections 49 and 50 of the Data Protection Act make specific provision about the use of automated decision-making in the context of law enforcement processing. In this new clause, there is an equivalent wording, which is that the regulations may add or vary safeguards. Again, we agree with its concerns about the application of these powers to the Secretary of State. It is not enough to say that these powers are subject to the affirmative procedure because, as we know and have discussed, the limits on effective scrutiny of secondary legislation are manifest.
We have therefore tabled Amendments 58 and 73, which make it much clearer that the safeguards cannot be reduced by the Secretary of State. The noble Lord, Lord Clement-Jones, has a number of amendments with a similar intent, which is to ensure that the Secretary of State can add new safeguards but not remove them. I hope the Minister is able to commit to taking on board the recommendations of the Delegated Powers Committee in this respect.
The noble Baroness, Lady Kidron, once again made the powerful point that the Secretary of State’s powers to amend the Data Protection Act should not be used to reduce the hard-won standards and protections for children’s data. As she says, safeguards do not constitute a right, and having regard to the issues is a poor substitute for putting those rights back into the Bill. So I hope the Minister is able to provide some reassurance that the Bill will be amended to put these hard-won rights back into the Bill, where they belong.
I am sorry that the noble Lord, Lord Holmes, is not here. His amendment raises an important point about the need to build in the views of the Information Commissioner, which is a running theme throughout the Bill. He makes the point that we need to ensure, in addition, that a proper consultation of a range of stakeholders goes into the Secretary of State’s deliberations on safeguards. We agree that full consultation should be the hallmark of the powers that the Secretary of State is seeking, and I hope the Minister can commit to taking those amendments on board.
I echo the specific concerns of the noble Lord, Lord Clement-Jones, about the impact assessment and the supposed savings from changing the rules on subject access requests. This is not specifically an issue for today’s debate but, since it has been raised, I would like to know whether he is right that the savings are estimated to be 50% and not 1%, which the Minister suggested when we last debated this. I hope the Minister can clarify this discrepancy on the record, and I look forward to his response.
I thank the noble Lords, Lord Clement-Jones and Lord Knight, my noble friend Lord Holmes and the noble Baronesses, Lady Jones, Lady Kidron and Lady Bennett—
I apologise to my noble friend. I cannot be having a senior moment already—we have only just started. I look forward to reading that part in Hansard.
I can reassure noble Lords that data subjects still have the right to object to solely automated decision-making. It is not an absolute right in all circumstances, but I note that it never has been. The approach taken in the Bill complements the UK’s AI regulation framework, and the Government are committed to addressing the risks that AI poses to data protection and wider society. Following the publication of the AI regulation White Paper last year, the Government started taking steps to establish a central AI risk function that brings together policymakers and AI experts with the objective of identifying, assessing and preparing for AI risks. To track identified risks, we have established an initial AI risk register, which is owned by the central AI risk function. The AI risk register lists individual risks associated with AI that could impact the UK, spanning national security, defence, the economy and society, and outlines their likelihood and impact. We have also committed to engaging on and publishing the AI risk register in spring this year.
I am processing what the Minister has just said. He said it complements the AI regulation framework, and then he went on to talk about the central risk function, the AI risk register and what the ICO is up to in terms of guidance, but I did not hear that the loosening of safeguards or rights under Clause 14 and Article 22 of the GDPR was heralded in the White Paper or the consultation. Where does that fit with the Government’s AI regulation strategy? There is a disjunct somewhere.
I reject the characterisation of Clause 14 or any part of the Bill as loosening the safeguards. It focuses on outcomes and, by being less prescriptive and more adaptive, aims to heighten the levels of safety of AI, whether through privacy or anything else. That is the purpose.
On Secretary of State powers in relation to ADM, the reforms will enable the Government to further describe what is and is not to be taken as a significant effect on a data subject and what is and is not to be taken as meaningful human—
I may be tired or just not very smart, but I am not really sure that I understand how being less prescriptive and more adaptive can heighten safeguards. Can my noble friend the Minister elaborate a little more and perhaps give us an example of how that can be the case?
Certainly. Being prescriptive and applying one-size-fits-all measures for all processes covered by the Bill encourages organisations to follow a process, but focusing on outcomes encourages organisations to take better ownership of the outcomes and pursue the optimal privacy and safety mechanisms for those organisations. That is guidance that came out very strongly in the Data: A New Direction consultation. Indeed, in the debate on a later group we will discuss the use of senior responsible individuals rather than data protection officers, which is a good example of removing prescriptiveness to enhance adherence to the overall framework and enhance safety.
This seems like a very good moment to ask whether, if the variation is based on outcome and necessity, the Minister agrees that the higher bar of safety for children should be specifically required as an outcome.
I absolutely agree about the outcome of higher safety for children. We will come to debate whether the mechanism for determining or specifying that outcome is writing that down specifically, as suggested.
I am sure the Minister knew I was going to stand up to say that, if it is not part of the regulatory instruction, it will not be part of the outcome. The point of regulation is to determine a floor—never a ceiling—below which people cannot go. Therefore, if we wish to safeguard children, we must have that floor as part of the regulatory instruction.
Indeed. That may well be the case, but that regulatory instruction can be expressed in multiple ways. Let me continue; otherwise, I will run out of time.
I am having a senior moment as well. Where are the outcomes written? What are we measuring this against? I like the idea; it sounds great—management terminology—but I presume that it is written somewhere and that we could easily add children’s rights to the outcomes as the noble Baroness suggests. Where are they listed?
My Lords, I think we should try to let the Minister make a little progress and see whether some of these questions are answered.
I am sorry, but I just do not accept that intervention. This is one of the most important clauses in the whole Bill and we have to spend quite a bit of time teasing it out. The Minister has just electrified us all in what he said about the nature of this clause, what the Government are trying to achieve and how it fits within their strategy, which is even more concerning than previously. I am very sorry, but I really do not believe that this is the right point for the Whip to intervene. I have been in this House for 25 years and have never seen an intervention of that kind.
Let me make the broad point that there is no single list of outcomes for the whole Bill but, as we go through clause by clause, I hope the philosophy behind it, of being less prescriptive about process and more prescriptive about the results of the process that we desire, should emerge—not just on Clause 14 but as the overall philosophy underlying the Bill. Regulation-making powers can also be used to vary the existing safeguards, add additional safeguards and remove additional safeguards added at a later date.
On the point about having regard, it is important that the law is drafted in a way that allows it to adapt as technology advances. Including prescriptive requirements in the legislation reduces this flexibility and undermines the purpose of this clause and these powers, which is to provide additional legal clarity when it is deemed necessary and appropriate in the light of the fast-moving advances in and adoption of technologies relevant to automated decision-making. I would like to reassure noble Lords that the powers can be used only to vary the existing safeguards, add additional ones and remove those later additions. They cannot remove any of the safeguards written into the legislation.
Amendments 53 to 55 and 69 to 71 concern the Secretary of State powers relating to the terms “significant decisions” and “meaningful human involvement”. These powers enable the Secretary of State to provide a description of decisions that do or do not have a significant effect on data subjects, and describe cases that can be taken to have, or not to have, meaningful human involvement. As technology adoption grows and new technologies emerge, these powers will enable the Government to provide legal clarity, if and when deemed necessary, to ensure that people are protected and have access to safeguards when they matter most. In respect of Amendment 59A, Clause 50 already provides for an overarching requirement for the Secretary of State to consult the ICO and other persons the Secretary of State considers appropriate before making regulations under the UK GDPR, including for the measures within Article 22.
Also, as has been observed—I take the point about the limitations of this, but I would like to make the point anyway—any changes to the regulations are subject to the affirmative procedure and so must be approved by both Houses. As with other provisions of the Bill, the ICO will seek to provide organisations with timely guidance and support to assist them in interpreting and applying the legislation. As such, I would ask the noble Lord, Lord Clement-Jones, and my noble friend Lord Holmes—were he here—not to press their amendments.
Amendment 57 in the name of the noble Baroness, Lady Kidron, seeks to ensure that, when exercising regulation-making powers in relation to the safeguards in Article 22 of the UK GDPR, the Secretary of State should uphold the level of protection that children are entitled to in the Data Protection Act 2018. As I have said before, Clause 50 requires the Secretary of State to consult the ICO and other persons he or she considers appropriate. The digital landscape and its technologies evolve rapidly, presenting new challenges in safeguarding children. Regular consultations with the ICO and stakeholders ensure that regulations remain relevant and responsive to emerging risks associated with solely automated decision-making. The ICO has a robust position on the protection of children, as evidenced through its guidance and, in particular, the age-appropriate design code. As such, I ask the noble Baroness not to press her amendment.
Amendments 58, 72 and 73 seek to prevent the Secretary of State varying any of the safeguards mentioned in the reformed clauses. As I assured noble Lords earlier, the powers in this provision can be used only to vary the existing safeguards, add additional safeguards and remove additional safeguards added by regulation in future; there is not a power to remove any of the safeguards.
I apologise for breaking the Minister’s flow, especially as he had moved on a little, but I have a number of questions. Given the time, perhaps he can write to me to answer them specifically. They are all designed to show the difference between what children now have and what they will have under the Bill.
I have to put on the record that I do not accept what the Minister just said—that, without instruction, the ICO can use its old instruction to uphold the current safety for children—if the Government are taking the instruction out of the Bill and leaving it with the old regulator. I ask the Minister to tell the Committee whether it is envisaged that the ICO will have to rewrite the age-appropriate design code to marry it with the new Bill, rather than it being the reason why it is upheld. I do not think the Government can have it both ways where, on the one hand, the ICO is the keeper of the children, and, on the other, they take out things that allow the ICO to be the keeper of the children in this Bill.
I absolutely recognise the seriousness and importance of the points made by the noble Baroness. Of course, I would be happy to write to her and meet her, as I would be for any Member in the Committee, to give—I hope—more satisfactory answers on these important points.
As an initial clarification before I write, it is perhaps worth me saying that the ICO has a responsibility to keep guidance up to date but, because it is an independent regulator, it is not for the Government to prescribe this, only to allow it to do so for flexibility. As I say, I will write and set out that important point in more detail.
Amendment 59 relates to workplace rights. I reiterate that the existing data protection legislation and our proposed reforms—
Has the Minister moved on from our Amendments 58 and 59? He was talking about varying safeguards. I am not quite sure where he is.
It is entirely my fault; when I sit down and stand up again, I lose my place.
We would always take the views of the DPRRC very seriously on that. Clearly, the Bill is being designed without the idea in mind of losing or diminishing any of those safeguards; otherwise, it would have simply said in the Bill that we could do that. I understand the concern that, by varying them, there is a risk that they would be diminished. We will continue to find a way to take into account the concerns that the noble Baroness has set out, along with the DPRRC. In the interim, let me perhaps provide some reassurance that that is, of course, not the intention.
My Lords, I feel less reassured after this debate than I did even at the end of our two groups on Monday. I thank all those who spoke in this debate. There is quite a large number of amendments in this group, but a lot of them go in the same direction. I was very taken by what the noble Baroness, Lady Kidron, said: if the Government are offering safeguards and not rights, that is really extremely worrying. I also very much take on board what the noble Baroness, Lady Harding, had to say. Yes, of course we are in favour of automated decision-making, as it will make a big difference to our public services and quite a lot of private businesses, but we have to create the right ground rules around it. That is what we are talking about. We all very much share the question of children having a higher bar. The noble Baroness, Lady Jones, outlined exactly why the Secretary of State’s powers either should not be there or should not be expressed in the way that they are. I very much hope that the Minister will write on that subject.
More broadly, there are huge issues here. I think that it was the noble Baroness, Lady Kidron, who first raised the fact that the Government seem to be regulating in a specific area relating to AI that is reducing rights. The Minister talks about now regulating outcomes, not process. As the noble Baroness, Lady Jones, said, we do not have any criteria—what KPIs are involved? The process is important—the ethics by which decisions are made and the transparency involved. I cannot see that it is simply about whether the outcome is such and such; it is about the way in which people make decisions. I know that people like talking about outcome-based regulation, but it is certainly not the only important aspect of regulation.
On the issue of removing prescriptiveness, I am in favour of ethical prescriptiveness, so I cannot see that the Minister has made a particularly good case for the changes made under Clause 14. He talked about having access to safeguards when they matter most. It would be far preferable to have rights that can be exercised in the face of automated decision-making, in particular workplace protection. At various points during the debates on the Bill we have touched on things such as algorithmic impact assessment in the workplace and no doubt we will touch on it further. That is of great and growing importance, but again there is no recognition of that.
I am afraid that the Minister has not made a fantastic case for keeping Clause 14 and I think that most of us will want to kick the tyres and carry on interrogating whether it should be part of the Bill. In the meantime, I beg leave to withdraw Amendment 53.
My Lords, the Central Digital and Data Office, or CDDO, and the Centre for Data Ethics and Innovation, as it was then called—it now has a new name as a unit of DSIT—launched the algorithmic transparency recording standard in November 2021. The idea for the ATRS arose from a recommendation by the CDEI that the UK Government should place a mandatory transparency obligation on public sector organisations using algorithms to support “significant decisions affecting individuals”. It is intended to help public sector organisations to provide clear information about the algorithmic tools that they use, how they operate and why they are using them.
The ATRS is a promising initiative that could go some way to addressing the current transparency deficit around the use of algorithmic and AI tools by public authorities. Organisations are encouraged to submit reports about each algorithmic tool that they are using that falls within the scope of the standard.
We welcome the recent commitments made in the Government’s response to the AI regulation White Paper consultation to make the ATRS a requirement for all government departments. However, we believe that this is an opportunity to deliver on this commitment through the DPDI Bill, by placing it on a statutory footing rather than it being limited to a requirement in guidance. That is what Amendment 74 is designed to do.
We also propose another new clause that would reflect the Government’s commitment to algorithmic transparency. It would require the Secretary of State to introduce a compulsory transparency reporting requirement, but only when she or he considers it appropriate to do so. It is a slight watering-down of Amendment 74, but it is designed to tempt the Minister into further indiscretions. In support of transparency, the new clause would, for as long as the Secretary of State considers it inappropriate to make the ATRS compulsory, also require the Secretary of State regularly to explain why and to keep that decision under continual review.
Amendment 76 on safe and responsible automated decision systems proposes a new clause that seeks to shift the burden back on public sector actors. It puts the onus on them to ensure safety and prevent harm, rather than waiting for harm to occur and putting the burden on individuals to challenge it. It imposes a proactive statutory duty, similar to the public sector equality duty under Section 149 of the Equality Act 2010, to have “due regard” to ensuring that
“automated decision systems … are responsible and minimise harm to individuals and society at large”.
The duty incorporates the key principles in the Government’s AI White Paper and therefore is consistent with its substantive approach. It also includes duties to be proportionate, to give effect to individuals’ human rights and freedoms and to safeguard democracy and the rule of law. It applies to all “automated decision systems”. These are
“any tool, model, software, system, process, function, program, method and/or formula designed with or using computation to automate, analyse, aid, augment, and/or replace human decisions that impact the welfare, rights and freedoms of individuals”.
This therefore applies to partly automated decisions, as well as those that are entirely automated, and systems in which multiple automated decision processes take place.
It applies to traditional public sector actors: public authorities, or those exercising public functions, including private actors outsourced by the Government to do so; those that may exercise control over automated decision systems, including regulators; as well as those using data collected or held by a public authority, which may be public or private actors. It then provides one mandatory mechanism through which compliance with the duty must be achieved—impact assessments. We had a small debate about the ATRS and whether a compliance system was in place. It would be useful to see whether the Minister has any further comment on that, but I think that he disagreed with my characterisation that there is no compliance system currently.
This provision proposes impact assessments. The term used, “algorithmic impact assessment”, is adopted from Canada’s analogous directive on automated decision-making, which mandates the use of AIAs for all public sector automated decision systems. The obligation is on the Secretary of State, via regulations, to set out a framework for AIAs, which would help actors to uphold their duty to ensure that automated decision systems are responsible and safe; to understand and to reduce the risks in a proactive and ongoing way; to introduce the appropriate governance, oversight, reporting and auditing requirements; and to communicate in a transparent and accessible way to affected individuals and the wider public.
Amendment 252 would require a list of UK addresses to be made freely available for reuse. Addresses have been identified as a fundamental geospatial dataset by the UN and a high-value dataset by the EU. Address data is used by tens of thousands of UK businesses, including for delivery services and navigation software. Crucially, address data can join together different property-related data, such as energy performance certificates or Land Registry records, without using personal information. This increases the value of other high-value public data.
My Lords, I speak to Amendment 144 in my name, which is supported by the noble Baronesses, Lady Harding and Lady Jones, and the noble Lord, Lord Clement-Jones. The amendment would introduce a code of practice on children and AI. Before I speak to it, I declare an interest: I am working with academic and NGO colleagues in the UK, EU and US on such a code, and I am part of the UN Secretary-General’s AI advisory body’s expert group, which is currently working on sections on both AI and children and AI and education.
AI drives the recommender systems that determine all aspects of a child’s digital experience, including the videos they watch, their learning opportunities, people they follow and products they buy. But it no longer concerns simply the elective parts of life where, arguably, a child—or a parent on their behalf—can choose to avoid certain products and services. AI is invisibly and ubiquitously present in all areas of their lives, and its advances and impact are particularly evident in the education and health sectors—the first of which is compulsory and the second of which is necessary.
The proposed code has three parts. The first requires the ICO to create the code and sets out expectations of its scope. The second considers who and what should be consulted and considered, including experts, children and the frameworks that codify children’s existing rights. The third sets out elements of the process, including risk assessment, defines the language used and puts into the Bill the principles to which the code must adhere.
I am going to get my defence in early. I anticipate that the Minister will say that the ICO has published guidance, that we do not want to exclude children from the benefits of AI and that we are in a time of “wait and see”. He might even ask why children need something different or why the AADC, which I mention so frequently, is not sufficient. Let me take each of those in turn.
On the sufficiency of the current guidance, the ICO’s non-binding Guidance on AI and Data Protection, which was last updated on 15 March 2023, has a single mention of a child in its 140 pages, in a case study about child benefits. The accompanying AI and data protection toolkit makes no mention of children, nor does the ICO’s advice to developers on generative AI, issued on 3 April 2023. There are hundreds of pages of guidance, but it fails entirely to consider the specific needs of children, their rights, their developmental vulnerabilities or the fact that their lives will be entirely dominated by AI systems in a way that is still unimaginable to those in this Room. Similarly, there is little mention of children in the Government’s own White Paper on AI. The only such references are limited to AI-generated child sexual abuse material; we will come to that later when we discuss Amendment 291. Even the AI summit had no main-stage event relating to children.
Of course we do not want to exclude children from the benefits of AI. A code on the use of children’s data in the development and deployment of AI technology increases their prospects of enjoying the benefits of AI while ensuring that they are protected from the pitfalls. Last week’s debate in the name of the noble Lord, Lord Holmes, showed the broad welcome of the benefits while urgently speaking to the need for certain principles and fundamental protections to be mandatory.
As for saying, “We are in a time of ‘wait and see’”, that is not good enough. In the course of this Committee, we will explore edtech that has only advertising and no learning content, children being left out of classrooms because their parents will not accept the data leaks of Google Classroom, social media being scraped to create AI-generated CSAM and how rapid advances in generative AI capabilities mark a new stage in its evolution. Some of the consequences of that include ready access to models that create illegal and abusive material at scale and chatbots that offer illegal or dangerous advice. Long before we get on to the existential threat, we have “here and now” issues. Childhood is a very short period of life. The impacts of AI are here and now in our homes, our classrooms, our universities and our hospitals. We cannot afford to wait and see.
Children are different for three reasons. First, as has been established over decades, there are ages and stages at which children are developmentally able to do certain things, such as walk, talk, understand risk and irony, and learn different social skills. Equally, this means that there are ages and stages at which they cannot yet do those things. The long-established consensus is that family, social groups and society more broadly—including government—step in to support that journey.
Secondly, children have less voice and less choice about how and where they spend their time, so the places and spaces that they inhabit have to be fit for childhood.
Thirdly, we have a responsibility towards children that extends even beyond our responsibilities to each other; this means that it is not okay for us to legitimise profit at their expense, whether it is allowing an unregulated edtech market that exploits their data and teaches them nothing or the untrammelled use of their pictures to create child sexual abuse material.
Finally, what about the AADC? I hope that, in the course of our deliberations, we will put that on a more secure footing. The AADC addresses recommender systems in standard 12. However, the code published in August 2020 does not address generative AI which, as we have repeatedly heard, is a game-changer. Moreover, the AADC is currently restricted to information society services, which leaves a gaping hole. This amendment would address this gap.
There is an argument that the proposed code could be combined with the AADC as an update to its provisions. However, unless and until we sort out the status of the AADC in relation to the Bill, an AI kids code would be better formed as a stand-alone code. A UK code of practice on children and AI would ensure that data processors consider the fundamental rights and freedoms of children, including their safety, as they develop their products and perhaps even give innovators the appetite to innovate with children in mind.
As I pointed out at the beginning, there are many people globally working on this agenda. I hope that as we are the birthplace of the AADC and the Online Safety Act, the Government will adopt this suggestion and again be a forerunner in child privacy and safety. If, however, the Minister once again says that protections for children are not necessary, let me assure him that they will be put in place by others, and we will be a rule taker not a rule maker.
My Lords, I rise with the advantage over the noble Lord, Lord Clement-Jones, in that I will speak to only one amendment in this group; I therefore have the right page in front of me and can note that I will speak to Amendment 252, tabled by the noble Lord, Lord Clement-Jones, and signed by me and the noble Lords, Lord Watson of Wyre Forest and Lord Maude of Horsham.
I apologise that I was not with the Committee earlier today, but I was chairing a meeting about the microbiome, which was curiously relevant to this Committee. One issue that came up in that meeting was data and data management and the great uncertainties that remain. For example, if a part of your microbiome is sampled and the data is put into a database, who owns that data about your microbiome? In fact, there is no legal framework at the moment to cover this. There is a legal framework for your genome, but not your microbiome. That is a useful illustration of how fast this whole area is moving and how fast technology, science and society are changing. I should say that I do not blame the Government for this gaping hole, as it is an international hole. It is a demonstration of how we as legislators and regulators need to race to catch up to deal with the problem.
This relates to Amendment 252 in the sense that perhaps this is an issue that has arisen over time, almost accidentally. However, I want to credit a number of campaigners, among them James O’Malley, who was the man who drew my attention to this issue, as well as Peter Wells, Anna Powell-Smith and Hadley Beeman. They are people who have seen a really simple and basic problem in the way that regulation is working and are reaching out, including, I am sure, to many noble Lords in this Committee. This is a great demonstration of how campaigning has at least gone part of the way to working. I very much hope that, if not today, then some time soon, we can see it work in full.
What we are talking about here, as the noble Lord, Lord Clement-Jones, said, is the postal address file. It is held as a piece of private property by Royal Mail. It is important to stress that this is not people’s private information or who lives at what address; it is about where the address is. As the noble Lord, Lord Clement-Jones, set out, all kinds of companies have to pay Royal Mail to have access to this basic information about society, basic information that is assembled by society, for society.
The noble Lord mentioned Amazon having to pay for the file. I must admit that I feel absolutely no sympathy there. I am no fan of the great parasite. It is an interesting contrast to think of Amazon paying, but also to think of an innovative new start-up company, which wants to be able to access and reach people to deliver things to their homes. For this company, the cost of acquiring this file could be prohibitive. It could stop it getting started and competing against Amazon.
My Lords, I apologise for not being here on Monday, when I wanted to speak about automated decision-making. I was not sure which group to speak on today; I am thankful that my noble friend Lord Harlech intervened to ensure that I spoke on this group and made my choice much easier.
I want to speak on Amendments 74 to 77 because transparency is essential. However, one of the challenges of transparency is to ensure that you understand what you are reading. I will give noble Lords a quick example: when I was in the Department of Health and Social Care, we had a scheme called the voluntary pricing mechanism for medicines. Companies would ask whether that could be changed and whether there could be a different relationship, because they felt that they were not getting enough value from it. I said to the responsible person in the department, “I did engineering and maths, so can you send me a copy of the algorithm?” He sent it to me, and it was 100 pages long. I said, “Does anyone understand this algorithm?”, and he said, “Oh yes, the analysts do”. I was about to get a meeting, but then I was moved to another department. That shows that even if we ask for transparency, we have to make sure that we understand what we are being given. As the noble Lord, Lord Clement-Jones, has worded this, we have to make sure that we understand the functionality and what it does at a high enough level.
My noble friend Lady Harding often illustrates her points well with short stories. I am going to do that briefly with two very short stories. I promise to keep well within the time limit.
A few years ago, I was on my way to fly to Strasbourg because I was a Member of the European Parliament. My train got stuck, and I missed my flight. My staff booked me a new ticket and sent me the boarding pass. I got to the airport, got through the gate, which was fantastic, and was waiting for my flight in the gate area. They called to start boarding and, when I went to board, they scanned my pass again and I was denied boarding. I asked why I was denied boarding, having been let into the gate area in the first place, but no one could explain why. To cut a long story short, over two hours, four or five people from that company gaslighted me. Eventually, when I got back to the check-in desk, which the technology was supposed to avoid in the first place, it was explained that they had sent me an email the day before. In fact, they had not sent me an email the day before, which they admitted the day after, but no one ever explained why I was not allowed on that flight.
Imagine that in the public sector. I can accept it, although it was awful behaviour by that company, but imagine that happening for a critical operation that had been automated to cut down on paperwork. Imagine turning up for your operation when you are supposed to scan your barcode to be let into the operating theatre. What happens if there is no accountability or transparency in that case? This is why the amendments tabled by the noble Lord, Lord Clement-Jones, are essential.
Here is another quick story. A few years ago, someone asked me whether I was going to apply for an account with one of these new fintech banks. I submitted the application and the bank said that it would get back to me within 48 hours. It did not. Two weeks later, I got a message on the app saying that I had been rejected, that I would not be given an account and that “by law, we do not have to explain why”.
Can you imagine that same technology being used in the public sector, with a WYSIWYG on the fantastic NHS app that we have now? Imagine booking an appointment then suddenly getting a message back saying, “Your appointment has been denied but we do not have to explain why”. These Amendments 74 to 78 must be given due consideration by the Government because it is absolutely essential that citizens have full transparency on decisions made through automated decision-making. We should not allow the sort of technology that was used by easyJet and Monzo in this case to permeate the public sector. We need more transparency—it is absolutely essential—which is why I support the amendments in the name of the noble Lord, Lord Clement-Jones.
My Lords, I associate myself with the comments that my noble friend Lord Kamall just made. I have nothing to add on those amendments, as he eloquently set out why they are so important.
In the spirit of transparency, my intervention enables me to point out, were there any doubt, who I am as opposed to the noble Baroness, Lady Bennett, who was not here earlier but who I was mistaken for. Obviously, we are not graced with the presence of my noble friend Lord Maude, but I am sure that we all know what he looks like as well.
I will speak to two amendments. The first is Amendment 144, to which I have added my name. As usual, the noble Baroness, Lady Kidron, has said almost everything that can be said on this but I want to amplify two things. I have yet to meet a politician who does not get excited about the two-letter acronym that is AI. The favoured statement is that it is as big a change in the world as the discovery of electricity or the invention of the wheel. If it is that big—pretty much everyone in the world who has looked at it probably thinks it is—we need properly to think about the pluses and the minuses of the applications of AI for children.
The noble Baroness, Lady Kidron, set out really clearly why children are different. I do not want to repeat that, but children are different and need different protections; this has been established in the physical world for a very long time. With this new technology that is so much bigger than the advent of electricity and the creation of the first automated factories, it is self-evident that we need to set out how to protect children in that world. The question then is: do we need a separate code of practice on children and AI? Or, as the noble Baroness set out, is this an opportunity for my noble friend the Minister to confirm that we should write into this Bill, with clarity, an updated age-appropriate design code that recognises the existence of AI and all that it could bring? I am indifferent on those two options but I feel strongly that, as we have now said on multiple groups, we cannot just rely on the wording in a previous Act, which this Bill aims to update, without recognising that, at the same time, we need to update what an age-appropriate design code looks like in the age of AI.
The second amendment that I speak to is Amendment 252, on the open address file. I will not bore noble Lords with my endless stories about the use of the address file during Covid, but I lived through and experienced the challenges of this. I highlight an important phrase in the amendment. Proposed new subsection (1) says:
“The Secretary of State must regularly publish a list of UK addresses as open data to an approved data standard”.
One reason why it is a problem for this address data to be held by an independent private company is that the quality of the data is not good enough. That is a real problem if you are trying to deliver a national service, whether in the public sector or the private sector. If the data quality is not good enough, it leaves us substantially poorer as a country. This is a fundamental asset for the country and a fundamental building block of our geolocation data, as the noble Lord, Lord Clement-Jones, set out. Anybody who has tried to build a service that delivers things to human beings in the physical world knows that errors in the database can cause huge problems. It might not feel like a huge problem if it concerns your latest Amazon delivery but, if it concerns the urgent dispatch of an ambulance, it is life and death. Maintaining the accuracy of the data and holding it close as a national asset is therefore hugely important, which is why I lend my support to this amendment.
My Lords, the noble Lord, Lord Clement-Jones, has, as ever, ably introduced his Amendments 74, 75, 76, 77 and 78, to the first of which the Labour Benches have added our name. We broadly support all the amendments, but in particular Amendment 74. We also support Amendment 144, which was tabled by the noble Baroness, Lady Kidron, and co-signed by the noble Baroness, Lady Harding, the noble Lord, Lord Clement-Jones, and my noble friend Lady Jones.
Amendments 74 to 78 cover the use of the Government’s Algorithmic Transparency Recording Standard—ATRS. We heard a fair bit about this in Committee on Monday, when the Minister prayed it in aid during debates on Clause 14 and Article 22A. The noble Lord, Lord Clement-Jones, outlined its valuable work, which I think everyone in the Committee wants to encourage and see writ large. These amendments seek to aid the transparency that the Minister referred to by requiring public bodies to publish reports where algorithmic tools have a significant influence on the decision-making process. The amendments also seek to oblige the Secretary of State to ensure that public bodies, government departments and contractors using public data have a compulsory transparency reporting scheme in place. They would legislate to create impact assessments and to root in public services ADM processes that minimise harm and are fair and non-discriminatory in their effect.
The noble Lord, Lord Kamall, made some valuable points about the importance of transparency. His two stories were very telling. It is only right that we have that transparency for the public service and in privately provided services. I think the Minister would be well advised to listen to him.
The noble Lord, Lord Clement-Jones, also alighted on the need for government departments to publish reports under the ATRS in line with their position as set out in the AI regulation White Paper consultation process and response. This would put it on a legislative basis, and I think that is fairly argued. The amendments would in effect create a statutory framework for transparency in the public service use of algorithmic tools.
We see these amendments as forming part of the architecture needed to begin building a place of trust around the increased use of ADM and the introduction of AI into public services. Like the Government and everyone in this Committee, we see all the advantages, but take the view that we need to take the public with us on this journey. If we do not do that, we act at our peril. Transparency, openness and accountability are key to securing trust in what will be something of a revolution in how public services are delivered and procured in the future.
We also support Amendment 144 in the name of the noble Baroness, Lady Kidron, for the very simple reason that, in the development of AI technology, we should hardwire into practice and procedure higher standards for uses of the technology that affect the interests of children, and those higher standards should apply. This has been a constant theme in our Committee deliberations and our approach to child protection. In her earlier speech, the noble Baroness, Lady Harding, passionately argued for the need to get this right. We have been found wanting over the past decade in that regard, and now is the moment to put that right and begin to move on this policy area.
The noble Baroness, Lady Kidron, has made the argument for higher standards of protection for children persuasively during all our deliberations, and a code of practice makes good sense. As the noble Baroness, Lady Harding, said, it can be either stand-alone or integrated. In the end, it matters little, but having it there setting the standard is critical to getting this policy area in the right place. The amendment sets out with admirable clarity the detail that the commissioner must cover, so that data processors always keep children’s interests and fundamental rights at the front of their thinking. I am sure that is something that is broadly supported by the whole Committee.
I feel under amazing pressure to get the names right, especially given the number of hours we spend together.
I thank the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, for tabling Amendments 74 to 78, 144 and 252 in this group. I also extend my thanks to noble Lords who have signed the amendments and spoken so eloquently in this debate.
Amendments 74 to 78 would place a legislative obligation on public authorities and all persons in the exercise of a public function to publish reports under the Algorithmic Transparency Recording Standard—ATRS—or to publish algorithmic impact assessments. These would provide information on algorithmic tools and algorithm-assisted decisions that process personal data in the exercise of a public function or those that have a direct or indirect public effect or directly interact with the general public. I remind noble Lords that the UK’s data protection laws will continue to apply throughout the processing of personal data.
The Government are already taking action to establish the necessary guard-rails for AI, including to promote transparency. In the AI regulation White Paper response, we announced that the use of the ATRS will now become a requirement for all government departments and the broader public sector. The Government are phasing this in as we speak and will check compliance accordingly, as DSIT has been in contact with every department on this issue.
In making this policy, the Government are taking an approach that provides increasing degrees of mandation of the ATRS, with appropriate exemptions, allowing the Government to monitor compliance and effectiveness. The announcement in the White Paper response has already led to more engagement from across government, and more records are under way. The existing process focuses on the importance of continuous improvement and development. Enshrining the standard into law prematurely, amid exponential technological change, could hinder its adaptability.
More broadly, our AI White Paper outlined a proportionate and adaptable framework for regulating AI. As part of that, we expect AI development and use to be fair, transparent and secure. We set out five key principles for UK regulators to interpret and apply within their remits. This approach reflects the fact that AI systems are not unregulated and need to be compliant with existing regulatory frameworks, including employment, human rights, health and safety and data protection law.
For instance, the UK’s data protection legislation imposes obligations on data controllers, including providers and users of AI systems, to process personal data fairly, lawfully and transparently. Our reforms in this Bill will ensure that, where solely automated decision-making is undertaken—that is, ADM without any meaningful human involvement that has significant effects on data subjects—data subjects will have a right to the relevant safeguards. These safeguards include being provided with information on the ADM that has been carried out and the right to contest those decisions and seek human review, enabling controllers to take suitable measures to correct those that have produced wrongful outcomes.
My Lords, I wonder whether the Minister can comment on this; he can write if he needs to. Is he saying that, in effect, the ATRS is giving the citizen greater rights than are ordinarily available under Article 22? Is that the actual outcome? If, for instance, every government department adopted the ATRS, would that, in practice, give citizens a greater degree of what he might call safeguards but is, in this context, describing as rights?
I am very happy to write to the noble Lord, but I do not believe that the existence of an ATRS-generated report in and of itself confers more rights on anybody. Rather, it makes it easier for citizens to understand what data about them is being used by the department concerned and what rights they have in relation to it. The existence of data does not in and of itself confer new rights on anybody.
I understand that, but if he rewinds the reel he will find that he was talking about the citizen’s right of access, or something of that sort, at that point. Once you know what data is being used, the citizen has certain rights. I do not know whether that follows from the ATRS or he was just describing that at large.
As I said, I will write. I do not believe that follows axiomatically from the ATRS’s existence.
On Amendment 144, the Government are sympathetic to the idea that the ICO should respond to new and emerging technologies, including the use of children’s data in the development of AI. I assure noble Lords that this area will continue to be a focus of the ICO’s work and that it already has extensive powers to provide additional guidance or make updates to the age-appropriate design code, to ensure that it reflects new developments, and a responsibility to keep it up to date. The ICO has a public task under Article 57(1)(b) of the UK GDPR to
“promote public awareness and understanding of the risks, rules, safeguards and rights in relation to processing”.
It is already explicit that:
“Activities addressed specifically to children shall receive specific attention”.
That code already includes a chapter on profiling and provides guidance on fairness and transparency requirements around automated decision-making.
Taking the specific point made by the noble Baroness, Lady Kidron, on the contents of the ICO’s guidance, while I cannot speak to the ICO’s decisions about the drafting of its guidance, I am content to undertake to speak to it about this issue. I note that it is important to be careful to avoid a requirement for the ICO to duplicate work. The creation of an additional children’s code focused on AI could risk fragmenting approaches to children’s protections in the existing AADC—a point made by the noble Baroness and by my noble friend Lady Harding.
I have a question on this. If the Minister is arguing that this should be by way of amendment of the age-related code, would there not be an argument for giving that code some statutory effect?
I believe that the AADC already has statutory standing.
On that point, I think that the Minister said—forgive me if I am misquoting him—risks, rules and rights, or some list to that effect. While the intention of what he said was that we have to be careful where children are using it, and the ICO has to make them aware of the risks, the purpose of a code—whether it is part of the AADC or stand-alone—is to put those responsibilities on the designers of services and products and so on by default. It is upstream where we need the action, not downstream, where the children are.
Yes, I entirely agree with that, but I add that we need it upstream and downstream.
For the reasons I have set out, the Government do not believe that it would be appropriate to add these provisions to the Bill at this time without further detailed consultation with the ICO and the other organisations involved in regulating AI in the United Kingdom. Clause 33—
Can we agree that there will be some discussions with the ICO between now and Report? If those take place, I will not bring this point back on Report unnecessarily.
Yes, I am happy to commit to that. As I said, we look forward to talking with the noble Baroness and others who take an interest in this important area.
Clause 33 already includes a measure that would allow the Secretary of State to request the ICO to publish a code on any matter that she sees fit, so this is an issue that we could return to in the future, if the evidence supports it, but, as I said, we consider the amendments unnecessary at this time.
Finally, Amendment 252 would place a legislative obligation on the Secretary of State regularly to publish address data maintained by local authorities under open terms—that is, accessible by anyone for any purpose and for free. High-quality, authoritative address data for the UK is currently used by more than 50,000 public and private sector organisations, which demonstrates that current licensing arrangements are not prohibitive. This data is already accessible for a reasonable fee from local authorities and Royal Mail, with prices starting at 1.68p per address or £95 for national coverage.
Some 50,000 organisations access that information, but do the Government have any data on it? I am not asking for it now, but maybe the Minister could go away and have a look at this. We have heard that other countries have opened up this data; are they seeing an increase in its use? That is just a number; it does not tell us how many people are denied access to the data.
We have some numbers that I will come to, but I am very happy to share deeper analysis of that with all noble Lords.
There is also free access to this data for developers to innovate in the market. The Government also make this data available for free at the point of use to more than 6,000 public sector organisations, as well as postcode, unique identifier and location data available under open terms. The Government explored opening address data in 2016. At that time, it became clear that the Government would have to pay to make this data available openly or to recreate it. That was previously attempted, and the resulting dataset had, I am afraid, critical quality issues. As such, it was determined at that time that the changes would result in significant additional cost to taxpayers and represent low value for money, given the current widespread accessibility of the data. For the reasons I have set out, I hope that the noble Lords will withdraw their amendments.
My Lords, I thank the Minister for his response. There are a number of different elements to this group.
The one bright spot in the White Paper consultation is the ATRS. That was what the initial amendments in this group were designed to give a fair wind to. As the noble Lord, Lord Bassam, said, this is designed to assist in the adoption of the ATRS, and I am grateful for his support on that.
My Lords, I will speak to a number of amendments in this group—Amendments 79, 83, 85, 86, 96, 97, 105 and 107.
Amendment 79 proposes an addition to the amendments to Article 28 of the UK GDPR in Clause 15(4). Article 28 sets out the obligations on processors when processing personal data on behalf of controllers. Currently, paragraph 3(c) requires processors to comply with Article 32 of the UK GDPR, which relates to data security. Amendment 79 adds the requirement for processors also to comply with the privacy-by-design provision in Article 25. Article 25 requires controllers to
“at the time of the processing itself, implement appropriate technical and organisational measures, such as pseudonymisation, which are designed to implement data-protection principles, such as data minimisation, in an effective manner and to integrate the necessary safeguards into the processing in order to meet the requirements of this Regulation and protect the rights of data subjects”.
I am not proposing an abdication of responsibility by the controller when it instructs a processor to act on its behalf but, in practice, it is hard for a controller to meet this responsibility at the time of processing if it has delegated the processing to a third party that is not bound by the same requirement. I am not normally associated with the edtech sector, but the amendment is of particular importance to it, as schools are the controllers while it is children’s data that is being processed.
The amendment ensures that processors would be contractually committed to complying with Article 25. It is particularly relevant to situations where controllers procure AI systems, including facial recognition technology and edtech products. It would be helpful in both the public and private sectors and would address the power asymmetry between controller and processor when the processor is a multinational and solutions are often presented on a take-it-or-leave-it basis.
I hope noble Lords will forgive me if I take Amendment 97 out of turn, as all the others in my name relate to children’s data, whereas Amendment 97, like Amendment 79, applies to all data subjects. Amendment 97 would require public bodies to publish risk assessments to create transparency and accountability. This would also place in statute a provision that is already contained in the ICO’s freedom of information publication scheme guidance. The amendment would also require the Cabinet Office to create and maintain an accessible register of public sector risk assessments to improve accountability.
In the last group, we heard that the way in which public bodies collect and process personal data has far-reaching consequences for all of us. I was moved to lay this amendment after witnessing some egregious examples from the education system. The public have a right to know how bodies such as health authorities, schools, universities, police forces, local authorities and government departments comply with their obligations under UK data law. This amendment is simply about creating trust.
The child-related amendments in this group are in my name and those of the noble Lord, Lord Clement-Jones, and the noble Baronesses, Lady Harding and Lady Jones. Clause 17 sets out the obligations for the newly created role of “senior responsible individual”, which replaces the GDPR requirement to appoint a data protection officer. The two roles are not equivalent: a DPO is an independent adviser to senior management, while a senior responsible individual would be a member of senior management. Amendment 83 would ensure that those appointed senior responsible individuals have an understanding of the heightened risks and the protections to which children are entitled.
Over the years, I have had many conversations with senior executives at major tech companies and, beyond the lines prepared by their public affairs teams, their understanding of children’s protection is often superficial and their grasp of key issues very limited. In fact, if I had a dollar for every time a tech leader, government affairs person or engineer has said, “I never thought of it that way before”, I would be sitting on quite a fortune.
Amendment 83 would simply ensure that a senior leader who is tasked with overseeing compliance with UK data law knows what he or she is talking about when it comes to children’s privacy, and that it informs the decisions they make. It is a modest proposal, and I hope the Minister will find a way to accept it.
Amendments 85 and 86 would require a controller to consider children’s right to higher standards of privacy than adults for their personal data when carrying out its record-keeping duties. Specifically, Amendment 85 sets out what is appropriate when maintaining records of high-risk processing and Amendment 86 relates to processing that is not high risk. Creating an express requirement to include consideration of these rights in a data controller’s processing record-keeping obligation is a simple but effective way of ensuring that systems and processes are designed with the needs and rights of children front of mind.
Clause 20 is one of the many fault lines where the gap is clear between the assurances given that children will be just as safe and the words on the page. I make clear that the amendments to Clause 18 that I put forward are, as the noble Lord, Lord Clement-Jones, said on Monday, belt and braces. They do not reach the standard of protection that children currently enjoy under the risk-assessment provisions in Article 35 of the UK GDPR and the age-appropriate design code.
A comparison of what controllers must include in a data protection impact assessment under Article 35(7) and what they would need to cover in an assessment of high-risk processing under Clause 20(3)(d) shows the inadequacies of the latter. Instead of a controller having to include
“a systematic description of the envisaged processing operations and the purposes of the processing, including, where applicable, the legitimate interest pursued by the controller”,
under the Bill, the controller needs to include only
“a summary of the purposes of the processing”.
They need to include no systematic description—just a summary. There is no obligation to include information about the processing operations or to explain when and how the controller has determined they are entitled to rely on legitimate interest purpose. Instead of
“an assessment of the necessity and proportionality of the processing operations in relation to the purposes”,
under the Bill, a controller needs to assess only necessity, not proportionality. Instead of
“an assessment of the risks to the rights and freedoms of data subjects”,
under the Bill, a controller does not need to consider rights and freedoms.
As an aside, I note that this conflicts with the proposed amendments to Section 64 of the Data Protection Act 2018 in Clause 20(7)(d), which retains the “rights and freedoms” wording but otherwise mirrors the new downgraded requirements in Clause 20(3)(d). I would be grateful for clarification from the Minister on this point.
Instead of requiring the controller to include information about
“the measures envisaged to address the risks, including safeguards, security measures and mechanisms to ensure the protection of personal data and to demonstrate compliance with this Regulation taking into account the rights and legitimate interests of data subjects and other persons concerned”,
as currently prescribed in Article 35, under the Bill, the controller needs to provide only
“a description of how the controller proposes to mitigate those risks”.
The granularity of what is currently required is replaced by a generalised reference to “a description”. These are not the same bar. My argument throughout Committee is that we need to maintain the bar for processing children’s data.
My Lords, I will speak to almost all the amendments in this group, other than those proposed by the noble Baroness, Lady Kidron. I am afraid that this is a huge group; we probably should have split it to have a better debate, but that is history.
I very much support what the noble Baroness said about her amendments, particularly Amendment 79. The mandation of ethics by design is absolutely crucial. There are standards from organisations such as the IEEE for that kind of ethics by design in AI systems. I believe that it is possible to do exactly what she suggested, and we should incorporate that into the Bill. It illustrates that process is as important as outcomes. We are getting to a kind of philosophical approach here, which illustrates the differences between how some of us and the Government are approaching these things. How you do something, the way you design it and the fact that it needs to be ethical is absolutely cardinal in any discussion—particularly about artificial intelligence. I do not think that it is good enough simply to talk about the results of what AI does without examining how it does it.
Having said that, I turn to Amendment 80 and the Clause 16 stand part notice. Under Clause 16, the Government are proposing to remove Article 27 of the UK GDPR without any replacement. By removing the legal requirement on non-UK companies to retain a UK representative, the Government would deprive individuals of a local, accessible point of contact through which people can make data protection rights requests. That decision threatens people’s capacity to exercise their rights, reducing their ability to remain in control of their personal information.
The Government say that removing Article 27 will boost trade with the UK by reducing the compliance burden on non-UK businesses. But they have produced little evidence to support the notion that this will be the case and have overlooked the benefits in operational efficiency and cost savings that the representative can bring to non-UK companies. Even more worryingly, the Government appear to have made no assessment of the impact of the change on UK individuals, in particular vulnerable groups such as children. It is an ill-considered policy decision that would see the UK take a backward step in regulation at a time when numerous other jurisdictions, such as Switzerland, Turkey, South Korea, China and Thailand, are choosing to safeguard the extraterritorial application of their data protection regimes through the implementation of the legal requirement to appoint a representative.
The UK representative ensures that anyone in the UK wishing to make a privacy-related request has a local, accessible point of contact through which to do so. The representative plays a critical role in helping people to access non-UK companies and hold them accountable for the processing of their data. The representative further provides a direct link between the ICO and non-UK companies to enable the ICO to enforce the UK data protection regime against organisations outside the UK.
On the trade issue, the Government argue that by eliminating the cost of retaining a UK representative, non-UK companies will be more inclined to offer goods and services to individuals in the UK. Although there is undeniably a cost to non-UK companies of retaining a representative, the costs are significantly lower than the rather disproportionately inflated figures that were cited in the original impact assessment, which in some cases were up to 10 times the average market rate for representative services. The Government have put forward very little evidence to support the notion that removing Article 27 will boost trade with the UK.
There is an alternative approach. Currently, the Article 27 requirement to appoint a UK representative applies to data controllers and processors. An alternative approach to the removal of Article 27 in its entirety would be to retain the requirement but limit its scope so that it applies only to controllers. Along with the existing exemption at Article 27(2), this would reduce the number of non-UK companies required to appoint a representative, while arguably still preserving a local point of contact through which individuals in the UK can exercise their rights, as it is data controllers that are obliged under Articles 15 to 22 of the UK GDPR to respond to data subject access requests. That is a middle way that the Government could adopt.
Moving to Amendment 82, at present, the roles of senior responsible individual in the Bill and data protection officer under the EU GDPR appear to be incompatible. That is because the SRI is part of the organisation’s senior management, whereas a DPO must be independent of an organisation’s senior management. This puts organisations caught by both the EU GDPR and the UK GDPR in an impossible situation. At the very least, the Government must explain how they consider that these organisations can comply with both regimes in respect of the SRI and DPO provisions.
The idea of getting rid of the DPO runs completely contrary to the way in which we need to think about accountability for AI systems. We need senior management who understand the corporate significance of the AI systems they are adopting within the business. The ideal way forward would be for the DPO to be responsible for that when AI regulation comes in, but the Government seem to be completely oblivious to that. Again, it is highly frustrating for those of us who thought we had a pretty decent data protection regime to find this kind of watering down taking place in the face of the risks from artificial intelligence that are becoming more and more apparent as the days go by. I firmly believe that it will inhibit the application and adoption of AI within businesses if we do not have public trust and business certainty.
I now come to oppose the question that Clause 18, on the duty to keep records, stand part of the Bill. This clause masquerades as an attempt to get rid of red tape. In reality, it makes organisations less likely to be compliant with the main obligations in the UK GDPR, as it will be amended by the Bill, and therefore heightens the risk both to the data subjects whose data they hold and to the organisations in terms of non-compliance. It is particularly unfair on small businesses that do not have the resources to take advice on these matters. Records of processing activities are one of the main ways in which organisations can meet the requirement in Article 5(2) of the UK GDPR to demonstrate their compliance. The obligation to demonstrate compliance remains unaltered under the Bill. Therefore, dispensing with the main way of achieving compliance with Article 5(2) is impractical and unhelpful.
At this point, I should say that we support Amendment 81 in the name of the noble Baroness, Lady Jones, which concerns the assessment of high-risk processing.
Our amendments on data protection impact assessments are Amendments 87, 88 and 89. Such assessments are currently required under Article 35 of the UK GDPR and are essential to ensuring that organisations do not deploy, and individuals are not subjected to, systems that may lead to unlawful, rights-violating or discriminatory outcomes. The Government’s data consultation response noted:
“The majority of respondents agreed that data protection impact assessments requirements are helpful in identifying and mitigating risk, and disagreed with the proposal to remove the requirement to undertake data protection impact assessments”.
However, under Clause 20, the requirement to perform an impact assessment would be seriously diluted. That is all I need to say. The Government frequently pray in aid the consultation—they say, “Well, we did that because of the consultation”—so why are they flying in the face of it? That seems an extraordinary thing to do in circumstances where impact assessments are regarded as a useful tool and business practice has clearly adjusted to them over the years since the Data Protection Act 2018.
My Lords, I rise to speak in support of Amendments 79, 83, 85, 86, 93, 96, 97, 105 and 107, to which I have added my name. An awful lot has already been said. Given the hour of the day, I will try to be brief, but I want to speak both to the child amendments to which I have put my name and to the non-child ones, and to raise things up a level.
The noble Lord, Lord Clement-Jones, talked about trust. I have spent the best part of the past 15 years running digitally enabled consumer and citizen services. The benefit that technology brings to life is clear to me but—this is a really important “but”—our customers and citizens need to trust what we do with their data, so establishing trust is really important.
One of the bedrocks of that trust is forcing—as a non-technologist, I use that word advisedly—technologists to set out what they are trying to do, what the technology they propose to build will do and what the risks and opportunities of that technology are. My experience as a non-engineer is that when you put engineers under pressure, they can speak English, but it is not their preferred language. They do not find it easy to articulate the risks and opportunities of the technology they are building, which is why forcing businesses that build these services to set out in advance the data protection impacts of the services they are building is so important. It is also why you have to design with safety in mind up front, because technology is so hard to retrofit. If you do not design it up front with ethics and safety at its core, the opportunity is gone by the time you see the impact in the real world.
My Lords, I thank all noble Lords who have contributed to this very wide-ranging debate. Our amendments cover a lot of common ground, and we are in broad agreement on most issues, so I hope noble Lords will bear with me if I primarily focus on the amendments that I have tabled, although I will come back to other points.
We have given notice of our intention to oppose Clause 16 standing part of the Bill; this is similar to Amendment 80, tabled by the noble Lord, Lord Clement-Jones, which probes why the Government have found it necessary to remove the requirement that companies outside the UK should appoint a representative within the UK. The current GDPR rules apply to all those active in the UK market, regardless of whether their organisation is based or located in the UK. The intention is that the representative will ensure UK compliance and act as a primary point of contact for data subjects. Without this clause, data subjects will be forced to deal with overseas data handlers, with all the cultural and language barriers that might ensue. There is no doubt that this will limit their ability to assert their rights under UK data protection standards.
In addition, as my colleagues in the Commons identified, the removal of the provisions in Clause 16 was not included in the Government’s consultation, so stakeholders have not had the chance to register some of the many practical concerns that they feel will arise from this change. There is also little evidence that compliance with Article 27 is an unnecessary barrier to responsible data use by reputable overseas companies. Again, this was a point made by the noble Lord, Lord Clement-Jones. In fact, the international trend is for more countries to add a representative obligation to their data protection laws, so we are becoming outliers on the global stage.
Not only is this an unnecessary change but, compared to other countries, it will send a signal that our data protection rights are being eroded in the UK. Of course, this raises the spectre of the EU revisiting whether our UK adequacy status should be retained. It also has implications for the different rules that might apply north and south of the border in Ireland so, again, if we are moving away from the standard rules applied by other countries, this has wider implications that we need to consider.
For many reasons, I challenge the Government to explain why this change was felt to be necessary. The noble Lord, Lord Clement-Jones, talked about whether the cost was really a factor. It did not seem that there were huge costs, compared to the benefits of maintaining the current system, and I would like to know in more detail why the Government are doing this.
Our Amendments 81 and 90 seek to ensure that there is a definition of “high-risk processing” in the Bill. The changes in Clauses 17 and 20 water down data controllers’ responsibilities, replacing the duty to carry out data protection impact assessments with a duty to assess high-risk processing on the basis of whether it is necessary and what risks it poses. But nowhere does the Bill say what constitutes high-risk processing—that judgment is left to individual organisations—and nowhere does it explain what “necessary” means in this context. Is the processing also expected to be proportionate, as under the existing standards? This lack of clarity has caused some consternation among stakeholders.
The Equality and Human Rights Commission argues that the proposed wording means that
“data controllers are unlikely to go beyond minimum requirements”,
so the wording needs to be more explicit. It also recommends that
“the ICO be required to provide detailed guidance on how ‘the rights and freedoms of individuals’ are to be considered in an Assessment of High Risk Processing”.
More crucially, the ICO has written to Peers, saying that the Bill should contain a list of
“activities that government and Parliament view as high-risk processing, similar to the current list set out at Article 35(3) of the UK GDPR”.
This is what our Amendments 81 and 90 aim to achieve. I hope the Minister can agree to take these points on board and come back with amendments to achieve this.
The ICO also makes the case for future-proofing the way in which high-risk processing is regulated by making provision in the Bill for the ICO to designate further high-risk processing activities, subject to parliamentary approval. This would go further than the current drafting of Clause 20, which contains powers for the ICO to give examples of high-risk processing, but only as guidance. Again, I hope that the Minister can agree to take these points on board and come back with suitable amendments.
Our Amendments 99, 100 and 102 specify the need for wider factors in the proposed risk assessment list to ensure that it underpins our equality laws. Again, this was an issue about which stakeholders have raised concerns. The TUC and the Institute for the Future of Work make the point that data protection impact assessments are a crucial basis for consultation with workers and trade unions about the use of technology at work, and this is even more important as the complexities of AI come on stream. The Public Law Project argues that, without rigorous risk and impact analysis, disproportionate and discriminatory processes could be carried out before the harm comes to light.
The Equality and Human Rights Commission argues that data protection impact assessments
“provide a key mechanism for ensuring equality impacts are assessed when public and private sector organisations embed AI systems in their operations”.
It specifically recommends that express references in Article 35(7) of GDPR to “legitimate interests” and
“the rights and freedoms of data subjects”,
as well as the consultation obligations in Article 35(2), should be retained. I hope that the Minister can agree to take these recommendations on board and come back with suitable amendments to ensure that our equalities legislation is protected.
Our Amendments 106 and 108 focus on the particular responsibilities of data controllers to handle health data with specific obligations. This is an issue that we know, from previous debates, is a major cause for concern among the general public, who would be alarmed if they thought that the protections were being weakened.
The BMA has raised concerns that Clauses 20 and 21 will water down our high standards of data governance, which are necessary when organisations are handling health data. As it says,
“Removing the requirement to conduct a thorough assessment of risks posed to health data is likely to lead to a less diligent approach to data protection for individuals”.
It also argues that removing the requirement for organisations to consult the ICO on high-risk processing is,
“a backward step from good governance … when organisations are processing large quantities of sensitive health data”.
Our amendments aim to address these concerns by specifying that, in specific cases such as the handling of health data, prior consultation with the ICO should remain mandatory. I hope that the Minister will see the sense in these amendments and recognise that further action is needed in this Bill to maintain public trust in how health data is managed for individual care and system-wide scientific development.
I realise that we have covered a vast range of issues, but I want to touch briefly on those raised by the noble Baroness, Lady Kidron. She is right, in particular, that risk assessments applied by public bodies should be maintained, and we agree with her that Article 35’s privacy-by-design requirements should be retained. She once again highlighted the downgrading of children’s rights in this Bill, whether by accident or intent, and we look forward to seeing the exchange of letters with the Minister on this. I hope that we will all be copied in and that the Minister will take on board the widespread view that we should have more engagement on this before Report, because there are so many outstanding issues to be resolved. I look forward to the Minister’s response.
I thank the noble Baronesses, Lady Kidron and Lady Jones, and the noble Lord, Lord Clement-Jones, for their amendments, and I look forward to receiving the letter from the noble Baroness, Lady Kidron, which I will respond to as quickly as I can. As everybody observed, this is a huge group, and it has been very difficult for everybody to do justice to all the points. I shall do my best, but these are points that go to the heart of the changes we are making. I am very happy to continue engaging on that basis, because we need plenty of time to review them—but, that said, off we go.
The changes the Government are making to the accountability obligations are intended to make the law clearer and less prescriptive. They will enable organisations to focus on areas that pose high risks to people, resulting, the Government believe, in improved outcomes. The new provisions on assessments of high-risk processing are less prescriptive about the precise circumstances in which a risk assessment would be required, as we think organisations are best placed to judge whether a particular activity poses a high risk to individuals in the context of the situation.
However, the Government are still committed to high standards of data protection, and there are many similarities between our new risk assessment measures and the previous provisions. When an organisation is carrying out processing activities that are likely to pose a high risk to individuals, it will still be expected to document that processing, assess risks and identify mitigations. As before, no such document would be required where organisations are carrying out low-risk processing activities.
One of the main aims of the Bill is to remove some of the UK GDPR’s unnecessary compliance burdens. That is why organisations will be required to designate senior responsible individuals, keep records of processing and carry out the risk assessments above only when their activities pose high risks to individuals.
The noble Viscount is very interestingly unpacking a risk-based approach to data protection under the Bill. Why are the Government not taking a risk-based approach to their AI regulation? After all, the AI Act approaches it in exactly that way.
That is a very interesting question, but I am not sure that there is a read-across between the AI Act and our approach here. The fundamental starting point was that, although the provisions of the original GDPR are extremely important, the burdens of compliance were not proportionate to the results. The overall foundation of the DPDI is, while at least maintaining existing levels of protection, to reduce the burdens of demonstrating or complying with that regulation. That is the thrust of it—that is what we are trying to achieve—but noble Lords will have different views about how successful we are being at either of those. It is an attempt to make it easier to be safe and to comply with the regulations of the DPDI and the other Acts that govern data protection. That is where we are coming from and the thrust of what we are trying to achieve.
I note that, as we have previously discussed, children need particular protection when organisations are collecting and processing their personal data.
I did not interrupt before because I thought that the Minister would say more about the difference between high-risk and low-risk processing, but he is going on to talk about children. One of my points was about the request from the Information Commissioner—it is very unusual for him to intervene. He said that a list of high-risk processing activities should be set out in the Bill. I do not know whether the Minister was going to address that important point.
I will briefly address it now. Based on that letter, the Government’s view is to avoid prescription, and I believe that the ICO’s view—I cannot speak for it—is generally the same, except for a few examples where prescription needs to be specified in the Bill. I will continue to engage with the ICO on where exactly to draw that line.
My Lords, I can see that there is a difference of opinion, but it is unusual for a regulator to go into print with it. Not only that, but he has set it all out in an annexe. What discussion is taking place directly between the Minister and his team and the ICO? There seems to be quite a gulf between them. This is number 1 among his “areas of ongoing concern”.
I do not know whether it is usual or unusual for the regulator to engage in this way, but the Bill team engages with the Information Commissioner frequently and regularly, and, needless to say, it will continue to do so on this and other matters.
Children need particular protection when organisations are collecting and processing their personal data, because they may be less aware of the risks involved. If organisations process children’s personal data, they should think about the need to protect them from the outset and design their systems and processes with this in mind.
Before I turn to the substance of what the Bill does with the provisions on high-risk processing, I will deal with the first amendment in this group: Amendment 79. It would require data processors to consider data protection-by-design requirements in the same way that data controllers do, because there is a concern that controllers may not always be able to foresee what processors do with people’s data for services such as AI and cloud computing.
However, under the current legislation, it should not be for the processor to determine the nature or purposes of the processing activity, as it will enter a binding controller-processor agreement or contract to deliver a specific task. Processors also have specific duties under the UK GDPR to keep personal data safe and secure, which should mean that this amendment is not necessary.
I turn to the Clause 16 stand part notice, which seeks to remove Clause 16 from the Bill and reinstate Article 27, and Amendment 80, which seeks to do the same but just in respect of overseas data controllers, not processors. I assure the noble Lord, Lord Clement-Jones, that, even without the Article 27 representative requirement, controllers and processors will still have to maintain contact and co-operation with UK data subjects and the ICO to comply with the UK GDPR provisions. These include Articles 12 to 14, which, taken together, require controllers to provide their contact details in a concise, transparent, intelligible and easily accessible form, using clear and plain language, particularly for any information addressed specifically to a child.
By offering firms a choice over whether to appoint a representative in the UK to help them with UK GDPR compliance, and by no longer mandating organisations to appoint one, we are allowing organisations to decide for themselves the best way to comply with the existing requirements for effective communication and co-operation. Removing the representative requirement will also reduce unnecessary burdens on non-UK controllers and processors while maintaining data subjects’ safeguards and rights. Any costs associated with appointing a representative are a burden on and a barrier to trade. Although the packages made available by representative provider organisations vary, our assessments show that the cost of appointing representatives increases with the size of a firm. Furthermore, several jurisdictions do not have a mandatory or equivalent representative requirement in their data protection law, including other countries in receipt of EU data adequacy decisions.
Nevertheless, does the Minister accept that quite a lot of countries have now begun the process of requiring representatives to be appointed? How does he account for that? Does he accept that what the Government are doing is placing the interests of business over those of data subjects in this context?
No, I do not accept that at all. I would suggest that we are saying to businesses, “You must provide access to the ICO and data subjects in a way that is usable by all parties, but you must do so in the manner that makes the most sense to you”. That is a good example of going after outcomes but not insisting on any particular process or methodology in a one-size-fits-all way.
The Minister mentioned the freedom to choose the best solution. Would it be possible for someone to be told that their contact was someone who spoke a different language to them? Do they have to be able to communicate properly with the data subjects in this country?
Yes—if the person they were supposed to communicate with did not speak English or was not available during reasonable hours, that would be in violation of the requirement.
I apologise if we briefly revisit some of our earlier discussion here, but Amendment 81 would reintroduce a list of high-risk processing activities drawn from Article 35 of the UK GDPR, with a view to helping data controllers comply with the new requirements around designating a senior responsible individual.
The Government have consulted closely with the ICO throughout the development of all the provisions in the Bill, and we welcome its feedback as it upholds data subjects’ rights. We recognise and respect that the ICO’s view on this issue is different to the Government’s, but the Government feel that adding a prescriptive list to the legislation would not be appropriate for the reasons we have discussed. However, as I say, we will continue to engage with it over the course of the passage of the Bill.
Some of the language in Article 35 of the UK GDPR is unclear and confusing, which is partly why we removed it in the first place. We believe organisations should have the ability to make a judgment of risk based on the specific nature, scale and context of their own processing activities. We do not need to provide prescriptive examples of high-risk processing on the face of legislation because any list could quickly become out of date. Instead, to help data controllers, Clause 20 requires the ICO to produce a document with examples of what the commissioner considers to be high-risk processing activities.
I turn to Clause 17 and Amendment 82. The changes we are making in the Bill will reduce prescription by removing the requirement to appoint a data protection officer in certain circumstances. Instead, public bodies and other organisations carrying out high-risk processing activities will have to designate a senior responsible individual to ensure that data protection risks are managed effectively within their organisations. That person will have flexibility about how they manage data protection risks. They might decide to delegate tasks to independent data protection experts or upskill existing staff members, but they will not be forced to appoint data protection officers if suitable alternatives are available.
The primary rationale for moving to a senior responsible individual model is to embed data protection at the heart of an organisation by ensuring that someone in senior management takes responsibility and accountability for it if the organisation is a public body or is carrying out high-risk processing. If organisations have already appointed data protection officers and want to keep an independent expert to advise them, they will be free to do so, providing that they also designate a senior manager to take overall accountability and provide sufficient support, including resources.
Amendment 83, tabled by the noble Baroness, Lady Kidron, would require the senior responsible individual to specifically consider the risks to children when advising the controller on its responsibilities. As drafted, Clause 17 of the Bill requires the senior responsible individual to perform a number of tasks or, if they cannot do so themselves, to make sure that they are performed by another person. They include monitoring the controller’s compliance with the legislation, advising the controller of its obligations and organising relevant training for employees who carry out the processing of personal data. Where the organisation is processing children’s data, all these requirements will be relevant. The senior responsible individual will need to make sure that any guidance and training reflects the type of data being processed and any specific obligations the controller has in respect of that data. I hope that this goes some way to convincing the noble Baroness not to press her amendment.
The Minister has not really explained the reason for the switch from the DPO to the new system. Is it another one of his “We don’t want a one-size-fits-all approach” arguments? What is the underlying rationale for it? Looking at compliance costs, which the Government seem to be very keen on, we will potentially have a whole new cadre of people who will need to be trained in compliance requirements.
The data protection officer—I speak as a recovering data protection officer—is tasked with certain specific outcomes but does not necessarily have to be a senior person within the organisation. Indeed, in many cases, they can be an external adviser to the organisation. On the other hand, the senior responsible individual is a senior or board-level representative within the organisation and can take overall accountability for data privacy and data protection for that organisation. Once that accountable person is appointed, he or she can of course appoint a DPO or equivalent role or separate the role among other people as they see fit. That gives everybody the flexibility to meet the needs of privacy as they see fit, but not necessarily in a one-size-fits-all way. That is the philosophical approach.
Does the Minister accept that the SRI will have to cope with having at least a glimmering of an understanding of what will be a rather large Act?
Yes, the SRI will absolutely have to understand all the organisation’s obligations under this Act and indeed other Acts. As with any senior person in any organisation responsible for compliance, they will need to understand the laws that they are complying with.
Amendment 84, tabled by the noble Lord, Lord Clement-Jones, is about the advice given to senior responsible individuals by the ICO. We believe that the commissioner should have full discretion to enforce data protection in an independent, flexible, risk-based and proportionate manner. The amendment would tie the regulator’s hands and force it to give binding advice and proactive assurance without full knowledge of the facts, undermining its regulatory enforcement role.
The Minister has reached his 20 minutes. We nudged him at 15 minutes.
My Lords, just for clarification, because a number of questions were raised, if the Committee feels that it would like to hear more from the Minister, it can. It is for the mood of the Committee to decide.
As long as that applies to us on occasion as well.
I apologise for going over. I will try to be as quick as possible.
I turn now to the amendments on the new provisions on assessments of high-risk processing in Clause 20. Amendments 87, 88, 89, 91, 92, 93, 94, 95, 97, 98 and 101 seek to reinstate the requirements of Article 35 of the UK GDPR on data protection impact assessments and, in some areas, make them even more onerous for public authorities. Amendment 90 seeks to reintroduce a list of high-risk processing activities drawn from Article 35, with a view to helping data controllers comply with the new requirements on carrying out assessments of high-risk processing.
Amendment 96, tabled by the noble Baroness, Lady Kidron, seeks to amend Clause 20, so that, where an internet service is likely to be accessed by children, the processing is automatically classed as high risk and the controller must do a children’s data protection impact assessment. Of course, I fully understand why the noble Baroness would like those measures to apply automatically to organisations processing children’s data, and particularly to internet services likely to be accessed by children. It is highly likely that many of the internet services that she is most concerned about will be undertaking high-risk activities, and they would therefore need to undertake a risk assessment.
Under the current provisions in Clause 20, organisations will still have to undertake risk assessments where their processing activities are likely to pose high risks to individuals, but they should have the ability to assess the level of risk based on the specific nature, scale and context of their own processing activities. Data controllers do not need to be directed by government or Parliament about every processing activity that will likely require a risk assessment, but the amendments would reintroduce a level of prescriptiveness that we were seeking to remove.
Clause 20 requires the ICO to publish a list of examples of the types of processing activities that it considers would pose high risks for the purposes of these provisions, which will help controllers to determine whether a risk assessment is needed. This will provide organisations with more contemporary and practical help than a fixed list of examples in primary legislation could. We fully expect the vulnerability and age of data subjects to be a feature of that document. The commissioner’s current guidance on data protection impact assessments already describes the use of the personal data of children or other vulnerable individuals for marketing purposes, profiling or offering internet services directly to children as examples of high-risk processing, although the Government cannot of course tell the ICO what to include in its new guidance.
Similarly, in relation to Amendments 99, 100 and 102 from the noble Baroness, Lady Jones, it should not be necessary for this clause to specifically require organisations to consider risks associated with automated decision-making or obligations under equalities legislation. That is because the existing clause already requires controllers to consider any risks to individuals and to describe
“how the controller proposes to mitigate those risks”.
I am being asked to wrap up and so, in the interests of time, I shall write with my remaining comments. I have no doubt that noble Lords are sick of the sound of my voice by now.
My Lords, I hope that no noble Lord expects me to pull all that together. However, I will mention a couple of things.
With this group, the Minister has finally set out all the reasons why everything will be different and less. Those responsible for writing the Minister’s speeches should be more transparent about the Government’s intention, because “organisations are best placed to determine what is high-risk”—not the ICO, not Parliament, not existing data law. Organisations also act in their own interests. They are “best placed to decide on their representation”, whether it is here or abroad and whether it speaks English or not, and they “get to decide whether they have a DPO or a senior responsible individual”. Those are three quotations from the Minister’s speech. He also said that organisations are responsible for the quality of their risk assessments—a fourth instance in this group alone. If organisations set the bar for data protection and define what data protection means, I do believe that this is a weakening of the data protection regime.
At the beginning, the noble Baroness, Lady Harding, talked about the trust of consumers and citizens. I do not think that this engenders trust. The architecture is so keen to get rid of ways of accessing rights that some organisations may end up having to have both a DPO and a DPIA—a doubling rather than a reduction of the burden. Very early on—it feels a long time ago—a number of noble Lords talked about the granular detail. I tried in my own contribution to show how very different it is in detail. So I ask the Minister to reflect on the assertion that you can take out the detail and have the same outcome. All the burden being removed sits on one side of the equation, just as we enter a world in which AI, which is built on people’s data, is coming in the other direction.
I will of course withdraw my amendment, but I believe that Clauses 18 and 20 and the other clauses we have just discussed are deregulation measures. That should be made clear from the Dispatch Box, and that is a choice that the House will have to make.
Before I sit down, I do want to recognise one thing, which is that the Minister said that he would work alongside us between now and Report; I thank him for that, and I accept that. I also noted that he said that it was a responsibility to take care of children by default. I agree with him; I would like to see that in the Bill. I beg leave to withdraw my amendment.
My Lords, just in passing, I will say that I am beginning to feel that the decision made by the Privileges Committee, and now by the House, is creaking under the very first Grand Committee it has encountered. On time limits, I think flexibility in Grand Committee in particular is absolutely crucial. I am afraid that the current procedures will not necessarily stand the test of time—but we shall see.
This is a relatively short debate on whether Clause 19 should stand part, but it is a really significant clause, and it is another non-trust-engendering provision. It basically takes away the duty on the police to record a justification of why they are consulting or disclosing personal data. Prompted by the National AIDS Trust, we believe that the Bill must retain the duty on police forces to justify why they have accessed an individual’s personal data.
This clause removes an important check on police processing of an individual’s personal data. The NAT has been involved in cases of people living with HIV whose HIV status was shared without their consent by police officers, both internally within their police station and within the wider communities that they serve. Therefore, ensuring that police officers justify why they have accessed an individual’s personal data is vital evidence in cases of police misconduct. Such cases include when a person’s HIV status is shared inappropriately by the police, or when it is not relevant to an investigation of criminal activity.
The noble Baroness, Lady Kidron, was extremely eloquent in her winding up of the last group. The Minister really needs to come back and tell us what on earth the motivation is behind this particular Clause 19. I beg to move that this clause should not stand part of the Bill.
As the noble Lord, Lord Clement-Jones, explained, his intention to oppose the question that Clause 19 stand part seeks to retain the status quo. As I read Section 62 of the Data Protection Act 2018, it obliges competent authorities to keep logs of their processing activities, whether for the collection, alteration, consultation, disclosure, combination or erasure of personal data. The primary purpose is self-monitoring, largely linked to disciplinary proceedings, as the noble Lord said, where an officer has become a suspect by virtue of inappropriately accessing PNC-held data.
Clause 19 removes the requirement for a competent authority to record a justification in its logs of the consultation and disclosure of personal data. The Explanatory Notes to the Bill explain this change as follows:
“It is … technologically challenging for systems to automatically record the justification without manual input”.
That is not a sufficiently strong reason for removing the requirement, not least because the remaining requirements of Section 62 of the Data Protection Act 2018 relating to logs of consultation and disclosure activity will be retained, including the need to record the date and time of access and the identity of the person accessing the data. Presumably those can still be manually input, so why remove the one piece of data that might be useful as evidence—potentially self-incriminating—in an investigation into abuse or misuse of the system? I do not understand the logic behind that at all.
I rather think the noble Lord, Lord Clement-Jones, has an important point. He has linked it to people living with HIV, and I am sure that there are other victims whose cases could be brought forward. I am not convinced that the clause should stand part, and we support the noble Lord in seeking its deletion.
This is a mercifully short group on this occasion. I thank the noble Lord, Lord Clement-Jones, for the amendment, which seeks to remove Clause 19 from the Bill. Section 62 of the Data Protection Act requires law enforcement agencies to record when personal data has been accessed and why. Clause 19 does not remove the need for police to justify their processing; it simply removes the ineffective administrative requirement to record that justification in a log.
The justification entry was intended to help to monitor and detect unlawful access. However, the reality is that anyone accessing data unlawfully is very unlikely to record an honest justification, making this in practice an unreliable means of monitoring misconduct or unlawful processing. Records of when data was accessed and by whom can be automatically captured and will remain, thereby continuing to ensure accountability.
In addition, the National Police Chiefs’ Council’s view is that this change will not hamper any investigations to identify the unlawful processing of data. That is because it is unlikely that an individual accessing data unlawfully would enter an honest justification, so capturing this information is unlikely to be useful in any investigation into misconduct. The requirements to record the time, date and, as far as possible, the identity of the person accessing the data will remain, as will the obligation that there is lawful reason for the access, ensuring that accountability and protection for data subjects is maintained.
Police officers inform us that the current requirement places an unnecessary burden on them as they have to update the log manually. The Government estimate that the clause could save approximately 1.5 million policing hours, representing a saving in the region of £46.5 million per year.
I understand that the amendment relates to representations made by the National AIDS Trust concerning the level of protection for people’s HIV status. As I believe I said on Monday, the Government agree that the protection of people’s HIV status is vital. We have met the National AIDS Trust to discuss the best solutions to the problems it has raised. For these reasons, I hope the noble Lord will not oppose Clause 19 standing part.
I thank the Minister for his response, but he has left us tantalised about the outcome of his meeting. What is the solution that he has suggested? We are none the wiser as a result of his response.
This pudding has been well over-egged by the National Police Chiefs’ Council. Already, only certain senior officers and the data protection leads in police forces have access to this functionality. There will continue to be a legal requirement to record the time and date of access. They are required to follow a College of Policing code of practice. Is the Minister really saying that recording a justification for accessing personal data is such an onerous requirement that £46.5 million in police time will be saved as a result of this? Over what period? That sounds completely disproportionate.
The fact is that recording the justification, even where it is false and cannot be relied on as evidence in itself, is rather useful, because it can be evidence of police misconduct in relation to inappropriate access to personal data. Officers are actually saying, “We did it for this purpose”, when it clearly was not. I am not at all surprised that the National AIDS Trust is worried about this. The College of Policing code of practice does not mention logging requirements in detail; it references them just once, in relation to automated systems that process data.
I am extremely grateful to the noble Lord, Lord Bassam, for what he had to say. It seems to me that we on this side of the House have no confidence that removing this requirement provides enough assurance that officers will be held to account if they share an individual’s special category data inappropriately. I do not think the Minister has really answered the concerns, but I beg leave to withdraw my objection to the clause standing part.
My Lords, I am somewhat disappointed to be speaking to these amendments in the dying hours of our Committee before we take a break, because many noble Lords—indeed, many people outside the House—have contacted me about them. I particularly want to record the regret of the noble Lord, Lord Black, who is a signatory to these amendments, that he is unable to be with us today.
The battle between rights-holders and the tech sector is nothing new. Many noble Lords will remember the arrival and demise of the file-sharing platform Napster and the subsequent settlement between the sector and the giants of the creative industries. Napster argued that it was merely providing a platform for users to share files and was not responsible for the actions of its users; the courts sided with the music industry, and Napster was ordered to shut down its operations in 2001. The “mere conduit” argument was debunked two decades ago. To the frustration of many of us, the lawsuits led to a perverse outcome: violent bullying or sexually explicit content would be left up for days, weeks or for ever, while a birthday video with the temerity to have music in the background would be deleted almost immediately.
The emergence of large language models—LLMs—and the desire on the part of LLM developers to scrape the open web to capture as much text, data and images as possible raise some of the same issues. The scale of scraping is, by their own admission, unprecedented, and their hunger for data at any cost in an arms race for AI dominance is publicly acknowledged, setting up a tension between the companies that want the data on the one hand and data subjects and creative rights holders on the other. A data controller who publishes personal data as part of a news story, for example, may do so on the basis of an exemption under data protection law for journalism, only for that data to be scraped and commingled with other data scraped from the open web to train an LLM.
This raises issues of copyright infringement; more importantly—whether for individuals, creative communities or businesses that depend on the value of what they produce—these scraping activities happen invisibly. Anonymous bots acting on behalf of AI developers, or conducting a scrape as a potential supplier to AI developers, are scraping websites without notifying data controllers or data subjects. In doing so, they are also silent on whether processes are in place to minimise risks or balance competing interests, as required by current data law.
Amendment 103 would address those risks by requiring documentation and transparency. Proposed new paragraph (e) would require an AI developer to document how the data controller will enforce purpose limitation. This is essential, given that invisible data processing enabled through web scraping can pick up material that is published for a legitimate purpose, such as journalism, but the combination of such information with other data accessed through invisible processing could change the purpose and application of that data in ways to which the individual may wish to object, using their existing data rights. Proposed new paragraph (f) would require a data processor seeking to use legitimate interest as the basis for web scraping and invisible processing to build LLMs to document evidence of how they have ensured that individual information rights have been enabled at the point of collection and after processing.
Together, those proposed new paragraphs would mean that anyone who scrapes web data must be able to show that the data subjects have meaningful control and can access their information rights ahead of processing. These would be mandatory, unless they have incorporated an easily accessible machine-readable protocol on an opt-in basis, which is then the subject of Amendment 104.
Amendment 104 would require web scrapers to establish an easily accessible machine-readable protocol that works on an opt-in basis rather than the current opt-out. Undoubtedly, the words “easily”, “accessible”, “machine readable” and “web protocols” would all benefit from guidance from the ICO but, for the absence of doubt, the intention of the amendment is that a web scraper would proactively notify individuals and website owners that scraping of their data will take place, including stating the identity of the data processor and the purpose for which that data is to be scraped. In addition, the data processor will provide information on how data subjects and data controllers can exercise their information rights to opt out of their data being scraped before any such scraping takes place, with an option to object after the event if taken without permission.
We are in a situation in which not only is IP being taken at scale—potentially impoverishing our very valuable creative industries, journalism and academic work, which is then regurgitated inaccurately—but this practice is making a mockery of individual data rights. In its recent consultation on the lawful basis for web scraping, the ICO determined that the use of web-scraped data
“can be feasible if generative AI developers take their legal obligations seriously and can evidence and demonstrate this in practice”.
These amendments would operationalise that demonstration. As it stands, there is routine failure, particularly regarding new models. For example, the ICO’s preliminary enforcement notice against Snap found that its risk assessment for its AI tool was inadequate.
Noble Lords will appreciate the significance of the connection that the ICO draws between innovative technology and children’s personal data, given the heightened data rights and protections that children are afforded under the age-appropriate design code. While I welcome the ICO’s action, copyright holders have been left to fend for themselves, since government talks have failed, and individual data subjects are left exposed. Whether it is the scraping of social media or of work and school websites, such cases will not be pursued by the ICO, because regulating in such small increments is disproportionate, yet this lack of compliance is happening at scale.
My Lords, given the hour, I will be brief. That was an absolute tour de force by the noble Baroness and, as with all the Minister’s speeches, I will read hers over Easter.
I was very interested to be reminded of the history of Napster, because that was when many of us realised that, in the creative industries and beyond, we were entering the digital age. The amendments that the noble Baroness has put forward are examples of where the Bill could make a positive impact, unlike so much of the rest of it, which waters down rights. She described cogently how large language models are ingesting or scraping data from the internet, social media and journalism, how close this whole agenda comes to the ingestion of copyright material, and how it is being done by anonymous bots in particular. It fits very well with the debate in which the Minister was involved last Friday on the Private Member’s Bill of the noble Lord, Lord Holmes, which includes a clause requiring transparency about the ingestion or scraping of data and copyright material by large language models.
The opportunity in the data area is currently much greater than it is in the intellectual property area. At least we have the ICO, which is a regulator, unlike the IPO, which is not really a regulator with teeth. I am very interested in the fact that the ICO is conducting a consultation on generative AI and data protection, which it launched in January. Concurrently with this Bill, the ICO might perhaps come to some conclusions that we can use. That would of course include the whole area of biometrics, which, in the light of things such as deepfakes, is increasingly an issue of great concern. The watchword is “transparency”: we must impose a duty on generative AI developers to be transparent about the material they use to train their models and then use in operation. I fully support Amendments 103 and 104 in the name of the noble Baroness, even though, as she describes them, they are a small step.
My Lords, I, too, will be relatively brief. I thank the noble Baroness, Lady Kidron, for her amendments, to which I was very pleased to add my name. She raised an important point about the practice of web scrapers, who take data from a variety of sources to construct large language models without the knowledge or permission of web owners and data subjects. This is a huge issue that should have been a much more central focus of the Bill. Like the noble Baroness, I am sorry that the Government did not see fit to use the Bill to bring in some controls on this increasingly prevalent practice, because that would have been a more constructive use of our time than debating the many unnecessary changes that we have been debating so far.
As the noble Baroness said, large language models are built on capturing text, data and images from infinite sources without the permission of the original creator of the material. As she also said, it is making a mockery of our existing data rights. It raises issues around copyright and intellectual property, and around personal information that is provided for one purpose and commandeered by web scrapers for another. That process often happens in the shadows, whereby the owner of the information finds out only much later that their content has been repurposed.
What is worse is that the application of AI means that material provided in good faith can be distorted or corrupted by the bots scraping the internet. The current generation of LLMs is notorious for hallucinations, in which good-quality research or journalistic copy is misrepresented or misquoted in its new incarnation. There are also numerous examples of bias creeping into LLM output, which includes personal data. As the noble Baroness rightly said, the casual scraping of children’s images and data is undermining the very essence of our existing data protection legislation.
It is welcome that the Information Commissioner has intervened on this. He argued that LLMs should comply with the Data Protection Act and that their developers should evidence how they are meeting their legal obligations. This includes individuals being able to exercise their information rights. Currently, we are a long way from that being a reality in practice. This is about enforcement as much as about giving guidance.
I am pleased that the noble Baroness tabled these amendments. They raise important issues about individuals giving prior permission for their data to be used unless there is an easily accessible opt-out mechanism. I would like to know what the Minister thinks about all this. Does he think that the current legislation is sufficient to regulate the rise of LLMs? If it is not, what are the Government doing to address the increasingly widespread concerns about the legitimacy of web scraping? Have the Government considered using the Bill to introduce additional powers to protect against the misuse of personal and creative output?
In the meantime, does the Minister accept the amendments in the name of the noble Baroness, Lady Kidron? As we have said, they are only a small part of a much bigger problem, but they are a helpful initiative to build in some basic protections in the use of personal data. This is a real challenge to the Government to step up to the mark and be seen to address these important issues. I hope the Minister will say that he is happy to work with the noble Baroness and others to take these issues forward. We would be doing a good service to data citizens around the country if we did so.
I thank the noble Baroness, Lady Kidron, for tabling these amendments. I absolutely recognise their intent. I understand that they are motivated by a concern about invisible types of processing or repurposing of data when it may not be clear to people how their data is being used or how they can exercise their rights in respect of the data.
On the specific points raised by noble Lords about intellectual property rather than personal data, I note that, in their response to the AI White Paper consultation, the Government committed to provide a public update soon on their approach to AI and intellectual property, noting the importance of greater transparency in the use of copyrighted material to train models, as well as the labelling and attribution of outputs.
Amendment 103 would amend the risk-assessment provisions in Clause 20 so that any assessment of high-risk processing would always include an assessment of how the data controller would comply with the purpose limitation principle and how any new processing activity would be designed so that people could exercise their rights in respect of the data at the time it was collected and at any subsequent occasion.
I respectfully submit that this amendment is not necessary. The existing provisions in Clause 20, on risk assessments, already require controllers to assess the potential risks their processing activities pose to individuals and to describe how those risks would be mitigated. This would clearly include any risk that the proposed processing activities would not comply with the data protection principles—for example, because they lacked transparency—and would make it impossible for people to exercise their rights.
Similarly, any assessment of risk would need to take account of any risks related to difficulties in complying with the purpose limitation principle—for example, if the organisation had no way of limiting who the data would be shared with as a result of the proposed processing activity.
According to draft ICO guidance on generative AI, the legitimate interests lawful ground under Article 6(1)(f) of the UK GDPR can be a valid lawful ground for training generative AI models on web-scraped data, but only when the model’s developer can ensure that they pass the three-part test—that is, they identify a legitimate interest, demonstrate that the processing is necessary for that purpose and demonstrate that the individual’s interests do not override the interest being pursued by the controller.
Controllers must consider the balancing test particularly carefully when they do not or cannot exercise meaningful control over the use of the model. The draft guidance further notes that it would be very difficult for data controllers to carry out their processing activities in reliance on the legitimate interests lawful ground if those considerations were not taken into account.
My Lords, I thank the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Jones, for their support. I will read the Minister’s speech, because this is a somewhat technical matter. I am not entirely sure that I agree with what he said, but I am also not sure that I could disagree with it adequately in the moment.
I will make two general points, however. First, I hear the Minister loud and clear on the question of the Government’s announcement on AI and IP but, at the beginning of my speech, I referenced Napster and how we ended up with personal data. The big guys won the battle for copyright, so we will see the likes of the New York Times, EMI and so on winning this battle, but small creatives and individuals will not be protected. I hope that, when that announcement comes, it includes the personal data issue as well.
Secondly, I say to the Minister that, even if the system is working now in the way that he outlined from the ICO guidance, I do not think that anybody believes it is working very well. Either the ICO needs to do something, or we need to do something in this Bill. If not, we are letting all our data be taken for free to build the new world, with no permission.
I know that the noble Viscount is interested in this area. It is one in which we could be creative. I suggest that we try to solve the conundrum about whether the ICO is not doing its work or we are not doing ours. I beg leave to withdraw my amendment.
My Lords, UK law enforcement authorities processing personal data for law enforcement purposes currently use internationally based companies for data processing services, including cloud storage. The use of international processors is critical for modern organisations and law enforcement is no exception. The use of these international processors enhances law enforcement capabilities and underpins day-to-day functions.
Transfers from a UK law enforcement authority to an international processor are currently permissible under the Data Protection Act 2018. However, there is currently no bespoke mechanism for these transfers in Part 3, which has led to confusion and ambiguity as to how law enforcement authorities should approach the use of such processors. The aim of this amendment is to provide legal certainty to law enforcement authorities in the UK, as well as transparency to the public, so that they can use internationally based processors with confidence.
I have therefore tabled Amendments 110, 117 to 120, 122 to 129 and 131 to provide a clear, bespoke mechanism in Part 3 of the Data Protection Act 2018 for UK law enforcement authorities to use when transferring data to their contracted processors based outside the UK. This will bring Part 3 into line with the UK GDPR while clarifying the current law, and give UK law enforcement authorities greater confidence when making such transfers to their contracted processors for law enforcement purposes.
We have amended Section 73—the general principles for transfer—to include a specific reference to processors, ensuring that international processors can be a recipient of data transfers. In doing so, we have ensured that the safeguards within Chapter 5 that UK law enforcement authorities routinely apply to transfers of data to their international operational equivalents are equally applicable to transfers to processors. We are keeping open all the transfer mechanisms so that data can be transferred on the basis of an applicable adequacy regulation, the appropriate safeguards or potentially the special circumstances.
We have further amended Section 75—the appropriate safeguards provision—to include a power for the ICO to create, specifically for Part 3, an international data transfer agreement, or IDTA, to complement the IDTA which it has already produced to facilitate transfers using Article 46(2)(d) of the UK GDPR.
In respect of transfers to processors, we have disapplied the duty to inform the Information Commissioner about international transfers made subject to appropriate safeguards. Such a requirement would be out of line with the equivalent provisions in the UK GDPR. There is no strong rationale for retaining the provision, given that processors are limited in what they can do with data by the nature of their contracts and that compliance would be unlikely to contribute to the effective functioning of the ICO.
Likewise, we have also disapplied the duty to document such transfers and to provide the documentation to the commissioner on request. This is because extending these provisions would duplicate requirements that already exist elsewhere in legislation, including in Section 61, which has extensive recording requirements that enable full accountability to the ICO.
We have also disapplied the majority of Section 78. While it provides a useful function in the context of UK law enforcement authorities transferring to their international operational equivalents, it is not appropriate in the context of law enforcement transfers to international processors, because processors cannot decide to transfer data onwards of their own volition. They can do so only under instruction from the UK law enforcement authority controller.
Instead, we have retained the general prohibition on any further transfers to processors based in a separate third country, by requiring UK law enforcement authority controllers to make it a condition of a transfer to a processor that data is to be further transferred only in line with the terms of the contract with, or authorisation given by, the controller, and where the further transfer is permitted under Section 73. We have also taken the opportunity to tidy up Section 77, which governs transfers to non-relevant authorities, relevant international organisations or international processors.
In respect of Amendment 121, tabled by the noble Lord, Lord Clement-Jones, on consultation with the Information Commissioner, I reassure the noble Lord that there is a memorandum of understanding between the Home Office and the Information Commissioner regarding international transfers approved by regulations, which sets out the role and responsibilities of the ICO. As part of this, the Home Office consults the Information Commissioner at various stages in the process. The commissioner, in turn, provides independent assurance and advice on the process followed and on the factors taken into consideration.
I understand that this amendment also relates to representations made by the National AIDS Trust. Perhaps the simplest thing is merely to reference my earlier remarks and my ongoing commitment to engage with the National AIDS Trust. I beg to move that the government amendments which lead this group stand part of the Bill.
My Lords, very briefly, I thank the Minister for unpacking his amendments with some care, and for giving me the answer to my amendment before I spoke to it—that saves time.
Obviously, we all understand the importance of transfers of personal data between law enforcement authorities, but the crux of this, and the one question in our minds, is: what is the process for making sure that the country we are sending data to is data adequate? Perhaps the Minister could remind us. Amendment 121 was tabled as a way of probing that, and it would be extremely useful if the Minister could answer it. This should apply to transfers between law enforcement authorities just as much as it does to other, more general transfers under Schedule 5. If the Minister can give me the answer, that would be useful, but if he does not have it to hand, I am very happy to suspend my curiosity until after Easter.
My Lords, I too can be brief, having heard the Minister’s response. I thought he half-shot the Clement-Jones fox, with very good aim on the Minister’s part.
I was simply going to say that this is one in a sea of amendments from the Government, but the noble Lord, Lord Clement-Jones, made an important point about making sure that the countries and organisations that the commissioner looks at meet the test of data adequacy—I also had that in my speaking note. It is a good point in terms of ensuring that appropriate data protections are in place internationally for us to be able to work with.
The Minister explained the government amendments with some care, but I wonder if he could explain how data transfers are made to an overseas processor using the powers relied on by reference to new Section 73(4)(aa) of the 2018 Act. The power is used as a condition and justification for several of the noble Lord’s amendments, and I wonder whether he has had to table these amendments because of the original drafting. That would seem to me to be the most likely reason.
I thank the noble Lord, Lord Clement-Jones, for his amendment and his response, and I thank the noble Lord, Lord Bassam. The mechanism for monitoring international transfers was intended to be the subject for the next group in any case, and I would have hoped to give a full answer. I know we are all deeply disappointed that it looks as if we may not get to that group but, if the noble Lord is not willing to wait until we have that debate, I am very happy to write.
My Lords, this may be a convenient moment for the Committee to adjourn. Happy Easter, everyone.
(7 months, 3 weeks ago)
Grand Committee
Once more unto the breach, my Lords—as opposed to “my friends”.
I will also speak to Amendments 112 to 114, 116 and 130. New Article 45B(2) lists conditions that the Secretary of State must consider when deciding whether a third country provides an adequate level of protection for data subjects. It replaces the existing conditions in Article 45(2)(a) to (c) of the UK GDPR, removing important considerations such as the impact of a third country’s laws and practices in relation to national security, defence, public security, criminal law and public authority access to personal data on the level of protection provided to UK data subjects.
Despite this shorter list of conditions to consider, the Secretary of State is none the less required to be satisfied that a third country provides a level of protection that is not materially lower than the UK’s. It is plain that such an assessment cannot be made without considering the impact of these factors on the level of protection for UK data in a third country. It is therefore unclear why the amendment that the Government have made to Article 45 is necessary, beyond a desire for the Government to draw attention away from such contentious and complicated issues.
It may be that, in rewriting Article 45 of the UK GDPR, the Government intend that assimilated case law on international data transfers should no longer be relevant. If that is the case, it would pose a substantial risk to UK data adequacy. Importantly, new Article 45B(2) removes the reference to the need for an independent data protection regulator in the relevant jurisdiction. This, sadly, is consistent with the theme of diminishing the independence of the ICO, which is one of the major concerns in relation to the Bill, and it is also an area where the European Commission has expressed concern. The independence of the regulator is a key part of the EU data adequacy regime and is explicitly referenced in Article 8 of the Charter of Fundamental Rights, which guarantees the right to protection of personal data. Amendment 111 restores the original considerations that the Secretary of State must take into account.
Amendments 112 and 113 would remove the proposed powers in Schedules 5 and 6 for the Secretary of State to assess other countries’ suitability for international transfers of data, and would place these powers on the new information commission instead. In the specific context of HIV—the provenance of these amendments is in the National AIDS Trust’s suggestions—it is unlikely that the Secretary of State or their departmental officials will have the specialist knowledge to assess whether there is a risk of harm to an individual in transferring data related to their HIV status to a third country. Given that the activities of government departments are political by their nature, the Secretary of State’s decisions on the suitability of transfers to third countries may not be viewed as objective by individuals whose personal data is transferred. Many people living with HIV feel comfortable reporting breaches of data protection law in relation to their HIV status to the Information Commissioner’s Office because of its position as an independent regulator, so the National AIDS Trust and others recommend that the Bill place these regulatory powers on the new information commission created by the Bill instead, as this may inspire greater public confidence.
As regards Amendment 114, paragraph 5 of Schedule 5 should contain additional provisions to mandate annual review of the data protection test for each third country to which data is transferred internationally to ensure that the data protection regime in that third country is secure and that people’s personal data, such as their HIV status, will not be shared inappropriately. HIV is criminalised in many countries around the world, and the transfer to these countries of personal data such as an individual’s HIV status could put an individual living with HIV, their partner or their family members at real risk of harm. This is because HIV stigma is incredibly pronounced in many countries, which fosters a real risk of HIV-related violence. Amendment 114 would mandate this annual review.
As regards Amendment 116, new Article 47A(4) to (7) gives the Secretary of State a broad regulation-making power to designate new transfer mechanisms for personal data being sent to a third country in the absence of adequacy regulations. Controllers would be able to rely on these new mechanisms, alongside the existing mechanisms in Article 46 of the UK GDPR, to transfer data abroad. In order to designate new mechanisms, which could be based on mechanisms used in other jurisdictions, the Secretary of State must be satisfied that these are
“capable of securing that the data protection test set out in Article 46 is met”.
The Secretary of State must be satisfied that the transfer mechanism is capable of providing a level of protection for data subjects that is not materially lower than under the UK GDPR and the Data Protection Act. The Government have described this new regulation-making power as a way to future-proof the UK’s GDPR international transfers regime, but they have not been able to point to any transfer mechanisms in other countries that might be suitable to be recognised in UK law, and nor have they set out examples of how new transfer mechanisms might be created.
In addition to there being no clear rationale for taking the power, it is not clear how the Secretary of State could be satisfied that a new mechanism is capable of providing the appropriate level of protection for data subjects. This test is meant to be a lower standard than the test for controllers seeking to rely on a transfer mechanism to transfer data overseas, which requires them to consider that the mechanism provides the appropriate level of protection. It is not clear to us how the Secretary of State could be satisfied of a mechanism’s capability without having a clear sense of how it would be used by controllers in reality. That is the reason for Amendment 116.
As regards Amendment 130, Ministers have continued all the adequacy decisions that the EU had made in respect of third countries when the UK stopped being subject to EU treaties. The UK also conferred data adequacy on the EEA, but all this was done on a transitional basis. The Bill now seeks to continue those adequacy decisions, but no analysis appears to have been carried out as to whether these jurisdictions confer an adequate level of protection of personal data. This is not consistent with Section 17B(1) of the DPA 2018, which states that the Secretary of State must carry out a review of whether the relevant country that has been granted data adequacy continues to ensure an adequate level of protection, and that these reviews must be carried out at intervals of not more than four years.
In the EU, litigants have twice brought successful challenges against adequacy decisions. Those decisions were deemed unlawful and quashed by the European Court of Justice. It appears that this sort of challenge would not be possible in the UK, because the adequacy decisions are being continued by the Bill and therefore through primary legislation. Any challenge to these adequacy decisions could result only in a declaration of incompatibility under the Human Rights Act; they could not be quashed by the UK courts. This is another example of how leaving the EU has diminished the rights of UK citizens compared with their EU counterparts.
As well as tabling those amendments, I support and have signed Amendment 115 in the names of the noble Lords, Lord Bethell and Lord Kirkhope, and I look forward to hearing their arguments in relation to it. In the meantime, I beg to move.
My Lords, I rise with some temerity. This is my first visit to this Committee to speak. I have popped in before and have been following it very carefully. The work going on here is enormously important.
I am speaking to Amendment 115, thanks to the indulgence of my noble friend Lord Bethell, who is the lead name on that amendment but has kindly suggested that I start the discussions. I also thank the noble Lord, Lord Clement-Jones, for his support. Amendment 115 has one clear objective and that is to prevent transfer of UK user data to jurisdictions where data rights cannot be enforced and there is no credible right of redress. The word “credible” is important in this amendment.
I thank my noble friend the Minister for his letter of 11 April, which he sent to us to try to mop up a number of issues. In particular, in one paragraph he referred to the question of adequacy, which may also touch on what the noble Lord, Lord Clement-Jones, has just said. The Secretary of State’s powers are also referred to, but I must ask: how, in a fast-moving or unique situation, can all the factors referred to in this long and comprehensive paragraph be considered?
The mechanisms of government and government departments must be thorough and in place to discharge satisfactorily what are, I think, somewhat grand intentions. I say that from a personal point of view, because I was one of those who drafted the European GDPR—another reason I am interested in discussing these matters today—and I was responsible for the adequacy decisions with third countries. The word “adequacy” matters very much in this group, in the same way that we were unable to use “adequacy” when we dealt with the United States and had to look at “equivalence” instead. Adequacy can work only if one is working to similar parameters. If one is constitutionally working to different parameters, as is the case in the United States, then “equivalence” becomes the more relevant concept: although administration and regulation cannot be carried out in quite the same way, an equivalence arrangement can be acceptable and can deliver the level of protection that we look for from adequacy.
I have a marvellous note here, which I am sure noble Lords have already talked about. It says that every year we generate 181 zettabytes of personal data. I am sure noble Lords are all aware of zettabytes, but I will clarify. One zettabyte is 1,000 exabytes—which perhaps makes it simpler to understand—or, if you like, 1 billion trillion bytes. One’s mind just has to get around this, but this is data on our movements, finances, health and families, from our cameras, phones, doorbells and, I am afraid, even from our refrigerators—though Lady Kirkhope refuses point blank to have any kind of detector on her fridge door that will tell anybody anything about us or what we eat. Increasingly, it is also data from our cars. Our every moment is recorded—information relating to everything from shopping preferences to personal fitness, even to our anxieties as they are displayed or discussed. It is stored by companies that we entrust with that data, and we have a right to expect that such sensitive and private data will be protected. Indeed, one of the core principles of data protection, as we all know, is accountability.
Article 79 of the UK GDPR and Section 167 of our Data Protection Act 2018 provide that UK users must have the right to effective judicial remedy in the event of a data protection breach. Article 79 says that
“each data subject shall have the right to an effective judicial remedy where he or she considers that his or her rights under this Regulation have been infringed as a result of the processing of his or her personal data in non-compliance with this Regulation”.
My Lords, I will speak to Amendment 115 in my name. I start by saying a huge thanks to the noble Lord, Lord Clement-Jones, and my noble friend Lord Kirkhope, who have put everything so well and persuasively that I have almost nothing else to say in support. I am looking forward to the Minister throwing in the towel and accepting all the measures as suggested. Noble Lords have really landed it well.
I shall not go through the principle behind my amendment because, frankly, its benefit is so self-evident and clear that it does not need to be rehearsed in great detail. What I want to get across is the absolute and paramount urgency of the Government adopting this measure or a similar one. This is a terrific Bill; I thank the Minister for all the work that he and his team have done on it. I sat through Second Reading, although I did not speak on that day, when the Minister gave a persuasive account of the Bill; we are grateful for that.
However, this is a massive gap. It is a huge lacuna in the provisions of a Bill called a data protection Bill. It is a well-known gap in British legislation—and, by the way, in the legislation of lots of other countries. We could try to wait for an international settlement—some kind of Bretton Woods of data—where all the countries of the world put their heads together and try to hammer out an international agreement on data. That would be a wonderful thing but there is no prospect whatever of it in sight, so the time has come for countries to start looking at their own unilateral arrangements on the international transfer of data.
We have sought to duck this commitment by stringing together a Heath Robinson set of arrangements around transfer risk assessments and bilateral agreements with countries. This has worked to some extent—at least to the extent that there is a booming industry around data. We should not diminish that achievement, but there are massive gaps and huge liabilities in that arrangement, as my noble friend Lord Kirkhope rightly described, particularly now that we are living in a new, polarised world where countries of concern deliberately seek to harvest our data for their own security needs.
There are three reasons why this has become not just a chronic issue that could perhaps be kicked down the road a bit but an acute issue that should be dealt with immediately in the Bill’s provisions. The first, which my noble friend hinted at, is the massive flood of new data coming our way. I had the privilege of having a look at a BYD car. It was absolutely awesome and, by the way, phenomenally cheap; if the Chinese taxpayer is okay with subsidising our cars, I would highly recommend them to everyone here. One feature of the car is a camera on the dashboard that looks straight at the driver’s face, including their emotional state; for instance, if you look weary, it will prompt you to stop and have a coffee. That is a lovely feature, but it is also mapping your face for hours and hours every year and, potentially, conveying that information to the algorithmic artificial intelligence run by the CCP in China—something that causes me huge personal concern. Lady Kirkhope may be worried about her fridge, but I am very worried about my potential car. I embrace the huge global growth of data exchanges and technology’s benefits for citizens, taxpayers and voters, but this must be done in a well-curated field. The internet of things—a term coined, as many noble Lords will know, by Kevin Ashton—is another aspect of this.
Secondly, the kind of data being exchanged is becoming increasingly sensitive. I have mentioned the video in the BYD car; genomics data is another area of grave concern. I have an associate fellowship at King’s College London’s Department of War Studies, looking specifically at bioweapons and the transfer of genomic data. Some of this is on the horizon; it is not of immediate use from a strategic and national security point of view today but the idea that there could be, as in a James Bond film, some way of targeting individuals with poisons based on their genomic make-up is not beyond imagination.
The idea that you could create generalised bioweapons around genomics or seek to influence people based in part on insight derived from their genomic information is definitely on the horizon. We know that because China is doing some of this already; in the west of China, it is able to identify members of the Uighur tribes. In fact, China can say to someone, “We’re calling you up because we know that you’re the cousin of someone who is in prison today”, and this has happened. How does China know that? It has done it through the genomic tracking in its databases. China’s domestic use of data, through the social checking of genomic data and financial transactions, is a very clear precedent for the kinds of things that could be applied to the data that we are sharing with such countries.
Thirdly, there is the sensitivity of what uses the data is being put to. The geopolitics of the world are changing considerably. We now have what the Americans call countries of concern that are going out of their way to harvest and collect data on our populations. It is a stated element of their national mission to acquire data that could be used for national security purposes. These are today’s rivals but, potentially, tomorrow’s enemies.
For those three reasons, I very much urge the Minister to think about ways in which provisions on the international transfer of data could be added to the Bill. Other countries are certainly looking at the same; on 28 February this year, President Biden issued executive order 14117, which in many ways echoes the themes of our Amendment 115. It says clearly that there is an “unacceptable risk” to US national security from the large sharing of data across borders and asks the DoJ to publish a “countries of concern” list. That list has already been published and the countries on it are as the Committee would expect. It also seeks to define priority data. In other words, it is a proportionate, thoughtful and sensible set of measures to try to bring some kind of guard-rail to an industry where data transfer is clearly of grave concern to Americans. It looks particularly at genomic and financial transaction data but it has the capacity to be a little broader.
I urge the Minister to consider that this is now the time for unilateral action by the British Government. As my noble friend Lord Kirkhope said, if we do not do that, we may find ourselves being left behind by the EU, including the Irish, by the Americans and so on. There is an important spill-over effect from Britain acting sensibly that will do something to inspire and prod others into action. It is totally inappropriate to continue this pretence that British citizens are having their data suitably protected by the kind of commercial contracts that they are signing, which have no kind of redress or legal standing in the country of destination.
Lastly, the commercial point is very important. For those of us who seek to champion an open, global internet and a free flow of data while facilitating investment in that important trade, we must curate and care for it in a way that instils trust and responsibility, otherwise the whole thing will be blown up and people will start pulling wires out of the back of machines.
My Lords, I am very grateful to the noble Lords, Lord Clement-Jones, Lord Bethell and Lord Kirkhope, for tabling these amendments and for enabling us to have a good debate on the robustness of the proposed international data rules, which are set out in Schedules 5 and 7. Incidentally, I do not share the enthusiasm expressed by the noble Lord, Lord Bethell, for the rest of the Bill, but on this issue we are in agreement—and perhaps the other issues are for debate some other time.
I welcome the Committee back after what I hope was a good Easter break for everybody. I thank all those noble Lords who, as ever, have spoken so powerfully in this debate.
I turn to Amendments 111 to 116 and 130. I thank noble Lords for their proposed amendments relating both to Schedule 5, which reforms the UK’s general processing regime for transferring personal data internationally and consolidates the relevant provisions in Chapter 5 of the UK GDPR, and to Schedule 7, which introduces consequential and transitional provisions associated with the reforms.
Amendment 111 seeks to revert to the current list of factors under the UK GDPR that the Secretary of State must consider when making data bridges. With respect, this more detailed list is not necessary as the Secretary of State must be satisfied that the standard of protection in the other country, viewed as a whole, is not materially lower than the standard of protection in the UK. Our new list of key factors is non-exhaustive. The UK courts will continue to be entitled to have regard to CJEU judgments if they choose to do so; ultimately, it will be for them to decide how much regard to have to any CJEU judgment on a similar matter.
I completely understand the strength of noble Lords’ concerns about ensuring that our EU adequacy decisions are maintained. This is also a priority for the UK Government, as I and my fellow Ministers have repeatedly made clear in public and on the Floor of the House. The UK is firmly committed to maintaining high data protection standards, now and in future. Protecting the privacy of individuals will continue to be a national priority. We will continue to operate a high-quality regime that promotes growth and innovation and underpins the trustworthy use of data.
Our reforms are underpinned by this commitment. We believe they are compatible with maintaining our data adequacy decisions from the EU. We have maintained a positive, ongoing dialogue with the EU to make sure that our reforms are understood. We will continue to engage with the European Commission at official and ministerial levels with a view to ensuring that our respective arrangements for the free flow of personal data can remain in place, which is in the best interests of both the UK and the EU.
We understand that Amendments 112 to 114 relate to representations made by the National AIDS Trust concerning the level of protection for special category data such as health data. We agree that the protection of people’s HIV status is vital. It is right that this is subject to extra protection, as is the case for all health data and special category data. As I have said previously in this Committee, we have met the National AIDS Trust to discuss the best solutions to the problems it has raised. As such, I hope that the noble Lord, Lord Clement-Jones, will agree not to press these amendments.
Can the Minister just recap? He said that he met the trust then swiftly moved on without saying what solution he is proposing. Would he like to repeat that, or at least lift the veil slightly?
The point I was making was only that we have met with it and will continue to do so in order to identify the best possible way to keep that critical data safe.
The Minister is not suggesting a solution at the moment. Is it in the “too difficult” box?
I doubt that it will be too difficult, but identifying and implementing the correct solution is the goal that we are pursuing, alongside our colleagues at the National AIDS Trust.
I am sorry to keep interrogating the Minister, but that is quite an admission. The Minister says that there is a real problem, which is under discussion with the National AIDS Trust. At the moment the Government are proposing a significant amendment to both the GDPR and the DPA, and in this Committee they are not able to say that they have any kind of solution to the problem that has been identified. That is quite something.
I am not sure I accept that it is “quite something”, in the noble Lord’s words. As and when the appropriate solution emerges, we will bring it forward—no doubt between Committee and Report.
On Amendment 115, we share the noble Lords’ feelings on the importance of redress for data subjects. That is why the Secretary of State must already consider the arrangements for redress for data subjects when making a data bridge. There is already an obligation for the Secretary of State to consult the ICO on these regulations. Similarly, when considering whether the data protection test is met before making a transfer subject to appropriate safeguards using Article 46, the Government expect that data exporters will also give consideration to relevant enforceable data subject rights and effective legal remedies for data subjects.
Our rules mean that companies that transfer UK personal data must uphold the high data protection standards we expect in this country. Otherwise, they face action from the ICO, which has powers to conduct investigations, issue fines and compel companies to take corrective action if they fail to comply. We will continue to monitor and mitigate a wide range of data security risks, regardless of provenance. If there is evidence of threats to our data, we will not hesitate to take the necessary action to protect our national security.
My Lords, we heard from the two noble Lords some concrete examples of where those data breaches are already occurring, and it does not appear to me that appropriate action has been taken. There seems to be a mismatch between what the Minister is saying about the processes and the day-to-day reality of what is happening now. That is our concern, and it is not clear how the Government are going to address it.
My Lords, in a way the Minister is acknowledging that a watering down is taking place, yet the Government seem fairly relaxed about these issues. If something happens, the Government will do something or other, or the commissioner will. But the Government are proposing to water down Article 45, and that is the essence of what we are all talking about here. We are not satisfied with the current position, and watering down Article 45 will make it even worse; there will be more Yandexes.
The Minister mentioned prosecutions and legal redress in the UK from international data transfer breaches. Can he share some examples of that, maybe by letter? I am not aware of that being something with a long precedent.
A number of important points were raised there. Yes, of course I will share—
I am sorry to interrupt my noble friend, but the point I made—this now follows on from other remarks—was that these requirements have been in place for a long time, and we are seeing abuses. Therefore, I was hoping that my noble friend would be able to offer changes in the Bill that would put more emphasis on dealing with these breaches. Otherwise, as has been said, we look as though we are going backwards, not forwards.
As I said, a number of important points were raised there. First, I would not categorise the changes to Article 45 as watering down—they are intended to better focus the work of the ICO. Secondly, the important points raised with respect to Amendment 115 are points primarily relating to enforcement, and I will write to noble Lords setting out examples of where that enforcement has happened. I stress that the ICO is, as noble Lords have mentioned, an independent regulator that conducts the enforcement of this itself. What was described—I cannot judge for sure—certainly sounded like completely illegal infringements on the data privacy of those subjects. I am happy to look further into that and to write to noble Lords.
Amendment 116 seeks to remove a power allowing the Secretary of State to make regulations recognising additional transfer mechanisms. This power is necessary for the Government to react quickly to global trends and to ensure that UK businesses trading internationally are not held back. Furthermore, before using this power, the Secretary of State must be satisfied that the transfer mechanism is capable of meeting the new Article 46 data protection test. They are also required to consult the Information Commissioner and such other persons as they consider appropriate. The affirmative resolution procedure will also ensure appropriate parliamentary scrutiny.
I reiterate that the UK Government’s assessment of the reforms in the Bill is that they are compatible with maintaining adequacy. We have been proactively engaging with the European Commission since the start of the Bill’s consultation process to ensure that it understands our reforms and that we have a positive, constructive relationship. Noble Lords will appreciate that it is important that officials have the ability to conduct candid discussions during the policy-making process. However, I would like to reassure noble Lords once again that the UK Government take the matter of retaining our adequacy decisions very seriously.
Finally, Amendment 130 pertains to EU exit transitional provisions in Schedule 21 to the Data Protection Act 2018, which provide that certain countries are currently deemed as adequate. These countries include the EU and EEA member states and those countries that the EU had found adequate at the time of the UK’s exit from the EU. Such countries are, and will continue to be, subject to ongoing monitoring. As is the case now, if the Secretary of State becomes aware of developments such as changes to legislation or specific practices that negatively impact data protection standards, the UK Government will engage with the relevant authorities and, where necessary, amend or revoke data bridge arrangements.
For these reasons, I hope noble Lords will not press their amendments.
My Lords, I thank the Minister for his response, but I am still absolutely baffled as to why the Government are doing what they are doing on Article 45. The Minister has not given any particular rationale. He has given a bit of a rationale for resisting the amendments, many of which try to make sure that Article 45 is fully effective, that these international transfers are properly scrutinised and that we remain data adequate.
By the way, I thought the noble Lord, Lord Kirkhope, made a splendid entry into our debate, so I hope that he stays on for a number of further amendments—what a début.
The only point on which I disagreed with the noble Lord, Lord Bethell—as the noble Baroness, Lady Jones, said—was when he said that this is a terrific Bill. It is a terrifying Bill, not a terrific one, as we have debated. There are so many worrying aspects—for example, that there is no solution yet for sensitive special category data and the whole issue of these contractual clauses. The Government seem almost to be saying that it is up to the companies to assess all this and whether a country in which they are doing business is data adequate. That cannot be right. They seem to be abrogating their responsibility for no good reason. What is the motive? Is it because they are so enthusiastic about transfer of data to other countries for business purposes that they are ignoring the rights of data subjects?
The Minister resisted describing this as watering down, but why get rid of the list of considerations that the Secretary of State must take into account, so that they are left just in the mix as factors that may or may not be considered? In the existing article they are specified. It is quite a long list, and the Government have chopped it back. What is the motive for that? It looks as though data subjects’ rights are being curtailed. We were baffled by previous elements that the Government have introduced into the Bill, but this is probably the most baffling of all because of its real importance—its national security implications and the existing examples, such as Yandex, that we heard about from the noble Lord, Lord Kirkhope.
Of course we understand that there are nuances and that there is a difference between adequacy and equivalence. We have to be pragmatic sometimes, but the question of whether these countries having data transferred to them are adequate must be based on principle. This seems to me a prime candidate for Report. I am sure we will come back to it, but in the meantime I beg leave to withdraw.
My Lords, the issue of access to data for researchers is very familiar to all those involved in debates on the Online Safety Bill, now an Act. The issue is relatively simple and I am not going to spell it out in great detail. I will leave it to others to give more concrete examples.
The issue is that in the tech industry, there is a vast amount of data about the effect of social media and the impact on consumers of the technologies, algorithms and content that are in circulation. But there is a blackout when it comes to academics, epidemiologists, journalists or even parliamentarians who are trying to have a dig around to understand what is happening. What is happening on extremism or child safety? What is happening with fraud or to our national security? What is the impact on children of hours and hours spent on YouTube, Facebook, Snapchat and all the other technologies that are now consuming billions of hours of our time?
In other walks of life, such as the finance and retail sectors, there are open platforms where regulators, researchers and even the public can have a peek at what is going on inside. This is not commercial access; instead, it is trying to understand the impact on society and individuals of these very important and influential technologies. That kind of transparency absolutely underpins trust in these systems. The data is essential to policy-making and the surveillance is key to security.
What I want to convey is a sense that there is a very straightforward solution to this. There is a precedent, already being rolled out in the EU, that creates a good framework. Amendment 135 has been thoroughly discussed with the department in previous debates on the Online Safety Bill, and I thank the Minister and the Secretary of State for a number of meetings with parliamentarians and civil society groups to go through it. The idea of creating a data access pathway that has attached to it a clear validation system that secures the independence and privacy of researchers is relatively straightforward. Oversight by the ICO is something that we all agree gives it a sense of credibility and straightforwardness.
I want to try to convey to the Minister the importance of moving on this, because it has been discussed over several years. The regulator is certainly a supporter of the principle: Melanie Dawes, the CEO of Ofcom, gave testimony to the Joint Committee on the draft Online Safety Bill in which she said it was one of the things she felt was weak about that Bill and that she would like to have seen it strengthened. It was therefore disappointing that there was not a chance to do that then, but there is a chance to do it now.
During the passage of the Online Safety Act, the Minister also made commitments from the Dispatch Box about returning to this subject during the passage of this Bill, so it feels like a good moment to be discussing it. Some 40 impressive civil society groups have written in clear terms about the need for this, so there is a wide body of opinion in support. One reason why it is so urgent that we get this measure in the Bill—and do not kick the can down the road—is that it is currently getting harder and harder for researchers, academics and scientists to look into the impact of the actions of our technology companies.
Twitter/X has withdrawn almost all access to the kind of data that makes this research possible. Facebook has announced that it will stop supporting CrowdTangle, the very important facility it had created, which had become a very useful tool. The feedback on the Meta Content Library, its theoretical replacement, has not been very positive; it is a clunky and awkward tool to use. TikTok is a total black box, and we have no idea what is going on in there; and the action by Elon Musk against the Center for Countering Digital Hate, which he pursued in the courts over its analysis of data, gives a sense of the very aggressive tone that tech companies now take towards researchers who are trying to do what is widely considered to be very important work.
My Lords, I support Amendment 135 in the name of the noble Lord, Lord Bethell, to which I have added my name. He set out our struggle during the passage of the Online Safety Bill, when we made several attempts to get something along these lines into the Bill. It is worth actually quoting the Minister, Paul Scully, who said at the Dispatch Box in the other place:
“we have made a commitment to explore this … further and report back to the House in due course on whether further measures to support researcher access to data are required and, if so, whether they could also be implemented through other legislation such as the Data Protection and Digital Information Bill”.—[Official Report, Commons, 12/9/23; col. 806.]
When the Minister responds, perhaps he could update the House on that commitment and explain why the Government decided not to address it in the Bill. Although the Bill proposes a lessening of the protections on the use of personal data for research done by commercial companies, including the development of products and marketing, it does nothing to enable public interest research.
I would like to add to the list that the noble Lord, Lord Bethell, started because, as well as Melanie Dawes, the CEO of Ofcom, the United States National Academy of Sciences, the Lancet commission, the UN advisory body on AI, the US Surgeon General, the Broadband Commission and the Australian eSafety Commissioner have all in the last few months called for greater access to data for independent research.
I ask the noble Viscount to explain the Government’s thinking in detail, and I really do hope that we do not get more “wait and see”, because it does not meet the need. We have already passed online safety legislation that requires evidence, and by denying access to independent researchers, we have a perverse situation in which the regulator has to turn to the companies it is regulating for the evidence to create their codes, which, as the noble Viscount will appreciate, is a formula for the tech companies to control the flow of evidence and unduly temper the intent of the legislation. I wish to make most of my remarks on that subject.
In Ofcom’s consultation on its illegal harms code, the disparity between the harms identified and Ofcom’s proposed code caused deep concern. Volume 4 states the following at paragraph 14.12 in relation to content moderation:
“We are not proposing to recommend some measures which may be effective in reducing risks of harm. This is principally due to currently limited evidence”.
Further reading of volume 4 confirms that the lack of evidence is the given reason for failing to recommend measures across a number of harms. Ofcom has identified harms for which it does not require mitigation. This is not what Parliament intended and spectacularly fails to deliver on the promises made by Ministers. Ofcom can use its information-gathering powers to build evidence on the efficacy required to take a bolder approach to measures but, although that is welcome, it is unsatisfactory for many reasons.
First, given the interconnectedness between privacy, safety, security and competition, regulatory standards cannot be developed in silos. We have a thriving academic community that can work across different risks and identify solutions across different parts of the tech ecosystem.
Secondly, a regulatory framework in which standards are determined exclusively through private dialogue between the regulator and the regulated does not have the necessary transparency and accountability to win public trust.
Thirdly, regulators are overstretched and under-resourced. Our academics stand ready and willing to work in the public interest and in accordance with the highest ethical standards in order to scrutinise and understand the data held so very closely by tech companies, but they need a legal basis to demand access.
Fourthly, if we are to maintain our academic institutions in a post-Brexit world, we need to offer UK academics the same support as those in Europe. Article 40(4) of the European Union’s Digital Services Act requires platforms to
“provide access to data to vetted researchers”
seeking to carry out
“research that contributes to the detection, identification and understanding of systemic risks in the Union, as set out pursuant to Article 34(1), and to the assessment of the adequacy, efficiency and impacts of the risk mitigation measures pursuant to Article 35”.
It will be a considerable loss to the UK academic sector if its European colleagues have access to data that it does not.
Fifthly, by insisting on evidence but not creating a critical pathway to secure it, the Government have created a situation in which the lack of evidence could mean that Ofcom’s codes are fixed at what the tech companies tell it is possible in spring 2024, and will always be backward-looking. There is considerable whistleblower evidence revealing measures that the companies could have taken but chose not to.
I have considerable personal experience of this. For example, it was nearly a decade ago that I told Facebook that direct messaging on children’s accounts was dangerous, yet only now are we beginning to see regulation reflecting that blindingly obvious fact. That is nearly a decade in which something could have been done by the company but was not, and of which the regulator will have no evidence.
Finally, as we discussed on day one in Committee, the Government have made it easier for commercial companies to use personal data for research by lowering the bar for the collection of data and expanding the concept of research, further building the asymmetry that has been mentioned in every group of amendments we have debated thus far. It may not be very parliamentary language, but it is crazy to pass legislation and then obstruct its implementation by insisting on evidence that you have made it impossible to gather.
I would be grateful if the Minister could answer the following questions when he responds. Is it the Government’s intention that Ofcom codes be based entirely on the current practice of tech companies and that the regulator can demand only mitigations that exist currently, as evidenced by those companies? Do the Government agree that whistleblowers, NGO experts and evidence from user experience can be taken by regulators as evidence of what could or should be done? What route do the Government advise Ofcom to take to mitigate identified risks for which there are no current measures in place? For example, should Ofcom describe the required outcome and leave it to the companies to determine how they mitigate the risk, should it suggest mitigations that have been developed but not tried—or is the real outcome of the OSA to identify risk and leave that risk in place?
Do the Government accept that EU research done under the auspices of the DSA should be automatically considered as an adequate basis for UK regulators where the concerns overlap with UK law? Will the new measures announced for testing and sandboxing of AI models allow for independent research, in which academics, independent of government or tech, will have access to data? Finally, what measures will the Government take to mitigate the impact on universities of a brain drain of academics to Europe, if we do not provide equivalent legislative support to enable them to access the data required to study online safety and privacy? If the Minister is unable to answer me from the Dispatch Box, perhaps he will agree to write to me and place his letter in the Library for other noble Lords to read.
My Lords, there is little for me to say. The noble Lord, Lord Bethell, and the noble Baroness, Lady Kidron, have left no stone unturned in this debate. They introduced this amendment superbly, and I pay tribute to them and to Reset, which was with us all the way through the discussions on online harms at the Joint Committee on the draft Online Safety Bill, advocating for these important provisions.
As the noble Lord, Lord Bethell, said, there is a strong body of opinion out there. Insight from what might be called approved independent researchers would enable policy-making and regulatory innovation to keep pace with emerging trends and threats, which can span individual harms, matters of public safety and even national security. We have seen the kinds of harms taking place in social media, and it is absolutely vital that we understand what is happening under the bonnet of social media. It is crucial in detecting, identifying and understanding the systemic risks of online harms and non-compliance with law.
When we discussed the Online Safety Bill, it was a question of not just content but functionality. That was one of the key things. An awful lot of this research relates to that: how algorithms operate in amplifying content and some of the harms taking place on social media. The noble Lord, Lord Bethell, referred to X closing its API for researchers and Meta’s move to shut CrowdTangle. We are going into reverse, whereas we should be moving forward in a much more positive way. When the Online Safety Bill was discussed, we got the review from Ofcom, but we did not get the backup—the legislative power for Ofcom or the ICO to be able to authorise and accredit researchers to carry out the necessary research.
The Government’s response to date has been extremely disappointing, given the history behind this and the pressure and importance of this issue. This dates from discussions some way back, even before the Joint Committee met and heard the case for this kind of researcher access. This Bill is now the best vehicle by which to introduce a proper regime on access for researchers. As the noble Baroness, Lady Kidron, asked, why, having had ministerial assurances, are we not seeing further progress? Are we just going to wait until Ofcom produces its review, which will be at the tail end of a huge programme of work which it has to carry out in order to implement the Online Safety Act?
My Lords, I am grateful to the noble Lord, Lord Bethell, and his cosignatories for bringing this comprehensive amendment before us this afternoon. As we have heard, this is an issue that was debated at length in the Online Safety Act. It is, in effect, unfinished business. I pay tribute to the noble Lords who shepherded that Bill through the House so effectively. It is important that we tie up the ends of all the issues. The noble Lord made significant progress, but those issues that remain unresolved come, quite rightly, before us now, and this Bill is an appropriate vehicle for resolving those outstanding issues.
As has been said, the heart of the problem is that tech companies are hugely protective of the data they hold. They are reluctant to share it or to give any insight into how their data is harvested and stored. They get to decide what access is given, even when there are potentially illegal consequences, and they get to judge the risk levels of their actions without any independent oversight.
During the course of the Online Safety Bill, the issue was raised not only by noble Lords but by a range of respected academics and organisations representing civil society. They supported the cross-party initiative from Peers calling for more independent research, democratic oversight and accountability into online safety issues. In particular, as we have heard, colleagues identified a real need for approved researchers to check the risks of non-compliance in the regulated sectors of UK law by large tech companies—particularly those with large numbers of children accessing the services. This arose because of the increasing anecdotal evidence that children’s rights were being ignored or exploited. The noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, have given an excellent exposition of the potential and real harms that continue to be identified by the lack of regulatory action on these issues.
Like other noble Lords, I welcome this amendment. It is well-crafted, takes a holistic approach to the problem, makes the responsibilities of the large tech companies clear and establishes a systematic research base of vetted researchers to check compliance. It also creates important criteria for the authorisation of those vetted researchers: the research must be in the public interest, must be transparent, must be carried out by respected researchers, and must be free from commercial interests so that companies cannot mark their own homework. As has been said, it mirrors the provisions in the EU Digital Services Act and ensures comparable research opportunities. That is an opportunity for the UK to maintain its status as one of the top places in the world for expertise on the impact of online harms.
Since the Online Safety Act was passed, the Information Commissioner has been carrying out further work on the children’s code of practice. The latest update report says:
“There has been significant progress and many organisations have started to assess and mitigate the potential privacy risks to children on their platforms”.
That is all well and good but the ICO and other regulators are still reliant on the information provided by the tech companies on how their data is used and stored and how they mitigate risk. Their responsibilities would be made much easier if they had access to properly approved and vetted independent research information that could inform their decisions.
I am grateful to noble Lords for tabling this amendment. I hope that the Minister hears its urgency and necessity and that he can assure us that the Government intend to table a similar amendment on Report—as the noble Baroness, Lady Kidron, said, no more “wait and see”. The time has come to stop talking about this issue and take action. Like the noble Lord, Lord Clement-Jones, I was in awe of the questions that the noble Baroness came up with and do not envy the Minister in trying to answer them all. She asked whether, if necessary, it could be done via a letter but I think that the time has come on this and some other issues to roll up our sleeves, get round the table and thrash it out. We have waited too long for a solution and I am not sure that exchanges of letters will progress this in the way we would hope. I hope that the Minister will agree to convene some meetings of interested parties—maybe then we will make some real progress.
My Lords, as ever, many thanks to all noble Lords who spoke in the debate.
Amendment 135, tabled by my noble friend Lord Bethell, would enable researchers to access data from data controllers and processors in relation to systemic risks to the UK and non-compliance with regulatory law. The regime would be overseen by the ICO. Let me take this opportunity to thank both my noble friend for the ongoing discussions we have had and the honourable Members in the other place who are also interested in this measure.
Following debates during the passage of the Online Safety Act, the Government have been undertaking further work in relation to access to data for online safety researchers. This work is ongoing and, as my noble friend Lord Bethell will be aware, the Government are having ongoing conversations on this issue. As he knows, the online safety regime is very broad and covers issues that have an impact on national security and fraud. I intend to write to the Committee with an update on this matter, setting out our progress ahead of Report, which should move us forward.
While we recognise the benefits of improving researchers’ access to data—for example, using data to better understand the impact of social media on users—this is a highly complex issue with several risks that are not currently well understood. Further analysis has reiterated the complexities of the issue. My noble friend will agree that it is vital that we get this right and that any policy interventions are grounded in the evidence base. For example, there are risks in relation to personal data protection, user consent and the disclosure of commercially sensitive information. Introducing a framework to give researchers access to data without better understanding these risks could have significant consequences for data security and commercially sensitive information, and could potentially destabilise any data access regime as it is implemented.
In the meantime, the Online Safety Act will improve the information available to researchers by empowering Ofcom to require major providers to publish a broad range of online safety information through annual transparency reports. Ofcom will also be able to appoint a skilled person to undertake a report to assess compliance or to develop its understanding of the risk of non-compliance and how to mitigate it. This may include the appointment of independent researchers as skilled persons. Further, Ofcom is required to conduct research into online harms and has the power to require companies to provide information to support this research activity.
Moving on to the amendment specifically, it is significantly broader than online safety and the EU’s parallel Digital Services Act regime. Any data controllers and processors would be in scope if they have more than 1 million UK users or customers, if there is a large concentration of child users or if the service is high-risk. This would include not just social media platforms but any organisation, including those in financial services, broadcasting and telecoms as well as any other large businesses. Although we are carefully considering international approaches to this issue, it is worth noting that much of the detail about how the data access provisions in the Digital Services Act will work in practice is yet to be determined. Any policy interventions in this space should be predicated on a robust evidence base, which we are in the process of developing.
The amendment would also enable researchers to access data to research systemic risks to compliance with any UK regulatory law that is upheld by the ICO, Ofcom, the Competition and Markets Authority, and the Financial Conduct Authority. The benefits and risks of such a broad regime are not understood and are likely to vary across sectors. It is also likely to be inappropriate for the ICO to be the sole regulator tasked with vetting researchers across the remits of the other regulators. The ICO may not have the necessary expertise to make this determination about areas of law that it does not regulate.
Ofcom already has the power to gather information that it requires for the purpose of exercising its online safety functions. This power applies to companies in scope of the duties and, where necessary, to other organisations or persons who may have relevant information. Ofcom can also issue information request notices to overseas companies as well as to UK-based companies. The amendment is also not clear about the different types of information that a researcher may want to access. It refers to data controllers and processors—concepts that relate to the processing of personal data under data protection law—yet researchers may also be interested in other kinds of data, such as information about a service’s systems and processes.
Although the Government continue to consider this issue—I look forward to setting out our progress between now and Report—for the reasons I have set out, I am not able to accept this amendment. I will certainly write to the Committee on this matter and to the noble Baroness, Lady Kidron, with a more detailed response to her questions—there were more than four of them, I think—in particular those about Ofcom.
Perhaps I could encourage the Minister to say at least whether he is concerned that a lack of evidence might be impacting on the codes and powers that we have given to Ofcom in order to create the regime. I share his slight regret that Ofcom does not have this provision that is in front of us. It may be that more than one regulator needs access to research data but it is the independents that we are talking about. We are not talking about Ofcom doing things and the ICO doing things. We are talking about independent researchers doing things so that the evidence exists. I would like to hear just a little concern that the regime is suffering from a lack of evidence.
I am thinking very carefully about how best to answer. Yes, I do share that concern. I will set this out in more detail when I write to the noble Baroness and will place that letter in the House of Lords Library. In the meantime, I hope that my noble friend will withdraw his amendment.
I am enormously grateful to the Minister for his response. However, it falls short of my hopes. Obviously, I have not seen the letter that he is going to send us, but I hope that the department will have taken on board the commitments made by previous Ministers during discussions on the Online Safety Bill and the very clear evidence that the situation is getting worse, not better.
Any hope that the tech companies would somehow have heard the debate in the House of Lords and that it would have occurred to them that they needed to step up to their responsibilities has, I am afraid, been dashed by their behaviours in the last 18 months. We have seen a serious withdrawal of existing data-sharing provisions. As we approach even more use of AI, the excitement of the metaverse, a massive escalation in the amount of data and the impact of their technologies on society, it is extremely sobering to think that there is almost no access to the black box of their data.
That was a very good conclusion to the response from the noble Lord, Lord Bethell—urging a Minister to lean in. I have not heard that expression used in the House before, but it is excellent because, faced with a Home Office Minister, I am sure that is the kind of behaviour that we can expect imminently.
Last time we debated issues relating to national security and data protection, the noble Lord, Lord Ashton, was the responsible Minister and I had the support of the noble Lord, Lord Paddick. Now I have the Minister all to myself on Amendments 135A to 135E and the stand part notices on Clauses 28 to 30. These Benches believe that, as drafted, these clauses fall foul of the UK’s obligations under the ECHR, because they give the Home Secretary too broad a discretion and do not create sufficient safeguards to prevent their misuse.
Under the case law of the European Court of Human Rights, laws that give unfettered or overly broad discretion to the Government to interfere with privacy will violate the convention, because laws must be sufficiently specific to prevent abuses of power. This means that the Government must make sure that, any time they interfere with the privacy of people in the UK, they obey the law, pursue a goal that is legitimate in a democratic society and do only what is truly necessary to achieve that goal. The court has repeatedly stressed that this is what the rule of law means; it is an essential principle of democracy.
Despite multiple requests from MPs, and from Rights and Security International in particular, the Government have also failed to explain why they believe that these clauses are necessary to safeguard national security. So far, they have explained only why these new powers would be “helpful” or would ensure “greater efficiency”. Those justifications do not meet the standard that the ECHR requires when the Government want to interfere with our privacy. They are not entitled to do just anything that they find helpful.
Under Clause 28(7), the Home Secretary would be able to issue a national security certificate to tell the police that they do not need to comply with many important data protection laws and rules that they would otherwise have to obey. For instance, a national security certificate would give the police immunity when they commit crimes by using personal data illegally. It would also exempt them from certain provisions of the Freedom of Information Act 2000. The Bill would expand what counts as an intelligence service for the purposes of data protection law—again, at the Home Secretary’s wish. Clause 29 would allow the Home Secretary to issue a designation notice allowing law enforcement bodies, whenever they collaborate with the security services, to take advantage of the more relaxed rules in the Data Protection Act 2018 that were designed for the intelligence agencies.
Both the amended approach to national security certificates and the new designation notice regime would be unaccountable. The courts would not be able to review what the Government are doing and Parliament might therefore never find out. National security certificates are unchallengeable before the courts, meaning that the police and the Home Secretary would be unaccountable if they abused those powers. If the Home Secretary says that the police need to use these increased—and, in our view, unnecessary—powers in relation to national security, his word will be final. This includes the power to commit crimes.
As regards designation notices, the Home Secretary is responsible for approving and reviewing their use. Only a person who is directly affected by a designation notice will be able to challenge it, yet the Home Secretary would have the power to keep the notice secret, in which case how could anybody know that the police had been snooping on their lives under this law?
Clauses 28 to 30 could, in our view, further violate the UK’s obligations under the Human Rights Act 1998 and the European Convention on Human Rights because they remove the courts’ role in reviewing how the Government use their surveillance power. The European Court of Human Rights has ruled in the past that large aspects of the law previously governing the UK’s surveillance powers were unlawful because they gave the Government too much discretion and lacked important safeguards to prevent misuse. Clauses 28 to 30 could be challenged on similar grounds, and the court has shown that it is willing to rule on these issues. These weaknesses in the law could also harm important relationships that the UK has with the EU as regards data adequacy, a subject that we will no doubt discuss in further depth later this week.
The Government argue that the clauses create a simplified legal framework that would improve the efficiency of police operations when working with the intelligence services. This is far from meeting the necessity standard under the ECHR.
The Government have frequently used the Fishmongers’ Hall and Manchester Arena attacks to support the idea that Clauses 28 to 30 are desirable. However, a difference in data protection regimes was not the issue in either case; instead, the problem centred around failures in offender management, along with a lack of communication between the intelligence services and local police. The Government have not explained how Clauses 28 to 30 would have prevented either incident or why they think these clauses are necessary to prevent whatever forms of violence the Government regard as most likely to occur in the future. The Government have had sufficient opportunity to date to explain the rationale for these clauses, yet they have so far failed to do so. For these reasons, we are of the view that Clauses 28 to 30 should not stand part of the Bill.
However, it is also worth putting down amendments to try to tease out additional aspects of these clauses, so Amendments 135A and 135D would put proportionality back in. It is not clear why the word “proportionality” has been taken out of the existing legislation. Similarly, Amendment 135B attempts to put back in the principles that should underpin decisions. Those are the most troubling changes, since they seem to allow for departure from basic data protection principles. These were the principles that the Government, during the passage of the Data Protection Act 2018, assured Parliament would always be secure. The noble Lord, Lord Ashton of Hyde, said:
“People will always have the right to ensure that the data held about them is fair and accurate, and consistent with the data protection principles”.—[Official Report, 10/10/17; col. 126.]
Thirdly, now seems a good time to introduce oversight by a judicial commissioner for Clause 28 certificates. During the passage of the Data Protection Act through Parliament, there was much debate over the Part 2 national security exemption for general processing in Section 26 and the national security certificates in Section 27. We expressed concern then but, sadly, the judicial commissioner role was not included. This is a timely moment to suggest that again.
Finally, on increasing the oversight of the Information Commissioner under Amendment 135E, I hope that this will be an opportunity for the Minister, despite the fact that I would prefer to see Clauses 28 to 30 not form part of the Bill, to explain in greater detail why they are constructed in the way they are and why the Home Office believes that it needs to amend the legislation in the way it proposes. I beg to move.
My Lords, I come to this topic rather late and without the star quality in this area that has today been attributed to the noble Lord, Lord Kirkhope. I acknowledge both the work of Justice in helping me to understand what Clause 28 does and the work of the noble Lord, Lord Clement-Jones, in formulating the probing amendments in this group. I echo his questions on Clause 28. I will focus on a few specific matters.
First, what is the difference between the existing formulation for restricting data protection rights “when necessary and proportionate” to protect national security and the new formulation,
“when required to safeguard national security”?
What is the purpose of that change? Does “required” mean the same as “necessary” or something different? Do the restrictions not need to be proportionate any more? If so, why? Could we have a practical example of what the change is likely to mean in practice?
Secondly, why is it necessary to expand the number of rights and obligations from which competent law enforcement authorities can be exempted for reasons of national security? I can understand why it may for national security reasons be necessary to restrict a person’s right to be informed, right of access to data or right to be notified of a data breach, as under the existing law, but Clause 28 would allow the disapplication of some very basic principles of data protection law—including, as I understand it, the right to have your data processed only for a specified, explicit and legitimate purpose, as well as the right not to have decisions about you made solely by automated methods.
Thirdly, as the noble Lord, Lord Clement-Jones, asked, why is it necessary to remove the powers of the Information Commissioner to investigate, to enter and inspect, and, where necessary, to issue notices? I appreciate that certificates will remain appealable to the Upper Tribunal by the person directly affected, applying judicial review principles, but that is surely not a substitute for review by the skilled and experienced ICO. Apart from anything else, the subject is unlikely even to know that they have been affected by the provisions, given that a certificate would exempt law enforcement from having to provide information to them. That is precisely why the oversight of a commissioner in the national security area is so important.
As for Clauses 29 and 30, I am as keen as anybody to improve the capabilities for the joint processing of data by the police and intelligence agencies. That was a major theme of the learning points from the London and Manchester attacks of 2017, which I helped to formulate in that year and on which I reported publicly in 2019. A joint processing regime certainly sounds like a good idea in principle but I would be grateful if the Minister could confirm which law enforcement competent authorities will be subject to this new regime. Are they limited to Counter Terrorism Policing and the National Crime Agency?
My Lords, we have heard some fine words from the noble Lord, Lord Clement-Jones, in putting the case for his Amendments 135A, 135B, 135C and 135D, which are grouped with the clause stand part debates. As he explained, they seek to test and probe why the Government have sought to extend the ability of the security and intelligence services to disapply basic data protection principles.
The new Government-drafted clause essentially, as well as disapplying current provisions, disapplies the rights of data subjects and the obligations placed on competent authorities and processors. The Explanatory Notes say that this is to create a regime that
“ensures that there is consistency in approach”.
Clause 29 is designed to facilitate joint processing by the various agencies with a common regime. Like the noble Lord, Lord Anderson, I well understand why they might want to do that. The noble Lord, Lord Clement-Jones, has done the Committee a service in tabling these amendments because, as he said, during the passage of the 2018 Act assurances were given that law enforcement would always abide by basic data protection principles. On the face of it, that assurance no longer applies. Is this because it is inconvenient for the security and intelligence services? What are the Government seeking to do here?
Can the Minister explain from the Government’s perspective what has changed since 2018 that has led Ministers to conclude that those critical principles should be compromised? The amendments also seek to assert the importance of proportionality considerations when deciding whether national security exemptions apply. This principle is again raised in relation to the issuing of a national security certificate.
With Amendment 135E, the noble Lord, Lord Clement-Jones, effectively poses the question of where the balance of oversight should rest. Should it be with the Secretary of State or the commissioner? All that new Clause 29 does is oblige the Secretary of State to consult the commissioner with the expectation that the commissioner then makes public a record of designation notices. However, it strips out quite a lot of the commissioner’s current roles and responsibilities. We should surely have something more convincing than that to guarantee transparency in the process. We on these Benches will take some convincing that the Government have got the right balance in regard to the interests of national security and the security services. Why, for instance, is Parliament being sidelined in the exercise of the Secretary of State’s powers? Did Ministers give any consideration to reporting duties and obligations so far as Parliament is concerned? If not, why not?
Labour does not want to see national security compromised in any way, nor do we want to undermine the essential and vital work that our intelligence services have to perform to protect us all. However, we must also ensure that we build confidence in our security and intelligence services by making them properly accountable, as the noble Lord, Lord Clement-Jones, argued, and that the checks and balances are sufficient and the right ones.
The noble Lord, Lord Anderson, got it right in questioning the change of language, and I want to better understand from the Minister what that really means. But why extend the range of exemptions? We could do with some specific reasons as to why that is being changed. Why has the Information Commissioner’s role been so fundamentally changed with regard to these clauses and the exemptions?
We will, as always, listen carefully to the Minister’s reply before we give further thought to this framework on Report, but we are very unhappy with the changes that are taking away some of the fundamental protections that were in place before, and we will need quite a lot of convincing on these government changes.
My Lords, I thank the noble Lord, Lord Clement-Jones, for his amendments and thank the other noble Lords who spoke in this short debate. These amendments seek to remove Clauses 28, 29 and 30 in their entirety, or, as an alternative, to make amendments to Clauses 28 and 29. I will first speak to Clause 28, and if I fail to answer any questions I will of course guarantee to write.
Clause 28 replaces the current provision under the law enforcement regime for the protection of national security data with a revised version that mirrors the existing exemptions available to organisations operating under the UK GDPR and intelligence services regimes. It is also similar to what was available to law enforcement agencies under the 1998 Data Protection Act. It is essential that law enforcement agencies can properly protect data where required for national security reasons, and they should certainly be able to apply the same protections that are available to other organisations.
The noble Lord, Lord Clement-Jones, asked whether the exemption breaches a person’s Article 8 rights. The national security exemption will permit law enforcement agencies to disapply certain parts of the law enforcement data protection regime, such as the data protection principles or the rights of the data subject. It is not a blanket exemption: it may be applied only where required for the purposes of safeguarding national security—for instance, in order to prevent the tipping-off of a terror suspect—and only on a case-by-case basis. We do not, therefore, believe that the exemption breaches the right to privacy.
In terms of the Government taking away the right to lodge a complaint with the commissioner, that is not the case—the Government are not removing that right. Those rights are being consolidated under Clause 44 of this DPDI Bill. We are omitting Article 77 as Clause 44 will introduce provisions that allow a data subject to lodge a complaint with a controller.
In terms of how the data subject will know how to complain to the Information Commissioner, all organisations, including law enforcement agencies, are required to provide certain information to individuals who suspect that their personal information is being processed unlawfully, including their right to make a complaint to the Information Commissioner and, where applicable, the contact details of the organisation’s data protection officer or, in line with other amendments under the Bill, the organisation’s senior responsible individual.
Amendments 135A and 135D seek to introduce a proportionality test in relation to the application of the national security exemption and the issuing of a ministerial certificate for law enforcement agencies operating under Part 3 of the Data Protection Act. The approach we propose is consistent with the similar exemptions for the UK GDPR and intelligence services, which all require a controller to evaluate on a case-by-case basis whether an exemption from a provision is required for the purpose of safeguarding national security.
Amendment 135B would remove the ability for law enforcement agencies to apply the national security exemption to the data protection principles, whereas the approach we propose is consistent with the other data protection regimes and will provide for exemption from the data protection principles in Chapter 2—where required and on a case-by-case basis—but not from the requirement for processing to be lawful or the safeguards which apply to sensitive data.
The ability to disapply certain principles laid out in Chapter 2 is crucial for the efficacy of the national security exemption. This is evident in the UK GDPR and Part 4 exemptions, which disapply similar principles. To remove the ability to apply the national security exemption to any of the data protection principles for law enforcement agencies only would undermine their ability to offer the same protections as those processing under the other data protection regimes.
Not all the principles laid out in Chapter 2 can be exempted from; for example, law enforcement agencies are still required to ensure that all processing is lawful and cannot exempt from the safeguards that apply to sensitive data. There are safeguards in place to ensure that the exemption is used correctly by law enforcement agencies. Where a data subject feels that the national security exemption has not been applied correctly, the legislation allows them to complain to the Information Commissioner and, ultimately, to the courts. Additionally, the reforms require law enforcement agencies to appoint a senior responsible individual whose tasks include monitoring compliance with the legislation.
Amendment 135C would make it a mandatory requirement for a certificate to be sought from and approved by a judicial commissioner whenever the national security exemption is to be invoked by law enforcement agencies only. This bureaucratic process does not apply to organisations processing under the other data protection regimes; forcing law enforcement agencies to apply for a certificate every time they need to apply the exemption would be unworkable as it would remove their ability to act quickly in relation to matters of national security. For these reasons, I hope that the noble Lord, Lord Clement-Jones, will not press his amendments.
On Clauses 29 and 30 of the Bill, currently, only the intelligence services can operate under Part 4 of the Data Protection Act. This means that, even when working together, the intelligence services and law enforcement cannot work on a single shared dataset but must instead transfer data back and forth, applying the provisions of their applicable data protection regimes, which creates significant friction. Removing barriers to joint working was flagged as a recommendation following the Manchester Arena inquiry, as was noted by the noble Lord, Lord Anderson, and following Fishmongers’ Hall, which also recommended closer working.
Clauses 29 and 30 enable qualifying competent authorities and an intelligence service jointly to process data under a single data protection regime in authorised, specific circumstances to safeguard national security. In order to jointly process data in this manner, the Secretary of State must issue a designation notice to authorise it. A notice can be granted only if the Secretary of State is satisfied that the processing is required for the purpose of safeguarding national security and following consultation with the ICO.
Amendment 135E would make the ICO the final arbiter of whether a designation notice is granted by requiring it to—
May I just intrude on the Minister’s flow? As I understand it, there is a possibility that relatives of the families affected by the Manchester Arena bombing will take to court matters relating to the operation of the security services, including relating to intelligence that it is felt they may have had prior to the bombing. How will this new regime, as set out in the Bill, affect the rights of those who may seek to hold the security services to account in the courts? Will their legal advisers ever be able to discover materials that might otherwise be exempt from public view?
That is a very good question but the noble Lord will understand that I am somewhat reluctant to pontificate about a potential forthcoming court case. I cannot really answer the question, I am afraid.
But understanding the impact on people’s rights is important in the context of this legislation.
As I say, it is a good question but I cannot comment further on that one. I will see whether there is anything that we can commit to in writing and have a further chat about this subject but I will leave it for now, if I may.
Amendment 135E would make the ICO the final arbiter of whether a designation notice is granted by requiring it to judge whether the notice is required for the purposes of the safeguarding of national security. It would be wholly inappropriate for the ICO to act as a judge of national security; that is not a function of the ICO in its capacity as regulator and should be reserved to the Secretary of State. As is generally the case with decisions by public bodies, the decision of the Secretary of State to grant a designation notice can be challenged legally; this is expressly provided for under new Section 82E, as is proposed to be included in the DPA by Clause 29.
On the subject of how a data subject is supposed to exercise their rights if they do not know that their data is being processed under a notice subject to Part 4, the ICO will publish designation notices as soon as is reasonably practicable. Privacy information notices will also be updated if necessary to enable data subjects to identify a single point of contact should they wish to exercise their rights in relation to data that might be processed under a designation notice. This single point of contact will ease the process of exercising their data rights.
The noble Lord, Lord Anderson, asked which law enforcement agencies this will apply to. That will be set out separately in the subsequent affirmative SI. I cannot be more precise than that at the moment.
For these reasons, I hope that the noble Lord, Lord Clement-Jones, will be prepared to withdraw his amendment.
The Minister left us on a tantalising note. He was unable to say whether the law enforcement organisations affected by these clauses will be limited to Counter Terrorism Policing and the NCA or whether they will include others as well. I am rather at a loss to think who else might be included. Do we really have to wait for the affirmative regulations before we can be told about that? It seems pretty important. As the Minister knows well, there are quite a few precedents—including some recent ones—for extending to those bodies some of the privileges and powers that attach to the intelligence agencies. I suspect that a number of noble Lords might be quite alarmed if they felt that those powers or privileges were being extended more widely—certainly without knowing, or at least having some idea, in advance to whom they might be extended.
While I am on my feet and causing mischief for the Minister, may I return to the rather lawyerly question that I put to him? I do not think I had an answer about the formulation in new Section 78A, which talks about an exemption applying
“if exemption from the provision is required for the purposes of safeguarding national security”.
What does “required” mean? Does it simply mean the same as “necessary”—in which case, why not stick with that? Or does it mean something else? Does it mean that someone has required or requested it? It could be a pretty significant difference and this is a pretty significant ambiguity in the Bill. If the Minister is not willing to explain it now, perhaps he will feel able to write to us to explain exactly what is meant by replacing the well-worn phrase “necessary and proportionate” with “required”.
I thank the noble Lord for that. It is a lawyerly question and, as he knows, I am not a lawyer. With respect, I will endeavour to write and clarify on that point, as well as on his other good point about the sorts of authorities that we are talking about.
Perhaps the same correspondence could cover the point I raised as well.
My Lords, I am immensely grateful to the noble Lords, Lord Anderson and Lord Bassam, for their interventions. In particular, given his background, if the noble Lord, Lord Anderson, has concerns about these clauses, we all ought to have concerns. I am grateful to the Minister for the extent of his unpacking—or attempted unpacking—of these clauses but I feel that we are on a slippery slope here. I feel some considerable unease about the widening of the disapplication of principles that we were assured were immutable only six years ago. I am worried about that.
We have had some reassurance about the right to transparency, though perhaps data subjects will find out what is happening only when it is convenient. The right to challenge was also mentioned by the Minister but he has not really answered the question about whether the Home Office has looked seriously at the implications as far as the human rights convention is concerned, which is the reason for the stand part notice. The Minister did not address that matter at all; I do not know why. I am assuming that the Home Office has looked at the clauses in the light of the convention but, again, he did not talk about that.
The only assurance the Minister has really given is that it is all on a case-by-case basis. I do not think that that is much of a reassurance. On the proportionality point made by the noble Lord, Lord Anderson, I think that we are going to be agog awaiting the Minister’s correspondence on that, but it is such a basic issue. There were two amendments specifically on proportionality, but we have not really had a reply on why it should have been eliminated by the legislation. So a feeling of unease prevails. I do not even feel that the Minister has unpacked fully the issue of joint working; I think that the noble Lord, Lord Anderson, did that more. We need to know more about how that will operate.
The final point that the Minister made gave even greater concern—to think that there will be an SI setting out the bodies that will have the powers. We are probably slightly wiser than when we started out with this group of amendments, but only slightly and we are considerably more concerned. In the meantime, I beg leave to withdraw the amendment.
My Lords, I shall speak to Amendment 137 in my name. I apologise to the Committee that I was unable to speak in the Second Reading debate on this Bill, which seems a long time ago now.
This is a discrete amendment designed to address an extremely burdensome and potentially unnecessary redaction exercise in relation to a situation where the police are preparing a case file for submission to the Crown Prosecution Service for a charging decision. The amendment was originally tabled in the House of Commons by Jane Hunt MP; both of us would like to thank the Police Federation of England and Wales for its assistance in briefing us in preparing the draft clause.
Perhaps it would be helpful to say by way of background that the existing data protection legislation requires our police forces to spend huge amounts of time and resources, first, in going through the information that has been gathered by investigating officers to identify every single item of personal data contained in that information; secondly, in deciding whether it is necessary or, in many cases, strictly necessary for the CPS to consider each item of personal data when making a charging decision; and, thirdly, in redacting every item of personal data that does not meet this test. I ask noble Lords to imagine, with things such as body cams being worn by the police, how much personal data is being collected these days every time officers respond to incidents. The Police Federation and the National Police Chiefs’ Council estimate that the national cost of this redaction exercise is approximately £5,642,900 per annum and that, since 1 January 2021, 365,000 policing hours have been consumed by this redaction exercise.
In his Budget last month, the Chancellor of the Exchequer asked for ideas to improve public sector productivity, so it will come as no surprise to the Minister that the Police Federation has rushed to submit this idea as one of those suggestions for how we might solve that productivity puzzle. I want to share one example of what this redaction requirement means in practice. This came from a detective constable in Suffolk who was attached to a regional crime unit. They said that the case they were involved with was
“a multi-million pound fraud offence from Suffolk with 115 victims. After a five year investigation two persons were charged (in Oct 2023) however, these charges would have been brought far sooner had the CPS not insisted that all used and unused material in the case be provided and redacted prior to the actual charges being brought. The redactions took six months to complete and at times both officers and civilian staff were deployed full time to accommodate”
this exercise. Due to the nature of the investigation, the victims in this case were elderly and some had, sadly, passed away over the years.
While the detective constable accepted that the investigation itself was lengthy, they
“were able to manage the expectations of the victims by providing routine updates on the progress of the case”.
However:
“It was more difficult to explain come early 2023 that documents in the case then had to be redacted before the CPS would allow us to charge the suspects. The fact that documents of varying sizes (some several pages in length) of the unused material had to be redacted prior to charge, when these documents may or not be served and ultimately would be served secondary to the used items is difficult to understand for the officers let alone explaining this to victims who are losing interest and respect for both the Police and CPS. Anyone would question why we were spending time redacting documents that MAY NEVER be served. It is … easy to say redact everything! In turn the additional months redacting affected the court process, delaying that also. Victims are questioning whether they will be alive to see”
the conclusion of the process. While the delay was
“not solely down to the redaction demands a more targeted redaction process after charge is more logical and cost effective for all”.
The redaction exercise is potentially unnecessary in the case of any given case file because the CPS decides to charge in only approximately 75% of cases. In the 25% of cases where the CPS decides not to charge, the unredacted file could simply be deleted by the CPS. Where the CPS decides to charge, the case file could then be returned to the police to carry out the redaction exercise before there is any risk of the file being disclosed to any person or body other than the CPS.
The simple and practical solution, as the Police Federation has put forward, is for the police to carry out the redaction exercise in relation to any given case file only after the CPS has taken the decision to charge. I should be clear that what is being proposed here does not remove any substantive protection of the personal data in question. It does not remove the obligation to review and redact the personal data contained in material in a case file; it simply provides for that review and redaction to be conducted by the police after, rather than before, a charging decision has been made by the CPS.
The law enforcement directive on which the relevant part of the Data Protection Act 2018 was based would have permitted this when that Act was passed. Part 3 of the 2018 Act implemented that directive and makes provision for data processing by “competent authorities”, including police forces and the Crown Prosecution Service, for defined “law enforcement purposes”. However, although recital 4 to the law enforcement directive emphasised:
“The free flow of personal data between competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences … should be facilitated while ensuring a high level of protection of personal data”,
Part 3 of the 2018 Act contains no provisions at all to facilitate the free flow of personal data between the police and the CPS.
The effect of the proposed new clause as set out in this amendment would be, first, to exempt the police from complying with the first data protection principle—except in so far as that principle requires processing to be fair—and from the third data protection principle, when the police are undertaking processing that consists of preparing for submission and submitting to the CPS a case file seeking a charging decision. Secondly, the amendment would exempt the CPS from the first and third data protection principles to the same extent when it makes that charging decision. Thirdly, it would require the CPS to return the case file to the police if a decision to charge is made, after which the data protection principles would apply in full to any subsequent processing.
I appreciate—particularly with the Minister present—that the Home Office is really in the driving seat here. We understand that the Home Office’s objections to this amendment seem to boil down to the belief that it would only partially resolve the problem, because the legal requirements around the sharing of data are broader than just the first and third data protection principles, and that there are other relevant provisions not addressed by this drafting. It is of course open to the Minister and the Home Office to say that they support the broad principles of this draft clause while suggesting that the drafting of this particular amendment should identify some other relevant provisions, and it would be helpful if they did that rather than simply objecting to the whole amendment as put forward.
My Lords, the noble Baroness, Lady Morgan, has done us a service by raising this issue. My question is about whether the advice given to date about redaction is accurate. I have not seen the Home Office’s guidance or counsel’s analysis. I have taken advice on the Police Federation’s case—I received an email and I was very interested in what it had to say, because we all want to make sure that the bureaucracy involved in charging and dealing with the CPS is as minimal as possible within the bounds of data protection law.
Section 35(2)(b) of the Data Protection Act simply requires the police to ensure that their processing is necessary for the performance of their tasks. You would have thought that sending an investigation file to the CPS so that it can decide whether to charge a suspect is necessary for the performance of that task. Some of that personal data may end up not being relevant to the charge or any trial, but that is a judgment for the CPS and the prosecutor. It does not mean, in the view of those I have consulted, that the file has to be redacted at vast taxpayer cost before the CPS or prosecutor has had a chance to see the investigation file. When you look at sensitive data, the test is “strictly necessary”, which is a higher test, but surely the answer to that must be that officers should collect this information only where they consider it relevant to the case. So this can be dealt with through protocols about data protection, which ensure that officers do not collect more sensitive data than is necessary for the purposes of the investigation.
Similarly, under Section 37, the question that the personal data must be adequate, relevant and not excessive in relation to the purpose for which it is processed should not be interpreted in such a way that this redaction exercise is required. If an officer thinks they need to collect the relevant information for the purpose of the investigation, that seems to me—and to those advising me—in broad terms to be sufficient to comply with the principle. Conversely, if officers are collecting too much data, the answer is that they should be trained to avoid doing this. If officers really are collecting more information than they should be, redactions cannot remedy the fact that the collection was unlawful in the first place. The solution seems to be to stop them collecting that data.
I assume—maybe I am completely wrong—that the Minister will utter “suitable guidance” in response to the noble Baroness’s amendment and say that there is no need to amend the legislation, but, if there is no need to do so, I hope that the Government revise the guidance, because the Police Federation and its members are clearly labouring under a misapprehension about the way the Act should be interpreted. It would be quite a serious matter if that has been the case for the last six years.
My Lords, we should be very grateful to the noble Baroness, Lady Morgan of Cotes, for her amendment. I listened very carefully to her line of argument and find much that we can support in the approach. In that context, we should also thank the Police Federation of England and Wales for a particularly useful and enlightening briefing paper.
We may well be suffering under the law of unintended consequences in this context; it seems to have hit quite hard and acted as a barrier to the sensible processing and transfer of data between two parts of the law enforcement machinery. It is quite interesting coming off the back of the previous debate, when we were discussing making the transfer of information and intelligence between different agencies easier and having a common approach. It is a very relevant discussion to have.
I do not think that the legislation, when it was originally drafted, could ever have been intended to work in the way the Police Federation has set out. The implementation of the Data Protection Act 2018, in so far as law enforcement agencies are concerned, is supposed to be guided by recital 4, which the noble Baroness read into the record and which makes good sense.
As the noble Baroness explained, the Police Federation’s argument that the DPA makes no provisions at all that are designed to facilitate, in effect, the free flow of information, that it should be able to hold all the relevant data prior to the charging decision being made by the CPS, and that redaction should take place only after a decision on charging has been made seems quite a sensible approach. As she argued, it would significantly lighten the burden on police investigating teams and enable the decision on charging to be more broadly informed.
So this is a piece of simplification that we can all support. The case has been made very well. If it helps speed up charging and policing processes, which I know the Government are very concerned about, as all Governments should be, it seems a sensible move—but this is the Home Office. We do not always expect the most sensible things to be delivered by that department, but we hope that they are.
I thank all noble Lords for their contributions—I think. I thank my noble friend Lady Morgan of Cotes for her amendment and for raising what is an important issue. Amendment 137 seeks to permit the police and the Crown Prosecution Service to share unredacted data with one another when making a charging decision. Perhaps to the surprise of the noble Lord, Lord Bassam, we agree: we must reduce the burden of redaction on the police. As my noble friend noted, this is very substantial and costly.
We welcome the intent of the amendment. However, as my noble friend has noted, we do not believe that, as drafted, it would achieve the stated aim. Fully removing the redaction burden would require amending more than just the Data Protection Act.
The Government are committed to reducing the burden on the police, but it is important that we get it right and that the solution is comprehensive. We consider that the objective my noble friend is seeking would be better achieved through other means, including improved technology and new, simplified guidance to prevent over-redaction, as all speakers, including the noble Lord, Lord Clement-Jones, noted.
The Home Office provided £960,000 of funding for text and audio-visual multimedia redaction in the 2023-24 financial year. Thanks to that funding, police forces have been able to procure automated text redaction tools, the trials of which have demonstrated that they could save up to 80% of the time spent by the police on this redaction. Furthermore, in the latest Budget, the Chancellor announced an additional £230 million of funding for technology to boost police productivity. This will be used to develop, test and roll out automated audio-visual redaction tools, saving thousands more hours of police time. I would say to my noble friend that, as the technology improves, we hope that the need for it to be supervised by individuals will diminish.
I can also tell your Lordships’ House that officials from the Home Office have consulted with the Information Commissioner’s Office and have agreed that a significant proportion of the burden caused by existing pre-charge redaction processes could be reduced safely and lawfully within the current data protection framework in a way that will maintain standards and protections for individuals. We are, therefore, actively working to tackle this issue in the most appropriate way by exploring how we can significantly reduce the redaction burden at the pre-charge stage through process change within the existing legislative framework. This will involve creating simplified guidance and, obviously, the use of better technology.
Is the Minister almost agreeing with some of my analysis in that case?
No, I think I was agreeing with my noble friend’s analysis.
I thank all noble Lords for their contributions. We acknowledge this particular problem and we are working to fix it. I would ask my noble friend to withdraw her amendment.
My Lords, I thank my noble friend the Minister for his response. I also thank the noble Lords, Lord Clement-Jones and Lord Bassam, for their support. I hope that those watching from outside will be heartened by what they have heard. I think there is general agreement that this problem should be simplified, and the burden taken off policing.
I am interested to hear about redaction but, with bodycams and images, as well as the massive amount of data on items such as mobile phones, it is complicated. My noble friend the Minister mentioned that the Home Office and the Information Commissioner’s Office were consulting with each other to reduce this pre-charge redaction burden. Perhaps he could write to me, or we could have a meeting to work it out. The challenge in all this is that we have a debate in which everybody agrees and then it all slows down again. Perhaps we can keep the momentum going by continuing discussions outside, involving the Police Federation as well. For now, I beg leave to withdraw the amendment.
My Lords, I will speak also to Amendment 140 and the submissions that Clauses 32 to 35 should not stand part. These amendments are designed to clarify the statutory objective of the new information commission; increase its arm’s-length relationship with the Government; allow effective judicial scrutiny of its regulatory function; allow not-for-profit organisations to lodge representative complaints; retain the Office of the Biometrics and Surveillance Camera Commissioner; and empower the Equality and Human Rights Commission to scrutinise the new information commission. The effective supervision and enforcement of data protection and the investigation and detection of offenders are crucial to achieve deterrence, prevent violations, maintain transparency and control options for redress against data misuse.
My Lords, I will speak to Amendments 142, 143 and 150 in my name, and I thank other noble Lords for their support.
We have spent considerable time across the digital Bills—the online safety, digital markets and data Bills—talking about the speed at which industry moves and the corresponding need for a more agile regulatory system. Sadly, we have not really got to the root of what that might look like. In the meantime, we have to make sure that regulators and Governments are asked to fulfil their duties in a timely manner.
Amendment 142 puts a timeframe on the creation of codes under the Act at 18 months. Data protection is a mature area of regulatory oversight, and 18 months is a long time for people to wait for the benefits that accrue to them under legislation. Similarly, Amendment 143 ensures that the transition period from the code being set to it being implemented is no more than 12 months. Together, that allows a total of up to two and a half years. In future legislation on digital matters, I would like to see a very different approach that starts with the outcome and gives companies 12 months to comply, in any way they like, to ensure that outcome. But while we remain in the world of statutory code creation, it must be bound by a timeframe.
I have seen time and again, after the passage of a Bill, Parliament and civil society move on, including Ministers and key officials—as well as those who work at the regulator—and codes lose their champions. It would be wonderful to imagine that matters progress as intended, but they do not. In the absence of champions, and without ongoing parliamentary scrutiny, codes can languish in the inboxes of people who have many calls on their time. Amendments 142 and 143 simply mirror what the Government agreed to in the OSA—it is a piece of good housekeeping to ensure continuity of attention.
I am conscious that I have spent most of my time highlighting areas where the Bill falls short, so I will take a moment to welcome the reporting provisions that the Government have put forward. Transparency is a critical aspect of effective oversight, and the introduction of an annual report on regulatory action would be a valuable source of information for all stakeholders with an interest in understanding the work of the ICO and its impact.
Amendment 150 proposes that those reporting obligations also include a requirement to provide details of all activities carried out by the Information Commissioner to support, strengthen and uphold the age-appropriate design code. It also proposes that, when meeting its general reporting obligations, the ICO should provide the information separately for children. The ICO published an evaluation of the AADC as a one-off in March 2023 and its code strategy on 3 April this year. I recognise the effort that the commissioner has made towards transparency, and the timing of his report indicates that having reporting on children specifically is something that the ICO sees as relevant and useful. However, neither of those is sufficient in terms of the level of detail provided, the reporting cadence or the focus on impact rather than the efforts that the ICO has made.
There are many frustrations for those of us who spend our time advocating for children’s privacy and safety. Among them is having to try to extrapolate child-specific data from generalised reporting. When data is not reported separately, that is usually to hide inadequacies in the level of protection afforded to children. For example, none of the community guidelines enforcement reports published for Instagram, YouTube, TikTok or Snap provides a breakdown of the violation rate data by age group, even though this would provide valuable information for academics, Governments, legislators and NGOs. Amendment 150 would go some way to addressing this gap by ensuring that the ICO is required to break down its reporting for children.
Having been momentarily positive, I would like to put on the record my concerns about the following extract from the email that accompanied the ICO’s children’s code strategy of 2 April. Having set out the very major changes to companies that the code has ushered in and explained how the Information Commissioner would spend the next few months looking at default settings, geolocation, profiling, targeting children and protecting under-13s, the email goes on to say:
“With the ongoing passage of the bill, our strategy deliberately focusses in the near term on compliance with the current code. However, once we have more clarity on the final version of the bill we will of course look to publicly signal intentions about our work on implementation and children’s privacy into the rest of the year and beyond”.
The use of the phrase “current code”, and the fact that the ICO has decided it is necessary to put its long-term enforcement strategy on hold, contradict government assurances that standards will remain the same.
The email from the ICO arrived in my inbox on the same day as a report from the US Institute of Digital Media and Child Development, which was accompanied by an impact assessment on the UK’s age-appropriate design code. It stated:
“The Institute’s review identifies an unprecedented wave of … changes made across leading social media and digital platforms, including YouTube, TikTok, Snapchat, Instagram, Amazon Marketplace, and Google Search. The changes, aimed at fostering a safer, more secure, and age-appropriate online environment, underscore the crucial role of regulation in improving the digital landscape for children and teens”.
In June, the Digital Futures Commission will be publishing a similar report written by the ex-Deputy Information Commissioner, Steve Wood, which has similarly positive but much more detailed findings. Meanwhile, we hear the steady drumbeat of adoption of the code in South America, Australia and Asia, and in additional US states following California’s lead. Experts in both the US and here in the UK evidence that this is a regulation that works to make digital services safer and better for children.
I therefore have to ask the Minister once again why the Government are downgrading child protection. If he, or those in the Box advising him, are even slightly tempted to say that they are not, I ask that they reread the debates from the last two days in Committee, in which the Government removed the balancing test for automated decision-making and the Secretary of State’s powers were changed to have regard to children rather than to mandate child protections. The data impact assessment provisions have also been downgraded, among the other sleights of hand that diminish the AADC.
The ICO has gone on record to say that it has put its medium to long-term enforcement strategy on hold, and the Minister’s letter sent on the last day before recess says that the AADC will be updated to reflect the Bill. I would like nothing more than a proposal from the Government to put the AADC back on a firm footing. I echo the words said earlier by the noble Baroness, Lady Jones, that it is time to start talking and stop writing. I am afraid that, otherwise, I will be tabling amendments on Report that will test the appetite of the House for protecting children online. In the meantime, I hope the Minister will welcome and accept the very modest proposals in this group.
My Lords, as is so often the case on this subject, I support the noble Baroness, Lady Kidron, and the three amendments that I have added my name to: Amendments 142, 143 and 150. I will speak first to Amendments 142 and 143, and highlight a couple of issues that the noble Baroness, Lady Kidron, has already covered.
My Lords, I am grateful to the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, for tabling these amendments and raising important points about the Information Commissioner’s independence and authority to carry out his role efficiently. The amendments from the noble Lord, Lord Clement-Jones, range widely, and I have to say that I have more sympathy with some of them than others.
I start by welcoming some of the things in the Bill—I am very pleased to be able to do this. It is important that we have an independent regulator that is properly accountable to Parliament, and this is vital for a properly functioning data protection regime. We welcome a number of the changes that have been made to the ICO’s role in the Bill. In particular, we think the move to have a board and a chief executive model, with His Majesty appointing the chair of the board, is the right way to go. We also welcome the strengthening of enforcement powers and the obligation to establish stakeholder panels to inform the content of codes of practice. The noble Baroness, Lady Kidron, also highlighted that.
However, we share the concern of the noble Lord, Lord Clement-Jones, about the Secretary of State’s requirement every three years to publish a statement of strategic priorities for the commissioner to consider, respond to and have regard to. We share his view, and that of many stakeholder groups, that this crosses the line into political involvement and exposes the ICO to unwarranted political direction and manipulation. We do not believe that this wording provides sufficient safeguards from that in its current form.
I have listened carefully to the explanation of the noble Lord, Lord Clement-Jones, of Amendment 138. I understand his concern, but we are going in a slightly different direction to him on this. We believe that the reality is that the ICO does not have the resources to investigate every complaint. The commissioner needs to apply a degree of strategic prioritisation in the public interest. I think that the original wording in the Bill, rather than the noble Lord’s amendment, achieves that objective more clearly.
Amendment 140, in the name of the noble Lord, Lord Clement-Jones, raises a significant point about businesses being given assured advice to ensure that they follow the procedures correctly, and we welcome that proposal. There is a role for leadership of the ICO in this regard. His proposal also addresses the Government’s concern that data controllers struggle to understand how they should be applying the rules. This is one of the reasons for many of the changes that we have considered up until now. I hope that the Minister will look favourably on this proposal and agree that we need to give more support to businesses in how they follow the procedures.
Finally, I have added my name to the amendment of the noble Baroness, Lady Kidron, which rightly puts a deadline on the production of any new codes of practice, and a deadline on the application of any transitional arrangements which apply in the meantime. We have started using the analogy of the codes losing their champions, and in general terms she is right. A deadline is therefore useful and important to ensure delivery. This seems eminently sensible, and I hope the Minister agrees with it too.
Amendment 150 from the noble Baroness, Lady Kidron, also requires the ICO annual report to spell out specifically the steps being taken to roll out the age-appropriate design code and to uphold children’s data rights. Going back to the codes losing their champions, I am sure that the Minister got the message from the noble Baronesses, Lady Kidron and Lady Harding, that in this particular case this is not going to happen, and that this code and the drive to deliver it will be with us for some time to come.
The noble Baroness, Lady Kidron, raised concerns about the approach of the ICO, which need to be addressed. We do not want a short-term approach but a longer-term approach, and we want some guarantees that the ICO is going to address some of the bigger issues that are being raised by the age-appropriate design code and other codes. Given the huge interest in the application of children’s data rights in this and other Bills, I am sure that the Information Commissioner will want to focus his report on his achievements in this space. Nevertheless, for the avoidance of doubt, it is useful to have it in the Bill as a specific obligation, and I hope the Minister agrees with the proposal.
We have a patchwork of amendments here. I am strongly in support of some; on others, perhaps the noble Lord and I can debate further outside this Room. In the meantime, I am interested to hear what the Minister has to say.
I thank the noble Lord, Lord Clement-Jones, the noble Baroness, Lady Kidron, and other noble Lords who have tabled and signed amendments in this group. I also observe what a pleasure it is to be on a Committee with Batman and Robin—which I was not expecting to say, and which may be Hansard’s first mention of those two.
The reforms to the Information Commissioner’s Office within the Bill introduce a strategic framework of objectives and duties to provide context and clarity on the commissioner’s overarching objectives. The reforms also put best regulatory practice on to a statutory footing and bring the ICO’s responsibilities into line with those of other regulators.
With regard to Amendment 138, the principal objective upholds data protection in an outcomes-focused manner that highlights the discretion of the Information Commissioner in securing that objective, while reinforcing the primacy of data protection. The requirement to promote trust and confidence in the use of data will encourage innovation across current and emerging technologies.
I turn now to the question of Clause 32 standing part. As part of our further reforms, the Secretary of State can prepare a statement of strategic priorities for data protection, which positions these aims within the Government’s wider policy agenda, thereby giving the commissioner helpful context for their activities. While the commissioner must take the statement into account when carrying out functions, they are not required to act in accordance with it. This means that the statement cannot be used to direct what the commissioner may and may not do when carrying out their functions.
Turning to Amendment 140, we believe that the commissioner should have full discretion to enforce data protection in an independent, flexible, risk-based and proportionate manner. This amendment would tie the hands of the regulator and force them to give binding advice and proactive assurance without necessarily having full knowledge of the facts, undermining their regulatory enforcement role.
In response to the amendments concerning Clauses 33 to 35 standing part, I can say that we are introducing a series of measures to increase accountability, robustness and transparency in the codes of practice process, while safeguarding the Information Commissioner’s role. The requirements for impact assessments and panels of experts mean that the codes will consider the application to, and impact on, all potential use cases. Given that the codes will have the force of law, the Secretary of State must have the ability to give her or his comments. The Information Commissioner is required to consider but not to act on those comments, preserving the commissioner’s independence. It remains for Parliament to give approval for any statutory code produced.
Amendments 142 and 143 would impose a requirement on the ICO to prepare codes, and on the Secretary of State to lay them before Parliament, as quickly as practicable. They would also limit the time that transitional provisions can be in place to a maximum of 12 months. This could mean that drafting processes are truncated or valid concerns overlooked to hit a statutory deadline, rather than the codes being considered properly to reflect the relevant perspectives.
Given the importance of ensuring that any new codes are robust, comprehensive and considered, we do not consider imposing time limits on the production of codes to be a useful tool.
Finally, Amendment 150—
We had this debate during the passage of the Online Safety Act. In the end, we all agreed—the House, including the Government, came to the view—that two and a half years, which is 18 months plus a transition period, was an almost egregious amount of time considering the rate at which the digital world moves. So, to consider that more than two and a half years might be required seems a little bit strange.
I absolutely recognise the need for speed, and my noble friend Lady Harding made this point very powerfully as well, but what we are trying to do is juggle that need with the need to go through the process properly to design these things well. Let me take it away and think about it more, to make sure that we have the right balancing point. I very much see the need; it is a question of the machinery that produces the right outcome in the right timing.
Before the Minister sits down, I would very much welcome a meeting, as the noble Baroness, Lady Harding, suggested. I do not think it is useful for me to keep standing up and saying, “You are watering down the code”, and for the Minister to stand up and say, “Oh no, we’re not”. We are not in panto here, we are in Parliament, and it would be a fantastic use of all our time to sit down and work it out. I would like to believe that the Government are committed to data protection for children, because they have brought forward important legislation in this area. I would also like to believe that the Government are proud of a piece of legislation that has spread so far and wide—and been so impactful—and that they would not want to undermine it. On that basis, I ask the Minister to accede to the noble Baroness’s request.
I am very happy to try to find a way forward on this. Let me think about how best to take this forward.
My Lords, I thank the Minister for his response and, in particular, for that exchange. There is a bit of a contrast here—the mood of the Committee is probably to go with the grain of these clauses and to see whether they can be improved, rather than throw out the idea of an information commission and revert to the ICO on the basis that perhaps the information commission is a more logical way of setting up a regulator. I am not sure that I personally agree, but I understand the reservations of the noble Baroness, Lady Jones, and I welcome her support on the aspect of the Secretary of State power.
We keep being reassured by the Minister, in all sorts of different ways. I am sure that the spirit is willing, but whether it is all in black and white is the big question. Where are the real safeguards? The proposals in this group from the noble Baroness, Lady Kidron, to which she has spoken so well, along with the noble Baroness, Lady Harding, are very modest, to use the phrase from the noble Baroness, Lady Kidron. I hope those discussions will take place because they fit entirely with the architecture of the Bill, which the Government have set out, and it would be a huge reassurance to those who believe that the Bill is watering down data subject rights and is not strengthening children’s rights.
I am less reassured by other aspects of what the Minister had to say, particularly about the Secretary of State’s powers in relation to the codes. As the noble Baroness, Lady Kidron, said, we had a lot of discussion about that in relation to the Ofcom codes, under the Online Safety Bill, and I do not think we got very far on that either. Nevertheless, there is disquiet about whether the Secretary of State should have those powers. The Minister said that the ICO is not required to act in accordance with the advice of the Secretary of State so perhaps the Minister has provided a chink of light. In the meantime, I beg leave to withdraw the amendment.
My Lords, Amendment 146 is in my name and those of the noble Lord, Lord Clement-Jones, and the noble Baronesses, Lady Harding and Lady Jones; I thank them all for their support. Before I set out the amendment that would provide a code of practice for edtech and why it is so urgently required, I thank the noble Baroness, Lady Barran, and officials in the Department for Education for their engagement on this issue. I hope the Minister can approach this issue with the same desire they have shown to fill the gap that it seeks to address.
A child does not have a choice about whether they go to school. For those who do not fall into the minority who are homeschooled or who, for reasons of health or development, fall outside the education system, it is compulsory. The reason I make this point at the outset is that, if school is compulsory, it must follow that a child should enjoy the same level of privacy and safety at school as they do in any other environment. Yet we have allowed a gap in our data legislation, meaning that a child’s data is unprotected at school, and, at the same time, we have invested in an unregulated and uncertified edtech market whose promises of learning outcomes range from the unsubstantiated to the false.
Schools are keen to adopt new technologies and say that they feel pressure to do so. In both cases, they lack the knowledge and time to assess the privacy and safety risks of the technology products that they are being sold. Amendment 146 would enable children and schools to benefit from emerging technologies. It would reduce the burden on schools in ensuring compliance so that they can get on with the job of teaching our children in a safe, developmentally appropriate and rights-respecting environment, and it would deal with companies that fail to provide evidence for their products and routinely exploit the complexity of data protection law to children’s detriment. In sum, the amendment brings forward a code of practice for edtech.
Subsections (1) and (2) would require the ICO to bring forward a data code for edtech and tech used in education settings. In doing so, the commissioner would be required to consider children’s fundamental rights, as set out in the Convention on the Rights of the Child, and their relevance to the digital world, as adopted by the Committee on the Rights of the Child in general comment 25 in 2021. The commissioner would have to consider the fact that children are legally entitled to a higher standard of protection in respect of their personal data than adults. In keeping with other data codes, the amendment also sets out whom the ICO must consult when preparing the code, including children, parents and teachers, as well as edtech companies.
Subsection (3) would require edtech companies to provide schools with transparent information about their data-processing practices and their impact on children. This is of particular importance because the department’s own consultation showed that schools are struggling to understand the implications of being a data controller and most often accept the default settings of products and services. Having a code of practice would allow the Information Commissioner not only to set the standards in subsections (1) and (2) but to insist on the way that information is given, in order to support schools to make the right choices for their pupils.
Subsection (4) would allow schools to use edtech providers’ adherence to the code as proof of fulfilling their own data protection duties. Once again, this would alleviate the burden on teachers and school leaders.
Subsection (5) would simply give the commissioner a role in supporting a certification scheme to enable the industry to demonstrate both the compliance of edtech services and products with the UK GDPR and their conformity with the age-appropriate design code of practice and the edtech code of practice. The IEEE Standards Association and For Humanity have published certification standards for the AADC, but these have not yet been approved by the ICO or UKAS. Subsection (5) would act as a catalyst, ensuring that the ICO and the certification partners work together efficiently. Ultimately, schools will respond better to certification than to pure data law.
If the edtech sector were formally in scope of the AADC and the code were robustly applied, that would do some, though not all, of what the amendment seeks to do. But in 2018, Her Majesty’s Government, as they were then, made the decision that schools are responsible for children and that the AADC would be confusing. I am not sure that the Government of the day understood the AADC. It requires companies to offer children privacy by design and default. Nothing in the code would have infringed—or will infringe—on a school’s safeguarding duties, but leaving schools out of scope leaves teachers or school data protection officers with vast responsibilities for wilfully leaky products that simply should not fall to them. Many in this House thought that the Government were wrong, and since then we have seen grand abuse of the gap that was created. This is an opportunity to put that error right.
My Lords, I rise once again in my Robin role to support the noble Baroness, Lady Kidron, on this amendment. We had a debate on 23 November last year, brought by the noble Baroness, on this very issue of edtech. Rather than repeat all the points that were made in that very useful debate, I point my noble friend the Minister to it.
I would just like to highlight a couple of quick points. First, in supporting this amendment, I am not anti-edtech in any way, shape or form. It is absolutely clear that technology can bring huge benefits to students of all ages but it is also clear that education is not unique. It is exactly like every other part of society: where technology brings benefit, it also brings substantial risk. We are learning the hard way that thinking that any element of society can mitigate the risks of technology without legal guard-rails is a mistake.
We have seen really clearly with the age-appropriate design code that commercial organisations operating under its purview changed the way they protected children’s data as a result of that code. The absence of an equivalent code for the edtech sector means that we have not had those same benefits there. If we bring edtech into scope, either through this amendment or simply through extending the age-appropriate design code, I would hazard a strong guess that we would start to see very real improvements in the protection of children’s data.
In the debate on 23 November, I asked my noble friend the Minister, the noble Baroness, Lady Barran, why the age-appropriate design code did not include education. I am not an expert in education, by any stretch of the imagination. The answer I received was that it was okay because the keeping children safe in education framework covered edtech. Since that debate, I have had a chance to read that framework, and I cannot find a section in it that specifically addresses children’s data. There is lots of really important stuff in it, but there is no clearly signposted section in that regard. So even if all the work fell on schools, that framework on its own, as published on GOV.UK, does not seem to meet the standards of a framework for data protection for children in education. However, as the noble Baroness, Lady Kidron, said, this is not just about schools’ responsibility but the edtech companies’ responsibility, and it is clear that there is no section on that in the keeping children safe in education framework either.
The answer that we received last year in this House does not do justice to the real question: in the absence of a specific code—the age-appropriate design code or a specific edtech code—how can we be confident that there really are the guardrails, which we know we need to put in place in every sector, in this most precious and important sector, which is where we teach our children?
My Lords, I am absolutely delighted to be able to support this amendment. Like the noble Baroness, Lady Harding, I am not anti-edtech at all. I did not take part in the debate last year, but listening to the noble Baroness, Lady Kidron, and having read the excellent A Blueprint for Education Data from the 5Rights Foundation and the Digital Futures for Children brief in support of a code of practice for education technology, I submit that it is chilling to hear what is happening with edtech as we speak: the extraction of data and the failure to comply properly with data protection.
I got involved some years ago with the advisory board of the Institute for Ethical AI in Education, which Sir Anthony Seldon set up with Professor Rose Luckin and Priya Lakhani. Our intention was slightly broader—it was designed to create a framework for the use of AI specifically in education. Of course, one of the very important elements was the use of data, and the safe use of data, both by those procuring AI systems and by those developing them and selling them into schools. That was in 2020 and 2021, and we have not moved nearly far enough since that time. Obviously, this is data specific, because we are talking about the data protection Bill, but what is being proposed here would cure some of the issues that are staring us in the face.
As we have been briefed by Digital Futures for Children, and as the noble Baroness, Lady Kidron, emphasised, there is widespread invasion of children’s privacy in data collection. Sometimes there is little evidence to support the claimed learning benefits, while schools and parents lack the technical and legal expertise to understand what data is collected. As has been emphasised throughout the passage of this Bill, children deserve the highest standards of privacy and data protection—especially in education, of course.
From this direction, I wholly support what the noble Baroness, Lady Kidron, is proposing, so well supported by the noble Baroness, Lady Harding. Given that it again appears that the Government gave an undertaking to bring forward a suitable code of practice but have not done so, there is double reason to want to move forward on this during the passage of the Bill. We very much support Amendment 146 on that basis.
My Lords, I have added my name to Amendment 146 in the name of the noble Baroness, Lady Kidron, and I thank all noble Lords who have spoken.
These days, most children learn to swipe an iPad long before they learn to ride a bike. They are accessing the internet at ever younger ages on a multitude of devices. Children are choosing to spend more time online, browsing social media, playing games and using apps. However, we also force children to spend an increasing amount of time online for their education. A growing trend over the last decade or more, this escalated during the pandemic. Screen time at home became lesson time; it was a vital educational lifeline for many in lockdown.
Like other noble Lords, I am not against edtech, but the reality is that the necessary speed of the transition meant that insufficient regard was paid to children’s rights and the data practices of edtech. The noble Baroness, Lady Kidron, as ever, has given us a catalogue of abuses of children’s data which have already taken place in schools, so there is a degree of urgency about this, and Amendment 146 seeks to rectify the situation.
One in five UK internet users are children. Schools are assessing their work online; teachers are using online resources and recording enormous amounts of sensitive data about every pupil. Edtech companies have identified that such a large and captive population is potentially profitable. This amendment reinforces that children are also a vulnerable population and that we must safeguard their data and personal information on this basis. Their rights should not be traded in as the edtech companies chase profits.
The code of practice proposed in this amendment establishes standards for companies to follow, in line with the fundamental rights and freedoms as set out in the UN Convention on the Rights of the Child. It asserts that children are entitled to a higher degree of protection than adults in the digital realm. It would oblige the commissioner to prepare a code of practice which ensures this. It underlines that consultation with individuals and organisations who have the best interests of children at heart is vital, so that the enormous edtech companies cannot bamboozle already overstretched teachers and school leaders.
In education, data has always been collected from children in school. It is necessary for the school’s functioning and to monitor the educational development of individual children. Edtech is now becoming a permanent fixture in children’s schooling and education, but it is largely untested, unregulated and unaccountable. Currently, it is impossible to know what data is collected by edtech providers and how they are using it. This blurs the boundaries between the privacy-preserving and commercial parts of services profiting from children’s data.
Why is this important? First, education data can reveal particularly sensitive and protected characteristics about children: their ethnicity, religion, disability or health status. Such data can also be used to create algorithms that profile children and predict or assess their academic ability and performance; it could reinforce prejudice, create siloed populations or entrench low expectations. Secondly, there is a risk that data-profiling children can lead to deterministic outcomes, defining too early what subjects a child is good at, how creative they are and what they are interested in. Safeguards must be put in place in relation to the processing of children’s personal data in schools to protect those fundamental rights. Thirdly, of course, is money. Data is appreciating in value, resulting in market pressure for data to be collected, processed, shared and reused. Increasingly, such data processed from children in schools is facilitated by edtech, an already major and expanding sector with a projected value of £3.4 billion.
The growth of edtech’s use in schools is promoted by the Department for Education’s edtech strategy, which sets out a vision for edtech to be an
“inseparable thread woven throughout the processes of teaching and learning”.
Yet the strategy gives little weight to data protection beyond noting the importance of preventing data breaches. Tech giants have become the biggest companies in the world because they own data on us. Schoolchildren have little choice as to their involvement with these companies in the classroom, so we have a moral duty to ensure that they are protected, not commodified or exploited, when learning. It must be a priority for the Government to keep emerging technologies in education under regular review.
Equally important is that the ICO should invest in expertise specific to the domain of education. By regularly reviewing emerging technologies—those already in use and those proposed for use—in education, and their potential risks and impacts, such experts could provide clear and timely guidance for schools to protect individual children and entire cohorts. Amendment 146 would introduce a new code of practice on the processing and use of children’s data by edtech providers. It would also ensure that edtech met their legal obligations under the law, protected children’s data and empowered schools.
I was pleased to hear that the noble Baroness, Lady Kidron, has had constructive discussions with the Education Minister, the noble Baroness, Lady Barran. The way forward on this matter is some sort of joint work between the two departments. The noble Baroness, Lady Kidron, said that she hopes the Minister today will respond with equal positivity; he could start by supporting the principles of this amendment. Beyond that, I hope that he will agree to liaise with the Department for Education and embrace the noble Baroness’s request for more meetings to discuss this issue on a joint basis.
I am grateful, as ever, to the noble Baroness, Lady Kidron, for both Amendment 146 and her continued work in championing the protection of children.
Let me start by saying that the Government strongly agree with the noble Baroness that all providers of edtech services must comply with the law when collecting and making decisions about the use of children’s data throughout the duration of their processing activities. That said, I respectfully submit that this amendment is not necessary, for the reasons I shall set out.
The ICO already has existing codes and guidance for children and has set out guidance about how the children’s code, data protection and e-privacy legislation apply to edtech providers. Although the Government recognise the value that ICO codes can have in promoting good practice and improving compliance, they do not consider that it would be appropriate to add these provisions to the Bill without further detailed consultation with the ICO and the organisations likely to be affected by them.
The guidance covers broad topics, including choosing a lawful basis for the processing; rules around information society services; targeting children with marketing; profiling children or making automated decisions about them; data sharing; children’s data rights; and exemptions relating to children’s data. Separately, as we have discussed throughout this debate, the age-appropriate design code deals specifically with the provision of online services likely to be accessed by children in the UK; this includes online edtech services. I am pleased to say that the Department for Education has begun discussions with commercial specialists to look at strengthening the contractual clauses relating to the procurement of edtech resources to ensure that they comply with the standards set out in the UK GDPR and the age-appropriate design code.
On the subject of requiring the ICO to develop a report with the edtech sector, with a view to creating a certification scheme and assessing compliance and conformity with data protection, we believe that such an approach should be at the discretion of the independent regulator.
The issues that have been raised in this very good, short debate are deeply important. Edtech is an issue that the Government are considering carefully—especially the Department for Education, given the increasing time spent online for education. I note that the DPA 2018 already contains a power for the Secretary of State to request new codes of practice, which could include one on edtech if the evidence warranted it. I would be happy to return to this in future but consider the amendment unnecessary at this time. For the reasons I have set out, I am not able to accept the amendment and hope that the noble Baroness will withdraw it.
I thank everyone who spoke, particularly for making it absolutely clear that not one of us, including myself, is against edtech. We just want it to be fair and want the rules to be adequate.
I am particularly grateful to the noble Baroness, Lady Jones, for detailing what education data includes. It might feel as though it is just about someone’s exam results or something that might already be public, but it can include things such as how often they go to see the nurse, what their parents’ immigration status is or whether they are late. There is a lot of information quite apart from this personalised education provision, to which the noble Baroness referred. In fact, we have a great deal of emerging evidence that such provision has no pedagogical basis. There is also the question of huge investment right across the sector in products about which we know very little. I thank the noble Baroness for that.
As to the Minister’s response, I hope that he will forgive me for being disappointed. I am grateful to him for reminding us that the Secretary of State has that power under the DPA 2018. I would love for her to use that power but, so far, it has not been forthcoming. The evidence we saw from the freedom of information request is that the scheme the department wanted to put in place has been totally retracted—and clearly for resource reasons rather than because it is not needed. I find it quite surprising that the Minister can suggest that it is all gung ho here in the UK but that Germany, Holland, France, et cetera are being hysterical in regard to this issue. Each one of them has found it to be egregious.
Finally, the AADC applies only to information society services; there is an exception for education. Where they are joint controllers, they are outsourcing the problems to the schools, which have no level of expertise in this and just take default settings. It is not good enough, I am afraid. I feel bound to say this: I understand the needs of parliamentary business, which puts just a handful of us in this Room to discuss things out of sight, but, if the Government are not willing to protect children’s data at school, when they are in loco parentis to our children, I am really bewildered as to what this Bill is for. Education is widely understood to be a social good but we are downgrading the data protections for children and rejecting every single positive move that anybody has made in Committee. I beg leave to withdraw my amendment but I will bring this back on Report.
(7 months, 2 weeks ago)
Grand Committee
My Lords, I start today with probably the most innocuous of the amendments, which is that Clause 44 should not stand part. Others are more significant, but its purpose, if one can describe it as such, is as a probing clause stand part, to see whether the Minister can explain the real motive and impact of new Section 164A, which is inserted by Clause 44. As the explanatory statement says, it appears to hinder
“data subjects’ right to lodge complaints, and extends the scope of orders under Section 166 of the Data Protection Act to the appropriateness of the Commissioner’s response to a complaint.”
I am looking to the Minister to see whether he can unpack the reasons for that and what the impact is on data subjects’ rights.
More fundamental is Amendment 153, which relates to Clause 45. This provision inserts new Section 165A into the Data Protection Act, according to which the commissioner would have the discretion to refuse to act on a complaint if the complainant had not tried to resolve the infringement of their rights with the relevant organisation and at least 45 days had passed since then. The right to an effective remedy constitutes a core element of data protection—most individuals will not pursue cases before a court, because of the lengthy, time-consuming and costly nature of judicial proceedings—and acts as a deterrent against data protection violations, in so far as victims can obtain meaningful redress. Administrative remedies are particularly useful, because they focus on addressing malpractice and obtaining meaningful changes in how personal data is handled in practice.
However, the ICO indicates that in 2021-22 it did not serve a single GDPR enforcement notice, secured no criminal convictions and issued only four GDPR fines, totalling just £633,000, despite the fact that it received over 40,000 data subject complaints. Moreover, avenues to challenge ICO inaction are extremely limited. Scrutiny by the information tribunal has been restricted to purely procedural, as opposed to substantive, matters. It was narrowed even further by the Administrative Court decision, which found that the ICO was not obliged to investigate each and every complaint.
Amendment 153 would remove Clause 45. The ICO already enjoys a wide margin of discretion and little accountability for how it handles complaints. In light of its poor performance, it does not seem appropriate to expand the discretion of the new information commission even further. The amendment would also extend the scope of orders under Section 166 of the Data Protection Act to the appropriateness of the commissioner’s response to a complaint. This would allow individuals to promote judicial scrutiny of decisions that have a fundamental impact on how laws are enforced in practice, and it would increase the overall accountability of the new information commission.
We have signed Amendment 154, in the name of the noble Baroness, Lady Jones, and I look forward to hearing what she says on that. I apologise for the late tabling of Amendments 154A to 154F, which are all related to Amendments 155 and 175. Clause 47 sets out changes in procedure in the courts, in relation to the right of information of a data subject under the 2018 Act, but there are other issues that need resolving around the jurisdiction of the courts and the Upper Tribunal in data protection cases. That is the reason for tabling these amendments.
The High Court’s judgment in the Delo v ICO case held that part of the reasoning in Killock and Veale about the relative jurisdiction of the courts and tribunals was wrong. The Court of Appeal’s decision in the Delo case underlines these concerns but does not properly address the jurisdictional limits in Sections 166 and 167 of the 2018 Act, regarding the distinction between determining procedural failings and the merits of decisions by the ICO. Surely jurisdiction under these sections should lie with either the courts or the tribunals, not both. In the view of many, including me, it should lie with the tribunals. That is what these amendments seek.
It is clear from these two judgments that there was disagreement on the extent of the jurisdiction of tribunals and courts, notably between Mrs Justice Farbey and Mr Justice Mostyn. The commissioner made very different submissions to the Upper Tribunal, the High Court and the Court of Appeal in relation to the extent and limits of Sections 166 and 167. It is not at all clear what Parliament’s intentions were, when passing the 2018 Act, on the extent and limits of the powers in these sections and whether the appropriate source of redress is a court or a tribunal.
This has resulted in jurisdictional confusion. A large number of claims have been brought in either the courts or the tribunals, under either Section 166 or Section 167, and the respective court or tribunal has frequently ruled that the claim should have been made under the other section and it therefore does not have jurisdiction, so that the claim is struck out. The Bill offers a prime opportunity to resolve this issue.
Clause 45(5), which creates new Section 166A, would only blur the lines even more and fortify the case for claims to be heard in the tribunals rather than the courts. These amendments would give certainty to the courts and tribunals as to their powers and would be much less confusing for litigants in person, most of whom do not have the luxury of paying hundreds of thousands in court fees. That in itself is another reason for jurisdiction to remain with the tribunals, which do not charge fees to issue proceedings.
The proposed new clause inserted by Amendment 287 would require the Secretary of State to exercise powers under Section 190 of the 2018 Act to allow public interest organisations to raise data protection complaints on behalf of individuals generally, without the need to obtain the authorisation of each individual being represented. It would therefore implement Article 80(2) of the GDPR, which provides:
“Member States may provide that any body, organisation or association referred to in paragraph 1 of this Article, independently of a data subject’s mandate, has the right to lodge, in that Member State, a complaint with the supervisory authority which is competent pursuant to Article 77 and to exercise the rights referred to in Articles 78 and 79 if it considers that the rights of a data subject under this Regulation have been infringed as a result of the processing”.
The intention behind Article 80(2) is to allow appropriately constituted organisations to bring proceedings concerning infringements of the data protection regulations in the absence of the data subject. That is to ensure that proceedings may be brought in response to an infringement, rather than on the specific facts of an individual’s case. As a result, data subjects are, in theory, offered greater and more effective protection of their rights. Actions under Article 80(2) could address systemic infringements that arise by design, rather than requiring an individual to evidence the breaches and the specific effects to them.
At present, an affected individual—a data subject—is always required to bring a claim or complaint to a supervisory authority. Whether through direct action or under Section 187 of the 2018 Act, a data subject will have to be named and engaged. In practice, a data subject is not always identifiable or willing to bring action to address even the most egregious conduct.
Article 80(2) would fill a gap that Article 80(1) and Section 187 of the Data Protection Act are not intended to fill. Individuals can be unwilling to seek justice, exercise their rights and lodge data protection complaints on their own, either for fear of retaliation from a powerful organisation or because of the stigma that may be associated with the matter where a data protection violation occurred. Even a motivated data subject may be unwilling to take action due to the risks involved. For instance, it would be reasonable for that data subject not to want to become involved in a lengthy, costly legal process that may be disproportionate to the loss suffered or the remedy available. This is particularly pressing where the infringement raises systemic concerns, rather than where an individual has suffered material or non-material damage as a result of the infringement.
Civil society organisations have long helped complainants navigate justice systems in seeking remedies in the data protection area, providing a valuable complement to the enforcement of UK data protection laws. My Amendment 287 would allow public interest organisations to lodge representative complaints, even without the mandate of data subjects, to encourage the filing of well-argued, strategically important cases with the potential to improve significantly the data subject landscape as a whole. This Bill is the ideal opportunity for the Government to implement fully Article 80(2) of the GDPR and plug a significant gap in the protection of UK citizens’ privacy.
In effect, this is unfinished business from our debates on the 2018 Act, when we made several attempts to persuade the Government of the merits of introducing the rights under Article 80(2). I hope that the Government will think again. These are extremely important rights and are available in many other countries governed by a similar GDPR. I beg to move.
My Lords, as a veteran of the 2018 arguments on Article 80(2), I rise in support of Amendment 287, which would see its implementation.
Understanding and exercising personal data rights is not straightforward. Even when the rights are being infringed, it is rare that an individual data subject has the time, knowledge or ability to make a complaint to the ICO. This is particularly true for vulnerable groups, including children and the elderly, disadvantaged groups and other groups of people, such as domestic abuse survivors or members of the LGBTQ community, who may have specific reasons for not identifying themselves in relation to a complaint. It is a principle in law that a right that cannot be activated is not fully given.
A data subject’s ability to claim protection is constrained by a range of factors, none of which relates to the validity of their complaint or the level of harm experienced. Rather, the vast majority are prevented from making a complaint by a lack of expertise, capacity, time and money; by the fact that they are not aware that they have data rights; or by the fact that they understand neither that their rights have been infringed nor how to make a complaint about them.
I have considerable experience of this. I remind the Committee that I am chair of the 5Rights Foundation, which has raised important and systemic issues of non-compliance with the AADC. It has done this primarily by raising concerns with the ICO, which has then undertaken around 40 investigations based on detailed submissions. However, because the information is not part of a formalised process, the ICO has no obligation to respond to the 5Rights Foundation team, the three-month time limit for complaints does not apply and, even though forensic work by the 5Rights Foundation identified the problem, its team is not consulted or updated on progress or the outcome—all of which would be possible had it submitted the information as a formal complaint. I remind the Committee that in these cases we are talking about complaints involving children.
My Lords, I listened carefully to the explanation given by the noble Lord, Lord Clement-Jones, for his stand part notice on Clause 44. I will have to read Hansard, as I may have missed something, but I am not sure I am convinced by his arguments against Clause 44 standing part. He described his stand part notice as “innocuous”, but I am concerned that if the clause were removed it would have a slightly wider implication than that.
We feel that there are some advantages to how Clause 44 is currently worded. As it stands, it simply makes it clear that data subjects have to use the internal processes to make complaints to controllers first, and then the controller has the obligation to respond without undue delay. Although this could place an extra burden on businesses to manage and reply to complaints in a timely manner, I would have thought that this was a positive step to be welcomed. It would require controllers to have clear processes in place for handling complaints; I hope that that in itself would be an incentive against their conducting the kind of unlawful processing that prompts complaints in the first place. This seems to be best practice, which would apply anyway in most organisations and complaint and arbitration systems, including, perhaps, ombudsmen, which I know the noble Lord knows more about than I do these days. There should be a requirement to use the internal processes first.
The clause makes it clear that the data subject has a right to complain directly to the controller and that the controller has an obligation to respond. Clause 45 then goes on to make a different point, which is that the commissioner has a right to refuse to act on certain complaints. We touched on this in an earlier debate. Clearly, to be in line with Clause 44, the controller would have to have finished handling the case within the allotted time. We agree with that process. However, an alternative reason for the commissioner to refuse is when the complaint is “vexatious or excessive”. We have rehearsed our arguments about the interpretation of those words in previous debates on the application of subject access requests. I do not intend to repeat them here, but our concern about that wording rightly remains. What is important here is that the ICO should not be able to reject complaints simply because the complainant is distressed or angry. It is helpful that the clause states that in these circumstances,
“the Commissioner must inform the complainant”
of the reasons it is considered vexatious or excessive. It is also helpful that the clause states that this
“does not prevent the complainant from making it a complaint again”,
presumably in a way more compliant with the rules. Unlike the noble Lord, Lord Clement-Jones—as I said, I will look at what he said in more detail—on balance, we are content with the wording as it stands.
On a slightly different tack, we have added our name to Amendment 154, in the name of the noble Lord, Lord Clement-Jones, and we support Amendment 287 on a similar subject. This touches on a similar principle to our previous debate on the right of data communities to raise data-breach complaints on behalf of individuals. In these amendments, we are proposing that there should be a collective right for organisations to raise data-breach complaints for individuals or groups of individuals who do not necessarily feel sufficiently empowered or confident to raise the complaints on their own behalf. There are many reasons why this reticence might occur, not least that the individuals may feel that making a complaint would put their employment on the line or that they would suffer discrimination at work in the future. We therefore believe that these amendments are important to widen people’s access to work with others to raise these complaints.
Since these amendments were tabled, we have received the letter from the Minister that addresses our earlier debate on data communities. I am pleased to see the general support for data intermediaries that he set out in his letter. We argue that a data community is a separate distinct collective body, which is different from the wider concept of data intermediaries. This seems to be an area in which the ICO could take a lead in clarifying rights and set standards. Our Amendment 154 would therefore set a deadline for the ICO to do that work and for those rights to be enacted.
The noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, made a good case for broadening these rights in the Bill and, on that basis, I hope the Minister will agree to follow this up, and follow up his letter so that we can make further progress on this issue.
The noble Lord, Lord Clement-Jones, has tabled a number of amendments that modify the courts and tribunals functions. I was hoping that when I stood here and listened to him, I would understand a bit more about the issues. I hope he will forgive me for not responding in detail to these arguments. I do not feel that I know enough about the legal background to the concerns but he seems to have made a clear case in clarifying whether the courts or tribunals should have jurisdiction in data protection issues.
On that basis, I hope that the Minister will also provide some clarification on these issues and I look forward to his response.
My Lords, I thank the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Jones, for tabling these amendments to Clauses 44 and 45, which would reform the framework for data protection complaints to the Information Commissioner.
The noble Lord, Lord Clement-Jones, has given notice of his intention to oppose Clause 44 standing part of the Bill. That would remove new provisions from the Bill that have been carefully designed to provide a more direct route to resolution for data subjects’ complaints. I should stress that these measures do not limit rights for data subjects to bring complaints forward, but instead provide a more direct route to resolution with the relevant data controller. The measures formalise current best practice, requiring the complainant to approach the relevant data controller, where appropriate, to attempt to resolve the issue prior to regulatory involvement.
The Bill creates a requirement for data controllers to facilitate the making of complaints and look into what may have gone wrong. This should, in most cases, result in a much quicker resolution of data protection-related complaints. The provisions will also enable the Information Commissioner to redeploy resources away from handling premature complaints, which may be dealt with more effectively in the first instance by controllers, and towards value-added regulatory activity, supporting businesses to use data lawfully and in innovative ways.
The noble Lord’s Amendment 153 seeks, in effect, to expand the scope of the Information Commissioner’s duty to investigate complaints under Section 165 of the Data Protection Act. However, that Section of the Act already provides robust redress routes, requiring the commissioner to take appropriate steps to respond to complaints and offer an outcome or conclude an investigation within a specified period.
The noble Lord raised the enforcement of the UK’s data protection framework. I can provide more context on the ICO’s approach, although noble Lords will be aware that it is enforced independently of government by the ICO; it would of course be inappropriate for me to comment on how the ICO exercises its enforcement powers. The ICO aims to be fair, proportionate and effective, focusing on areas with the highest risk and most harm, but this does not mean that it will enforce every case that crosses its books.
The Government have introduced a new requirement on the ICO—Clause 43—to publish an annual report on how it has exercised its enforcement powers, the number and nature of investigations, the enforcement powers used, how long investigations took and the outcome of the investigations that ended in that period. This will provide greater transparency and accountability in the ICO’s exercise of its enforcement powers. For these reasons, I am not able to accept these amendments.
I also thank the noble Baroness and the noble Lord for their Amendments 154 and 287 concerning Section 190 of the Data Protection Act. These amendments would require the Secretary of State to legislate to give effect to Article 80(2) of the UK GDPR, enabling relevant non-profit organisations to make claims against data controllers for alleged data breaches on behalf of data subjects without those data subjects having requested or agreed to the claim being brought. Currently, such non-profit organisations can already pursue such actions on behalf of individuals who have granted them specific authorisation, as outlined in Article 80(1).
In 2021, following consultation, the Government concluded that there was insufficient evidence to justify implementing Article 80(2) to allow non-profit organisations to bring data protection claims without the authorisation of the people affected. The Government’s response to the consultation noted that the regulator can and does investigate complaints raised by civil society groups, even when they are not made on behalf of named individuals. The ICO’s investigations into the use of live facial recognition technology at King’s Cross station and in some supermarkets in southern England are examples of this.
I also thank the noble Baroness, Lady Kidron, for raising her concerns about the protection of children throughout the debate—indeed, throughout all the days in Committee. The existing regime already allows civil society groups to make complaints to the ICO about data-processing activities that affect children and vulnerable people. The ICO has a range of powers to investigate systemic data breaches under the current framework and is already capable of forcing data controllers to take decisive action to address non-compliance. We are strengthening its powers in this Bill. I note that only a few member states of the EU have allowed non-governmental organisations to launch actions without a mandate, in line with the possibility provided by the GDPR.
I turn now to Amendments 154A, 154B—
Before the noble Lord gets there and we move too far from Amendment 154, where does the Government’s thinking leave us regarding group or class actions? Trade unions take up causes on behalf of their membership at large. I guess, in the issue of the Post Office and Mr Bates, not every sub-postmaster or sub-postmistress would have signed up to that class action, even though they may have ended up being beneficiaries of its effects. So where does it leave people with regard to data protection and the way that the data protection scheme operates where there might be a class action?
If the action is raised on behalf of named individuals, those named individuals have to have given consent for that. If the action is for a general class of people, those people would not have to give their explicit consent, because they are not named in the action. Article 80(2) of the GDPR said that going that further step was optional for all member states. I do not know which member states have taken it up, but a great many have not, just because of the complexities to which it gives rise.
My Lords, just so that the Minister might get a little note, I will ask a question. He has explained what is possible—what can be done—but not why the Government still resist putting Article 80(2) into effect. What is the reason for not adopting that article?
The reason was that an extensive consultation was undertaken in 2021 by the Government, and the Government concluded at that time that there was insufficient evidence to take what would necessarily be a complex step. That was largely on the grounds that class actions of this type can go forward either as long as they have the consent of any named individuals in the class action or on behalf of a group of individuals who are unnamed and not specifically raised by name within the investigation itself.
Perhaps the Minister could in due course say what evidence would help to persuade the Government to adopt the article.
I want to help the Minister. Perhaps he could give us some more detail on the nature of that consultation and the number of responses and what people said in it. It strikes me as rather important.
Fair enough. Maybe for the time being, it will satisfy the Committee if I share a copy of that consultation and what evidence was considered, if that would work.
I will turn now to Amendments 154A to 155 and Amendment 175, which propose sweeping modifications to the jurisdiction of the courts and tribunals for proceedings under the Data Protection Act 2018. These amendments would have the effect of making the First-tier Tribunal and Upper Tribunal responsible for all data protection cases, transferring both ongoing and future cases out of the court system and into the relevant tribunals.
The Government of course want to ensure that proceedings for enforcement of data protection rules, including the redress routes available to data subjects, are appropriate for the nature of the complaint. As the Committee will be well aware, at present there is a mixture of jurisdiction between tribunals and courts under data protection legislation, depending on the precise nature of the proceedings in question. Tribunals are indeed the appropriate venue for some data protection proceedings, and the legislation already recognises that—for example, for applications by data subjects for an order requiring the ICO to progress their complaint. However, courts are generally the more appropriate venue for cases involving claims for compensation, and successful parties can usually recover their costs. Courts also apply stricter rules of procedure and evidence than tribunals. This mixture exists because some cases are appropriate for the jurisdiction of the tribunal, while others are more appropriate for the courts. For example, claims by individuals against organisations for breaches of legal requirements can result in awards of compensatory damages for the individuals and financial and reputational damage for the organisations. It is appropriate that such cases are handled by a court, in accordance with its strict procedural and evidential rules, where the data subject may recover their costs if successful.
As such, the Government are confident that the current system is balanced and proportionate and provides clear and effective administrative and judicial redress routes for data subjects seeking to exercise their rights.
My Lords, is the Minister saying that there is absolutely no confusion between the jurisdiction of the tribunals and the courts? That is, no court has come to a different conclusion about jurisdiction—for example, as to whether procedural matters are for tribunals and merits are for courts or vice versa. Is he saying that everything is hunky-dory and clear and that we do not need to concern ourselves with this crossover of jurisdiction?
No, as I was about to say, we need to take these issues seriously. The noble Lord raised a number of specific cases. I was unfamiliar with them at the start of the debate—
I will go away and look at those; I look forward to learning more about them. There are obvious implications in what the noble Lord said as to the most effective ways of distributing cases between courts and other channels.
For these reasons, I hope that the noble Lord will withdraw his amendment.
I am intrigued by the balance between what goes to a tribunal and what goes to the courts. I took the spirit behind the stand-part notice in the name of the noble Lord, Lord Clement-Jones, as being about finding the right place for the right case and ensuring that the wheels of justice are much more accessible. I am not entirely persuaded by what the Minister has said. It would probably help the Committee if we had a better understanding of where the cases go, how they are distributed and on what basis.
I thank the noble Lord; that is an important point. The question is: how does the Sorting Hat operate to distribute cases between the various tribunals and the court system? We believe that the courts have an important role to play in this but it is about how, in the early stages of a complaint, the case is allocated to a tribunal or a court. I can see that more detail is needed there; I would be happy to write to noble Lords.
Before we come to the end of this debate, I just want to raise something. I am grateful to the Minister for offering to bring forward the 2021 consultation on Article 80(2)—that will be interesting—but I wonder whether, as we look at the consultation and seek to understand the objections, the Government would be willing to listen to our experiences over the past two or three years. I know I said this on our previous day in Committee but there is, I hope, some point in ironing out some of the problems of the data regime that we are experiencing in action. I could bring forward a number of colleagues on that issue and on why it is a blind spot for both the ICO and the specialist organisations that are trying to bring systemic issues to its attention. It is very resource-heavy. I want a bit of goose and gander here: if we are trying to sort out some of the resourcing and administrative nightmares in dealing with the data regime, from a user perspective, perhaps a bit of kindness could be shown to that problem as well as to the problem of business.
I would be very happy to participate in that discussion, absolutely.
My Lords, I thank the Minister for his response. I have surprised myself: I have taken something positive away from the Bill.
The noble Baroness, Lady Jones, was quite right to be more positive about Clause 44 than I was. The Minister unpacked its relationship with Clause 45 well and satisfactorily. Obviously, we will read Hansard before we jump to too positive a conclusion.
On Article 80(2), I am grateful to the Minister for agreeing both to go back to the consultation and to look at the kinds of evidence that were brought forward, because this is a really important aspect for many civil society organisations. He underestimates the difficulties faced when bringing complaints of this nature. I would very much like this conversation to go forward because this issue has been quite a bone of contention; the noble Baroness, Lady Kidron, remembers that only too well. We may even have had ping-pong on the matter back in 2017. There is an appetite to keep on the case so, the more we can discuss this matter—between Committee and Report in particular—the better, because there is quite a head of steam behind it.
As far as the jurisdiction point is concerned, I think this may be the first time I have heard a Minister talk about the Sorting Hat. I was impressed: I have often compared this place to Hogwarts, but the concept of using the Sorting Hat to decide whether a case goes to a tribunal or a court is a wonderful one. You would probably need artificial intelligence to do that kind of thing nowadays; that in itself is a bit of an issue. These may be elaborate amendments but, as the noble Lord, Lord Bassam, said, the case being made here is about the possibility of confusion and a lack of clarity over where jurisdiction lies. It is really important that we determine whether the courts and tribunals themselves understand this and, perhaps more pertinently, whether they have differing views about it.
We need to get to grips with this; the more the Minister can dig into it, and into Delo, Killock and so on, the better. We are all in the foothills here but I am certainly not going to try to unpack those two judgments and the differences between Mrs Justice Farbey and Mr Justice Mostyn, which are well beyond my competency. I thank the Minister.
My Lords, the UK has rightly moved away from the EU concept of supremacy, under which retained EU law would always take precedence over domestic law when the two were in conflict. That is clearly unacceptable now that we have left the EU. However, we understand that the effective functioning of our data protection legislation is of critical importance, and it is right for us to specify the precise relationship between UK and EU-derived pieces of legislation following implementation of the Retained EU Law (Revocation and Reform) Act, or the REUL Act. That is why I am introducing a number of specific government amendments to ensure that the hierarchy of legislation works in the data protection context. These are Amendments 156 to 164 and 297.
Noble Lords may be aware that Clause 49 originally sought to clarify the relationship between the UK’s data protection legislation, specifically the UK GDPR and EU-derived aspects of the Data Protection Act 2018, and future data processing provisions in other legislation, such as powers to share or duties to disclose personal data, as a result of some legal uncertainty created by the European Union (Withdrawal) Act 2018. To resolve this uncertainty, Clause 49 makes it clear that all new data processing provisions in legislation should be read consistently with the key requirements of the UK data protection legislation unless it is expressly indicated otherwise. Since its introduction, the interpretation of pre-EU exit legislation has been altered and there is a risk that this would produce the wrong effect in respect of the interpretation of existing data processing provisions that are silent about their relationship with the data protection legislation.
Amendment 159 will make it clear that the full removal of the principle of EU law supremacy and the creation of a reverse hierarchy in relation to assimilated direct legislation, as provided for in the REUL Act, do not change the relationship between the UK data protection legislation and existing legislation that is in force prior to commencement of Clause 49(2). Amendment 163 makes a technical amendment to the EU withdrawal Act, as amended, to support this amendment.
Amendment 162 is similar to the previous amendment but it concerns the relationship between provisions relating to certain obligations and rights under data protection legislation and on restrictions and prohibitions on the disclosure of information under other existing legislation. Existing Section 186 of the Data Protection Act 2018 governs this relationship. Amendment 162 makes it clear that the relationship between these two types of provision is not affected by the changes to the interpretation of legislation that I have already referred to made by the REUL Act. Additionally, it clarifies that, in relation to pre-commencement legislation, Section 186(1) may be disapplied expressly or impliedly.
Amendment 164 relates to the changes brought about by the REUL Act and sets out that the provisions detailed in earlier Amendments 159, 162 and 163 are to be treated as having come into force on 1 January 2024—in other words, at the same time as commencement of the relevant provisions of the REUL Act.
Amendment 297 provides a limited power to remove provisions that achieve the same effect as new Section 183A from legislation made or passed after this Bill receives Royal Assent, as their presence could cause confusion.
Finally, Amendments 156 and 157 are consequential. Amendments 158, 160 and 161 are minor drafting changes made for consistency, updating and consequential purposes.
Turning to the amendments introduced by the noble Lord, Lord Clement-Jones, I hope that he can see from the government amendments to Clause 49 that we have given a good deal of thought to the impact of the REUL Act 2023 on the UK’s data protection framework and have been prepared to take action on this where necessary. We have also considered whether some of the changes made by the REUL Act could cause confusion about how the UK GDPR and the Data Protection Act 2018 interrelate. Following careful analysis, we have concluded that they would largely continue to be read alongside each other in the intended way, with the rules of the REUL Act unlikely to interfere with this. Any new general rule such as that suggested by the noble Lord could create confusion and uncertainty.
Amendments 168 to 170, 174, 174A and 174B seek to reverse changes introduced by the REUL Act at the end of 2023, specifically the removal of EU general principles from the statute book. EU general principles and certain EU-derived rights had originally been retained by the European Union (Withdrawal) Act to ensure legal continuity at the end of the transition period, but this was constitutionally novel and inappropriate for the long term.
The Government’s position is that EU law concepts should not be used to interpret domestic legislation in perpetuity. The REUL Act provided a solution to this by repealing EU general principles from UK law and clarifying the approach to be taken domestically. The amendments tabled by the noble Lord, Lord Clement-Jones, would undo this important work by reintroducing to the statute book references to rights and principles which have not been clearly defined and are inappropriate now that we have left the EU.
The protection of personal data already forms part of the protection offered by the European Convention on Human Rights, under the Article 8 right to respect for private and family life, and is further protected by our data protection legislation. The UK GDPR and the Data Protection Act 2018 provide a comprehensive set of rules for organisations to follow and rights for people in relation to the use of their data. Seeking to apply an additional EU right to data protection in UK law would not significantly affect the way the data protection framework functions or enhance the protections it affords to individuals. Indeed, doing so may well add unnecessary uncertainty and complexity.
Amendments 171 to 173 pertain to exemptions to specified data subject rights and obligations on data controllers set out in Schedules 2 to 4 to the DPA 2018. The 36 exemptions apply only in specified circumstances and are subject to various safeguards. Before addressing the amendments the noble Lord has tabled, it is perhaps helpful to set out how these exemptions are used. Personal data must be processed according to the requirements set out in the UK GDPR and the DPA 2018. This includes the key principles of lawfulness, fairness and transparency, data minimisation and purpose limitation, among others. Data subjects’ rights, such as the right to be notified that their personal data is being processed, are restricted, and obligations on the data controller limited, only if and when a decision to apply an exemption is taken. In all cases, the use of the exemption must be both necessary and proportionate.
One of these exemptions, the immigration exemption, was recently amended in line with a court ruling that found it was incompatible with the requirements set out in Article 23. This exemption is used by the Home Office. The purpose of Amendments 171 to 173 is to extend the protections applied to the immigration exemption across the other exemptions subject to Article 23, apart from in Schedule 4, where the requirement to consider whether its application prejudices the relevant purposes is not considered relevant.
The other exemptions are each used in very different circumstances, by different data controllers—from government departments to SMEs—and work by applying different tests that function in a wholly different manner from the immigration exemption. This is important to bear in mind when considering these broad-brush amendments. A one-size-fits-all approach would not work across the exemption regime.
It is the Government’s position that any changes to these important exemptions should be made only after due consideration of the circumstances of the particular exemption. In many cases, these amendments seek to make changes that run counter to how the exemption functions. Making changes across the exemptions via this Bill, as the noble Lord’s amendments propose, could have significant negative impacts on the functioning of the exemptions regime. Any potential amendments to the other exemptions would require careful consideration. The Government note that there is a power in the DPA 2018 to make changes to the exemptions, if deemed necessary.
For the reasons I have given, I look forward to hearing more from the noble Lord on his amendments, but I hope that he will not press them. I beg to move.
My Lords, I thank the Minister for that very careful exposition. I feel that we are heavily into wet towel, if not painkiller, territory here, because this is a tricky area. As the Minister might imagine, I will not respond to his exposition in detail at this point; I need to run away and get some external advice on the impact of what he said. He is really suggesting that the Government prefer a pick ‘n’ mix approach to what he regards as a one-size-fits-all one. I can boil it down to that. He is saying that you cannot just apply the rules wholesale, whereas we are trying to reverse some of the impacts of the previous legislation. I will set out my stall; no doubt the Minister and I, the Box and others, will read Hansard and draw our own conclusions at the end, because this is a complicated area.
Until the end of 2023, the Data Protection Act 2018 had to be read compatibly with the UK GDPR. In a conflict between the two instruments, the provisions of the UK GDPR would prevail. The reversing of the relationship between the 2018 Act and the UK GDPR, through the operation of the Retained EU Law (Revocation and Reform) Act—REUL, as the Minister described it—has had the effect of lowering data protection rights in the UK. The case of the Open Rights Group and the3million v the Secretary of State for the Home Office and the Secretary of State for Digital, Culture, Media and Sport was decided after the UK had left the EU, but before the end of 2023. The Court of Appeal held that exemptions from data subject rights in an immigration context, as set out in the Data Protection Act, were overly broad, contained insufficient safeguards and were incompatible with the UK GDPR. The court disapplied the exemptions and ordered the Home Office to redraft them to include the required safeguards. We debated the regulations the other day, and many noble Lords welcomed them on the basis that they had been revised for the second time.
This sort of challenge is now not possible, because the relationship between the DPA and the UK GDPR has been turned on its head. If the case were brought now, the overly broad exemptions in the DPA would take precedence over the requirement for safeguards set out in the UK GDPR. These points were raised by me in the debate of 12 December, when the Data Protection (Fundamental Rights and Freedoms) (Amendment) Regulations 2023 were under consideration. In that debate, the noble Baroness, Lady Swinburne, stated that
“we acknowledge the importance of making sure that data processing provisions in wider legislation continue to be read consistently with the data protection principles in the UK GDPR … Replication of the effect of UK GDPR supremacy is a significant decision, and we consider that the use of primary legislation is the more appropriate way to achieve these effects, such as under Clause 49 where the Government consider it appropriate”.—[Official Report, 12/12/23; col. GC 203.]
This debate on Clause 49 therefore offers an opportunity to reinstate the previous relationship between the UK GDPR and the Data Protection Act. The amendment restores the hierarchy, so that it guarantees the same rights to individuals as existed before the end of 2023, and avoids unforeseen consequences by resetting the relationship between the UK GDPR and the DPA 2018 to what the parliamentary draftsmen intended when the Act was written. The provisions in Clause 49, as currently drafted, address the relationship between domestic law and data protection legislation as a whole, but the relationship between the UK GDPR and the DPA is left in its “reversed” state. This is confirmed in the Explanatory Notes to the Bill at paragraph 503.
The purpose of these amendments is to restore data protection rights in the UK to what they were before the end of 2023, prior to the coming into force of REUL. The amendments would restore the fundamental right to the protection of personal data in UK law; ensure that the UK GDPR and the DPA continue to be interpreted in accordance with the fundamental right to the protection of personal data; ensure that there is certainty that assimilated case law that references the fundamental right to the protection of personal data still applies; and apply the protections required in Article 23 of the UK GDPR to all the relevant exemptions in Schedule 2 to the Data Protection Act. This is crucial in avoiding diminishing trust in our data protection frameworks. If people do not trust that their data is protected, they will refuse to share it. Without this data, new technologies cannot be developed, because these technologies rely on personal data. By creating uncertainty and diminishing standards, the Government are undermining the very growth in new technologies that they want.
My Lords, I have looked at the government amendments in this group and have listened very carefully to what the Minister has said—that it is largely about interpretation. There are no amendments that I wish to comment on, save to say that they seem to be about consistency of language and about bringing, in part, EU positions into UK law. They seem also to be about consistency of meaning, and for the most part the intention seems to be to ensure that nothing in retained EU law undoes the pre-existing legal framework.
However, I would appreciate the Minister giving us a bit more detail on the operation of Amendment 164. Amendment 297 seems to deal with a duplication issue, so perhaps he can confirm for the Committee that this is the case. We have had swathes of government amendments of a minor and technical nature, largely about chasing out gremlins from the drafting process. Can he confirm that this is the case and assure the Committee that we will not be left with any nasty surprises in the drafting that need correction at a later date?
The amendments tabled in the name of the noble Lord, Lord Clement-Jones, are of course of a different order altogether. The first two—Amendments 165 and 166—would restore the relationship between the UK GDPR and the 2018 Act and the relevant provisions of the Retained EU Law (Revocation and Reform) Act 2023. Amendment 168 would ensure that assimilated case law referring to the European Charter of Fundamental Rights would still be relevant in interpreting the UK GDPR. It would give greater certainty in how the UK’s data protection framework is interpreted. Amendment 169 would ensure that the interpretation is carried over from the UK GDPR and 2018 legislation in accordance with the general principle of the protection of personal data.
The noble Lord’s Amendments 170 to 174B would bring back into law protections that existed previously, when UK law was more closely aligned with EU law and regulation. There is also an extension of the EU right to the protection of personal data, to the assimilated standard that existed by virtue of Section 4 of the European Union (Withdrawal) Act 2018. I can well understand the noble Lord’s desire to take the UK back to a position where we are broadly in the same place in terms of protections as our former EU partners. First, having—broadly speaking—protections that are common across multiple jurisdictions makes it easier and simpler for companies operating in those markets. Secondly, from the perspective of data subjects, it is much easier to comprehend common standards of data protection and to seek redress when required. The Government, for their part, will no doubt argue that there is some sort of big Brexit benefit in this, although I think that advisers and experts are divided on the degree of that benefit, and indeed on who benefits.
Later, we will get to discuss data adequacy standards. Concern exists in some quarters as to whether we have this right and what this legislative opportunity might be missing to ensure that the UK meets those international standards that the EU requires. That is a debate for later, but we are broadly sympathetic to the desire of the noble Lord, Lord Clement-Jones, to find the highest level of protection for UK citizens. That is the primary motivation for many of the amendments and debates that we have had today. We do not want to weaken what were previously carefully crafted and aligned protections. I do not entirely buy the argument that the Minister made earlier about this group of amendments causing legal uncertainty. I believe it is the reverse of that: the noble Lord, Lord Clement-Jones, is trying to provide greater certainty and a degree of jurisdictional uniformity.
I hope that I have understood what the noble Lord is trying to achieve here. For those reasons, we will listen to the Minister’s concluding comments—and read Hansard—very carefully.
I thank the noble Lords, Lord Clement-Jones and Lord Bassam, for their comments. As the noble Lord, Lord Clement-Jones, points out, it is a pretty complex and demanding area, but that in no way diminishes the importance of getting it right. I hope that in my remarks I can continue that work, but of course I am happy to discuss this: it is a very technical area and, as all speakers have pointed out, it is crucial for our purposes that it be executed correctly.
While the UK remains committed to strong protections for personal data through the UK GDPR and Data Protection Act, it is important that it is able to diverge from the EU legislation where this is appropriate for the UK. We have carefully assessed the effects of EU withdrawal legislation and the REUL Act and are making adjustments to ensure that the right effect is achieved. The government amendments are designed to ensure legal certainty and protect the coherence of the data protection framework following commencement of the REUL Act—for example, by maintaining the pre-REUL Act relationship in certain ways between key elements of the UK data protection legislation and other existing legislation.
The purpose of the REUL Act is to ensure that the UK has control over its laws. Resurrecting the principle of EU law supremacy in its entirety or continuing to apply case law principles is not consistent with the UK’s departure from the EU and taking back control over our own laws. These amendments make it clear that changes made to the application of the principle of EU law supremacy and new rules relating to the interpretation of direct assimilated legislation under the REUL Act do not have any impact on existing provisions that involve the processing of personal data.
The noble Lord, Lord Bassam, asked for more detail about Amendment 164. It relates to changes brought about by the REUL Act and sets out that the provisions detailed in Amendments 159, 162 and 163 are to be treated as having come into force on 1 January 2024—in other words, at the same time as commencement of the relevant provisions of the REUL Act. The retrospective effect of this provision addresses the gap between the commencement of the REUL Act 2023 and the Data Protection and Digital Information Bill.
On the immigration exemption case, I note that it was confined to the immigration exemption and did not rule on the other exemptions. The Government will continue to keep the exemptions under review and, should it be required, the Government have the power to amend the other exemptions using an existing power in the DPA 2018. Before doing so, of course the Government would want to ensure that due consideration is given to how the particular exemptions are used. Meanwhile, I thank noble Lords for what has been a fascinating, if demanding, debate.
My Lords, we now move on to Part 2 of the Bill, which concerns the provision of digital verification services. In moving Amendment 177, I will also speak to the amendments through to Amendment 195; apart from one, all of them are in my name and have the support of the noble Lord, Lord Clement-Jones, for which I am grateful.
My Lords, I speak in favour of Amendment 195ZA in my name and that of the noble Lords, Lord Vaux of Harrowden and Lord Clement-Jones, and Amendments 289 and 300 on digital identity theft. I am also very sympathetic to many of the points made by the noble Baroness, Lady Jones of Whitchurch, particularly about the most disadvantaged people in our society.
As many noble Lords know, I am a member of the Communications and Digital Committee of this House. A few months ago, we did a report on digital exclusion. One of the issues we found was quite clear: even though some people may partly use digital—for example, they may have an email address—that does not make them digitally proficient or literate. As more and more of our public and private services go online, it is obvious that companies and others will want to verify that the people using these services are who they claim to be. At the same time, a number of people will not be digitally literate or will not have a digital ID available. It is important that we offer them enough alternatives. It should not be beyond the wit of man, or of clever lawyers, to ensure that non-digital alternatives are available for consumers and particularly, as the noble Baroness, Lady Jones of Whitchurch, said, for people from disadvantaged communities.
As we found in the report on our inquiry into digital exclusion, this does not concern only people from deprived areas. Sometimes people get by in life without much digital literacy. There are those who may be scared of it or who do not trust it, and they can come from all sorts of wealth brackets. This drives home the point that it is important to have an alternative. I cannot really say much more than the amendment itself; it does what it says on the tin. The amendment is quite clear and I am sure that the noble Lord, Lord Vaux, will speak to it as well.
I will briefly speak in favour of Amendments 289 and 300. Digital identity theft is clearly an issue and has been for a long time. Identity theft was an issue even before the digital age, and it is so much easier to steal someone’s identity these days. I have had bank accounts opened in my name. I received a letter about one of them but, fortunately, the bank was able to deal with it when I walked in and said, “This wasn’t me”. It is quite clear that this will happen more and more. Sometimes, it will happen because data has been leaked or because a system is not particularly secure; at other times, it will be because you have been careless. No matter why the crime is committed, it must be an offence in the terms suggested by the amendments of the noble Lord, Lord Clement-Jones. It is clear that we have to send a strong signal that digital identity theft is a crime and that people should be deterred from engaging in it.
My Lords, I have added my name to Amendment 195ZA—I will get to understand where these numbers come from, at some point—in the name of the noble Lord, Lord Kamall, who introduced it so eloquently. I will try to be brief in my support.
For many people, probably most, the use of online digital verification will be a real benefit. The Bill puts in place a framework to strengthen digital verification so, on the whole, I am supportive of what the Government are trying to do, although I think that the Minister should seriously consider the various amendments that the noble Baroness, Lady Jones of Whitchurch, has proposed to strengthen parliamentary scrutiny in this area.
However, not everyone will wish to use digital verification in all cases, perhaps because they are not sufficiently confident with technology or perhaps they simply do not trust it. We have already heard the debates around the advances of AI and computer-based decision-making. Digital identity verification could be seen to be another extension of this. There is a concern that Part 2 of the Bill appears to push people ever further towards decisions being taken by a computer.
I suspect that many of us will have done battle with some of the existing identity verification systems. In my own case, I can think of one bank where I gave up in deep frustration as it insisted on telling me that I was not the same person as my driving licence showed. I have also come up against systems used by estate agents when trying to provide a guarantee for my student son that was so intrusive that I, again, refused to use it.
Therefore, improving verification services is to be encouraged, but there must be some element of choice. If someone does not have the know-how, the confidence or the trust in the systems, they should be able to verify their identity through some non-digital alternative. They should not be barred from using important services such as, in my examples, banking and renting a property because they cannot, or would prefer not to, use a digital verification service.
At the very least, even if the Minister is not minded to accept that amendment, I hope that he can make clear that the Government have no intention of making digital ID verification mandatory, as some have suggested Part 2 may be driving towards.
My Lords, this is quite a disparate group of amendments. I support Amendment 195ZA, which I have signed. The noble Baroness, Lady Jones, and the noble Lords, Lord Kamall and Lord Vaux, have made clear the importance of having a provision such as this on the statute book. It is important that an individual can choose whether to use digital or non-digital means of verifying their identity, both for the liberty and equality of individuals and to cultivate trust in what are essentially growing digital identity systems. The use of the word “empower” in these circumstances is important. We need to empower people rather than push them into digital systems that they may not be able to access. A move towards digitalisation is therefore not a justification for compelling individuals to use systems that could compromise their privacy or rights more broadly. I very much support that amendment on that basis.
I also very much support the amendments of the noble Baroness, Lady Jones, which I have signed. The Delegated Powers and Regulatory Reform Committee could not have made its recommendations clearer. The Government are serial offenders in terms of skeleton Bills; we have known that from remarks made over a long period by the noble Lord, Lord Hodgson, on the Government Benches. I am going to be extremely interested in what the Government have to say. Quite often, to give them some credit, they listen to the DPRRC, and I hope that on this occasion the Minister will give us some good news.
This is an extremely important new system being set up by the Government. We have been waiting for the enabling legislation for quite some time. It is pretty disappointing, after all the consultations that have taken place, just how skeletal it is. No underlying principles have been set out. There is a perfectly good set of principles, drawn up by the independent Privacy and Consumer Advisory Group, which advises the Government on how to provide a simple, trusted and secure means of accessing public services. But what assurance do we have that we will see those principles embedded in this new system?
Throughout, it is vital that the Secretary of State is obliged to address the kinds of concerns being raised in the development of this DVS trust framework, to ensure that those services protect the people who use them. We need that kind of parliamentary debate; it has been made quite clear that we need nothing less. I therefore very much support what the noble Baroness, Lady Jones, had to say on that subject.
I thank the noble Lord, Lord Clement-Jones, the noble Baroness, Lady Jones, and my noble friend Lord Kamall for their amendments. To address the elephant in the room first, I can reassure noble Lords that the use of digital identity will not be mandatory, and privacy will remain one of the guiding principles of the Government’s approach to digital identity. There are no plans to introduce a centralised, compulsory digital ID system for public services, and the Government’s position on physical ID cards remains unchanged. The Government are committed to realising the benefits of digital identity technologies without creating ID cards.
I shall speak now to Amendment 177, which would require the rules of the DVS trust framework to be set out in regulations subject to the affirmative resolution procedure. I recognise that this amendment, and others in this group, reflect recommendations from the DPRRC. Obviously, we take that committee very seriously, and we will respond to that report in due course, but ahead of Report.
Part 2 of the Bill will underpin the DVS trust framework, a document of auditable rules, which include technical standards. The trust framework refers to data protection legislation and ICO guidance. It has undergone four years of development, consultation and testing within the digital identity market. Organisations can choose to have their services certified against the trust framework to prove that they provide secure and trustworthy digital verification services. Certification is provided by independent conformity assessment bodies that have been accredited by the UK Accreditation Service. Annual reviews of the trust framework are subject to consultation with the ICO and other appropriate persons.
Requiring the trust framework to be set out in regulations would make it hard to introduce reactive changes. For example, if a new cybersecurity threat emerged which required the rapid deployment of a fix across the industry, the trust framework would need to be updated very quickly. Developments in this fast-growing industry require an agile approach to standards and rule-making. We cannot risk the document becoming outdated and losing credibility with industry. For these reasons, the Government feel that it is more appropriate for the Secretary of State to have the power to set the rules of the trust framework with appropriate consultation, rather than for the power to be exercised by regulations.
I turn to Amendments 178 to 195, which would require the fees that may be charged under this part of the Bill to be set out in regulations subject to the negative resolution procedure. The Government have committed to growing a market of secure and inclusive digital identities as an alternative to physical proofs of identity, for those who choose to use them. Fees will be introduced only once we are confident that doing so will not restrict the growth of this market, but the fee structure, when introduced, is likely to be complex and will need to flex to support growth in an evolving market.
There are built-in safeguards to this fee-charging power. First, there is a strong incentive for the Secretary of State to set fees that are competitive, fair and reasonable, because failing to do so would prevent the Government realising their commitment to grow this market. Secondly, these fee-raising powers have a well-defined purpose and limited scope. Thirdly, the Secretary of State will explain in advance what fees she intends to charge and when she intends to charge them, which will ensure the appropriate level of transparency.
The noble Baroness, Lady Jones, asked about the arrangements for the office for digital identities and attributes. It will not initially be independent, as it will be located within the Department for Science, Innovation and Technology. As we announced in the government response to our 2021 consultation, we intend for this to be an interim arrangement until a suitable long-term home for the governing body can be identified. Delegating the role of Ofdia—as I suppose we will call it—to a third party in the future is subject to parliamentary scrutiny, as provided for by the clauses in the Bill. Initially placing Ofdia inside government will ensure that its oversight role can mature in the most effective way and that it supports the digital identity market in meeting the needs of individual users, relying parties and industry.
Digital verification services are independently certified against the trust framework rules by conformity assessment bodies. Conformity assessment bodies are themselves independently accredited by the UK Accreditation Service to ensure that they have the competence and impartiality to perform certification. The trust framework certification scheme will be accredited by the UK Accreditation Service to give confidence that the scheme can be efficiently and competently used to certify products, processes and services. All schemes will need to meet internationally agreed standards set out by the UK Accreditation Service. Ofdia, as the owner of the main code, will work with UKAS to ensure that schemes are robust, capable of certification and operated in line with the trust framework.
Amendment 184A proposes to exclude certified public bodies from registering to provide digital verification services. The term “public bodies” could include a wide range of public sector entities, including institutions such as universities, that receive any public funding. The Government take the view that this exclusion would be unnecessarily restrictive in the UK’s nascent digital identity market.
Amendment 195ZA seeks to mandate organisations to implement a non-digital form of verification in every instance where a digital method is required. The Bill enables the use of secure and inclusive digital identities across the economy. It does not force businesses or individuals to use them, nor does it insist that businesses which currently accept non-digital methods of verification must transition to digital methods. As Clause 52 makes clear, digital verification services are services that are provided at the request of the individual. The purpose of the Bill is to ensure that, when people want to use a digital verification service, they know which of the available products and services they can trust.
Some organisations operate only in the digital sphere, such as online-only banks and energy companies. To oblige such organisations to offer manual document checking would place obligations on them that would go beyond the Government’s commitment to do only what is necessary to enable the digital identity market to grow. In so far as this amendment would apply to public authorities, the Equality Act requires those organisations to consider how their services will affect people with protected characteristics, including those who, for various reasons, might not be able or might choose not to use a digital identity product.
Is the Minister saying that, as a result of the Equality Act, there is an absolute right to that analogue—if you like—form of identification if, for instance, someone does not have access to digital services?
I understand that some services are purely digital, but some of their customers may well not have a digital ID. We do not know what future services there might be, so customers might want to show an analogue ID. Is my noble friend saying that that will not be possible because it would impose too much of a burden on those innovative digital companies? Could he clarify what he said?
On this point, the argument that the Government are making is that, where consumers want to use a digital verification service, all the Bill does is to provide a mechanism for those DVSs to be certified and assured to be safe. It does not seek to require anything beyond that, other than creating a list of safe DVSs.
The Equality Act applies to the public sector space, where it needs to be followed to ensure that there is an absolute right to inclusive access to digital technologies.
My Lords, in essence, the Minister is admitting that there is a gap when somebody who does not have access to digital services needs an identity to deal with the private sector. Is that right?
In the example I gave, I was not willing to use a digital system to provide a guarantee for my son’s accommodation in the private sector. I understand that that would not be protected and that, therefore, someone might not be able to rent a flat, for example, because they cannot provide physical ID.
The Bill does not change the requirements in this sense. If any organisation chooses to provide its services on a digital basis only, that is up to that organisation, and it is up to consumers whether they choose to use it. It makes no changes to the requirements in that space.
I will now speak to the amendment that seeks to remove Clause 80. Clause 80 enables the Secretary of State to ask accredited conformity assessment bodies and registered DVS providers to provide information which is reasonably required to carry out her functions under Part 2 of the Bill. The Bill sets out a clear process that the Secretary of State must follow when requesting this information, as well as explicit safeguards for her use of the power. These safeguards will ensure that DVS providers and conformity assessment bodies have to provide only information necessary for the functioning of this part of the Bill.
My Lords, the clause stand part amendment was clearly probing. Does the Minister have anything to say about the relationship with OneLogin? Is he saying that it is only information about systems, not individuals, and that it does not feed into the OneLogin identity system that the Government are setting up?
It is very important that the OneLogin system is entirely separate and not considered a DVS. We considered whether it should be, but the view was that that comes close to mandating a digital identity system, which we absolutely want to avoid. Hence the two are treated entirely differently.
That is a good reassurance, but if the Minister wants to unpack that further by correspondence, I would be very happy to have that.
I am very happy to do so.
I turn finally to Amendments 289 and 300, which aim to introduce a criminal offence of digital identity theft. The Government are committed to tackling fraud and are confident that criminal offences already exist to cover the behaviour targeted by these amendments. Under the Fraud Act 2006, it is a criminal offence to make a gain from the use of another person’s identity or to cause or risk a loss by such use. Where accounts or databases are hacked into, the Computer Misuse Act 1990 criminalises unauthorised access to a computer program or to data held on a computer.
Furthermore, the trust framework contains rules, standards and good practice requirements for fraud monitoring and responding to fraud. These rules will further defend systems and reduce opportunities for digital identity theft.
My Lords, I am sorry, but this is a broad-ranging set of amendments, so I need to intervene on this one as well. When the Minister writes his promised letter in response to today’s proceedings, could he tell us what guidance there is to the police on this? When the individual, Mr Arron, approached the police, they said, “Oh, sorry, there’s nothing we can do; identity theft is not a criminal offence”. The Minister seems to be saying, “No, it is fine; it is all encompassed within these provisions”. While he may be saying that, and I am sure he will be shouting it from the rooftops in the future, the question is whether the police have guidance: does the College of Policing have guidance, and does the Home Office? The ordinary individual needs to know that it is exactly as the Minister says and that identity theft is covered by these other criminal offences. There is no point in having those offences if nobody knows about them.
That is absolutely fair enough: I will of course write. Sadly, we are not joined today by ministerial colleagues from the Home Office, who have some other Bill going on.
I have no doubt that its contribution to the letter will be equally enjoyable. However, for all the reasons I have set out, I am not able to accept these amendments and respectfully encourage the noble Baroness and noble Lords not to press them.
My Lords, I suppose I am meant to say that I thank the Minister for his response, but I cannot say that it was particularly optimistic or satisfying. On my amendments, the Minister said he would be responding to the DPRRC in due course, and obviously I am interested to see that response but, as the noble Lord, Lord Clement-Jones, said, the committee could not have been clearer and, I thought, made a very compelling case for why there should be some parliamentary oversight of this main code and, indeed, the fees arrangements.
I understand that it is a fast-moving sector, but the sorts of things that the Delegated Powers Committee was talking about were that the main code should contain some fundamental principles, some user rights and so on. We are not trying to spell out every sort of service that is going to be provided—as the Minister said, it is a fast-moving sector—but people need to have some trust in it and they need to know what this verification service is going to be about. Just saying that there is going to be a code on such an important area, and that the Secretary of State will write it, is simply not acceptable in terms of basic parliamentary democracy. If it cannot be done through an affirmative procedure, the Government need to come up with another way to make sure that there is appropriate parliamentary input into what is being proposed here.
On the subject of the fees, the Delegated Powers Committee and our amendment were asking only that there should be a negative SI. I thought that was perfectly reasonable on its part and I am sorry that the Minister is not even prepared to accept that perfectly reasonable suggestion. All in all, I thought that the response on that element was very disappointing.
The response was equally disappointing on the whole issue that the noble Lords, Lord Kamall and Lord Vaux, raised about the right not to have to use the digital verification schemes but to do things on a non-digital basis. The arguments about the numbers of people who are digitally excluded are well made. I was in the debate that the noble Lord referred to and, although I cannot remember the statistics now, something like 17% of the population do not have proper digital access, so we are excluding a large number of people from a whole range of services. It could be applying for jobs, accessing bank accounts or paying the rent for your son’s flat, or whatever. We are creating a two-tier system here: those who are involved, and those on the margins who cannot use a lot of the services. I would have hoped that the Government would have been much more engaged in trying to find ways through that and providing some guarantees to people.
We know that we are taking a big leap, with so many different services going online. There is a lot of suspicion about how these services are going to work and people do not trust that computers are always as accurate as we would like them to be, so they would like to feel that there is another way of doing it if it all goes wrong. It worries me that the Minister is not able to give that commitment.
I have to say that I am rather concerned by what the Minister said about the private sector—in effect, that an organisation can already require digital-only verification. Surely, in this brave new world we are going towards, we do not want a digital-only service; this goes back to the point about a whole range of people being excluded. What is wrong with saying, even to those who collect people’s bank account details to pay their son’s rent, “There is an alternative way of doing this, as well as providing all the information digitally”? I am very worried about where all this is going, including who will be part of it and who will not. If the noble Lords, Lord Kamall and Lord Vaux, wish to pursue this at a later point, I would be sympathetic to their arguments.
On identity theft, the noble Lord, Lord Clement-Jones, made a compelling case. The briefing that he read out from the Metropolitan Police said that your data is one of your most valuable assets, which is absolutely right. He also rightly made the point that this is linked to organised crime. It does not happen by accident; some major players are farming our details and using them for all sorts of nefarious activities. There is a need to tighten up the regulation and laws on this. The Minister read out where he thinks this is already dealt with under existing legislation, but we will all want to scrutinise that and see whether that really is the case. There are lots of examples of where the police have not been able to help people and do not know what their rights are, so we just need to know exactly what advice has been given to the police.
I feel that the Minister could have done more on this whole group to assure us that we are not moving towards a two-tier world. I will withdraw my amendment, obviously, but I have a feeling that we will come back to this issue; it may be something that we can talk to the Minister about before we get to Report.
My Lords, this is a very small and modest amendment, adding a fifth element to a list. Clause 85 is very long, so I will try to keep to its key elements. The clause
“confers powers on the Secretary of State and the Treasury to make provision in connection with access to customer data and business data”.
It is particularly focused on information about
“the supply or provision of goods, services and digital content”
by a business. The four elements are these. The first is where it is “supplied or provided”; the second is “prices or other terms”; the third is “how they are used”; and the fourth is “performance or quality”. That fourth element does not cover the specific issue that my modest Amendment 195A proposes to add: the energy and carbon intensity of goods, services or digital content.
This might be seen as an attempt at future-proofing, covering a fast-growing area of great consumer concern—it should be of government concern too, in the light of the Climate Change Act and the Government’s responsibilities. It would add a modest possibility. I stress that, as the explanatory statement says, this information can be required; the amendment does not demand that it has to be required, but it provides the possibility that it can be.
There is a parallel here. When you go into a shop to think about buying white goods because you need to replace a fridge or washing machine, you expect, as a matter of course, to see an energy performance certificate that will tell you how much electricity it will use or, in the case of gas cookers, how much energy. We now expect that as standard, but of course, that is focused not on what is in the appliance but on what it will use.
The other obvious example is energy performance certificates in relation to housing. Again, that is something that could probably be considerably improved, but there has been some step towards thinking about issues around energy use rather than what is put in. In that context of building, we are seeing a great deal of focus—and, increasingly, a great deal of planning focus—on the issue of embodied carbon in buildings. This is taking that further, in terms of goods, services and digital provision.
Perhaps the obvious reason why a future Government might want to do this is that, across the many areas of so-called green rating in environmental standards, we have seen a profusion of different standards, labels and models. That has caused considerable confusion and uncertainty for consumers. If a Government were to say that this was the kind of step to be used, it would give a standard to apply across the digital field that would be clearly understood and not open to gaming by bad actors simply creating their own standards.
Take, for example, the Mintel sustainability barometer—a global study, but one that is reflective, I think, of what is happening in the UK. Consumers are increasingly demanding this information; they really want to know the environmental impact of whatever they are purchasing, including the impact of its production.
The other thing that I would point to in terms of this future-proofing approach is the OECD’s Inclusive Forum on Carbon Mitigation Approaches. That is rather a mouthful. In February, it put out a study entitled—another mouthful—Towards more accurate, timely, and granular product-level carbon intensity metrics: A Scoping Note. That makes it clear that we are talking here about something that is for the future; something that is being developed, but developed fast. If we think about the Government’s responsibilities within the Climate Change Act and the public desire, this modest addition, providing the legislative framework for future action, is a small positive step. I beg to move.
My Lords, I shall speak to Amendment 218, which is in my name and those of the right reverend Prelate the Bishop of Oxford and the noble Baroness, Lady Parminter. I thank them for their support.
I apologise to the Minister, because I think this amendment is typical of the way in which we will increasingly see environmental and particularly climate change issues popping up in Bills that belong not to Defra, DESNZ or DLUHC but to other departments. Because so many economic and other activities have an impact on these issues, that will be a pattern for future Bills. He is playing on unfamiliar turf on this one, I am sure, so I sympathise with him.
“This amendment would require Ministers and public authorities, such as regulators”
when they make significant announcements about policy change, to disclose any analysis they have done of the
“impact of announcements … on UK climate change mitigation targets, adaptation to climate impacts and nature targets”.
The sorts of announcements that this amendment refers to include the introduction of primary legislation, obviously; changes to the timing, level and scope of government targets; large public sector procurement contracts; big infrastructure spending commitments; and any other policies that have the potential to have significant impact on climate and nature targets and climate change adaptation.
I firmly believe, and I have the support of the clerks, that this accords with the provision in the Long Title of the Bill
“to make provision about the disclosure of information to improve public service delivery”.
The information disclosed has to be accurate, timely and machine-readable. The Secretary of State would give guidance on the format of that disclosure following wide consultation with those involved, especially across all departments, because it will be an issue that involves all departments.
So why is the amendment needed? At the moment, the Government are required to publish a whole load of reports on environmental impacts but many of them are periodic, or possibly only annual and high level. For example, the Government are required to publish periodic high-level delivery plans on net zero under Sections 13 and 14 of the Climate Change Act. However, these leave unquantified many emissions savings and they are not revised at all when policies change.
The Government recently decided to delay the date of a ban on new fossil fuel cars and vans; to delay the proposed ban on further installation of oil, LPG and coal heating systems; and to delay the rollout of the clean heat market mechanism. The Government failed to report any greenhouse gas impacts from these measures, which were pretty substantial announcements. Indeed, the Secretary of State for DESNZ argued that it would not be appropriate, or a requirement, to update and publish a revised version of the carbon budget delivery plan every time there was a change in policy. That is not what this amendment argues for; it simply reflects the expectation that, when such significant announcements are made, the Government will have looked at what their impact on climate change would be.
The amendment would simply require the Government to publish any impact analysis that they have done or to publish the fact that they have not done any such analysis—one can draw one’s own conclusions from the fact that they have not done so. The Environmental Audit Committee in the other place, around the time of the announcements of which I gave examples, went so far as to challenge the Prime Minister to provide clarity on how the Government intended to fill the emission reduction gap caused by the proposed rollback of existing policies, and did not get a satisfactory answer.
There are similar current arrangements for reports on adaptation and resilience to climate change. Sections 56 and 58 of the Climate Change Act require, again, periodic reporting at a high level on adaptation to climate change. That legislation has not been updated when policies have changed. As far as the introduction of new legislation is concerned, Section 20 of the Environment Act requires a statement on environmental law by government when there is environmental content in any new Bill. However, we already know from bitter experience that the Government interpret “environmental content” rather tightly.
All but one of the 28 Bills considered by Parliament in this current Session stated that they did not contain environmental law at all, whereas we can see that several of them have a clear environmental impact. For example, the Economic Activity of Public Bodies (Overseas Matters) Bill—I should be talking now about an amendment on it across the way, as indeed should the noble Baroness, Lady Bennett—could prevent public bodies from taking important environmental matters into account in their decision-making. However, at the time of that Bill being published, it was certified by Ministers as not containing any environmental law.
Currently, the Government publish impact assessments for new legislation, including environmental impact assessments where the proposals are expected to have an environmental impact. Again, this is interpreted very tightly by the Government. Of the 28 government Bills that we have considered in this Session, 24 reported negligible impact, zero impact or being not applicable in the greenhouse gas box of the appraisal form—or the whole box was left blank. No account was available of the evidence on which such ratings of no impact were based, because we did not then get any environmental impact assessment. To give one example: the Offshore Petroleum Licensing Bill simply reported that impacts were not quantified, which is pretty staggering, bearing in mind the clear environmental implications of that Bill. One would think that licensing additional petroleum extraction from the North Sea has some environmental ramifications.
We have talked about climate change impacts and adaptation impacts, and we have talked about legislation. With regard to public procurement, the Government and contracting authorities are not required to publish the greenhouse gas emissions associated with individual procurement contracts. We argued that one in the Procurement Bill and failed to get any movement. There is a procurement policy note guiding government departments to seek emissions reduction plans from the firms that they are contracting with, but this is a non-statutory note—it is advice only—and it covers only the contracting companies’ own operations, not the emissions impact of the products or services being contracted for.
This is a slightly disparate group of amendments. I have added my name in support of Amendment 296, tabled by the noble Baroness, Lady Jones of Whitchurch, which once again probes the question of whether this Bill risks causing the loss of the data adequacy ruling from the EU. This was an issue raised by many, if not most, noble Lords during Second Reading, and it is an area in which the Government’s position feels a little complacent.
The data adequacy ruling from the EU is extremely important, as the impact assessment that accompanies the Bill makes clear. It says:
“Cross-border data transfers are a key facilitator of international trade, particularly for digitised services. Transfers underpin business transactions and financial flows. They also help streamline supply chain management and allow business to scale and trade globally”.
The impact assessment then goes on to estimate the costs of losing data adequacy, and indicates a net present value cost range of between £1.6 billion and £3.4 billion over the next 10 years. As an aside, I note that that is a pretty wide range, which perhaps indicates the extent to which the costs are really understood.
The impact assessment notes that these numbers are the impact on direct trade only and that the impact may be larger still when considering supply chain impacts, but it does not make any attempt to calculate that effect. There are big potential costs, however we look at it. It therefore seems extraordinary that the impact assessment, despite running to 240 pages, makes no attempt at all to quantify the probability that the EU might decide—and it is a unilateral EU decision—to withdraw the data adequacy ruling, which it can do at any time, even before the current ruling comes to an end in July 2025. You would think that, if the Government were as confident as they say they are, they would have some evidence as to the probability of that happening.
Noble Lords should be aware that this means that the potential cost of the loss of data adequacy is not included in the NPV analysis for the Bill. If that loss did occur, the net present value of the Bill would be largely wiped out, and if the higher end of the IA range is taken, the Bill’s overall financial impact becomes a net present cost to the tune of £2.1 billion. The retention of the EU data adequacy ruling is therefore key to retaining any real benefit from this Bill at all.
On Monday, the Minister said:
“We believe they are compatible with maintaining our data adequacy decisions from the EU. We have maintained a positive, ongoing dialogue with the EU to make sure that our reforms are understood. We will continue to engage with the European Commission at official and ministerial levels with a view to ensuring that our respective arrangements for the free flow of personal data can remain in place, which is in the best interests of both the UK and the EU”.—[Official Report, 15/4/24; col. GC 261.]
By “they”, he means the measures in the Bill. So far, so good. But your Lordships will remember that, at the time of Brexit, there was actually considerable doubt as to whether we would be granted a data adequacy ruling, even though our rules were then almost entirely convergent. This Bill increases divergence, so the approach at the moment seems complacent at best.
I do not think it is any surprise at all that our European Affairs Committee recently launched an inquiry into this very subject. While the Minister has said how confident he is, noises being made in the EU are less encouraging. For example, the chair of the European Parliament’s Civil Liberties, Justice and Home Affairs Committee wrote in February to the European Commissioner for Justice outlining his concerns about this Bill and questioning whether it will meet the requirements of “essential equivalence”, which is the test that we have to meet. He highlighted, in particular, the lack of independence of the Information Commissioner’s Office, and the elimination of the Biometrics and Surveillance Camera Commissioner, something we will come on to a little later.
It does not seem to be a given that data adequacy will be retained, despite the frankly rather woolly assurances from the Minister about his confidence. Given the enormous importance of the data adequacy ruling, and the fact that the impact assessment makes no attempt at all to assess the probability of retaining or losing it—something one would think to be really fundamental when deciding the extent of divergence we wish to follow—it must make sense to introduce the assessment proposed in Amendment 296. In the absence of something much stronger than the assurances the Minister has given so far, I urge the noble Baroness, Lady Jones, to return to this matter on Report: it is really fundamental.
My Lords, this group has three amendments within it and, as the noble Lord, Lord Vaux, said, it is a disparate group. The first two seem wholly benign and entirely laudable, in that they seek to ensure that information about the environmental impacts of business-related data is disclosed and shared. The noble Baroness, Lady Bennett, said hers was a small and modest amendment: I agree entirely with that, but it is valuable nevertheless.
If I had to choose which amendment I prefer, it would be the second, in the name of my noble friend Lady Young, simply because it is more comprehensive and seems to be of practical value in pursuing policy objectives related to climate change mitigation. I cannot see why the disclosure of an impact analysis of current and future announcements, including legislation, changes in targets and large contracts, on UK climate change mitigation targets would be a problem. I thought my noble friend was very persuasive and her arguments about impact assessment were sound. The example of offshore petroleum legislation effectively not having an environmental impact assessment when its impacts are pretty clear was a very good one indeed. I am one of those who believe that environmental good practice should be written all the way through, a bit like a stick of Brighton rock, and that applies to legislation too. It is important that we take on board that climate change is the most pressing issue that we face for the future.
The third amendment, in the name of my noble friend Lady Jones, is of a rather different nature, but is no less important, as it relates to the UK’s data adequacy and the EU’s decisions on it. We are grateful to the noble Lords, Lord Vaux of Harrowden and Lord Clement-Jones, for their support. Put simply, it would oblige the Secretary of State to complete an assessment, within six months of the Bill’s passing,
“of the likely impact of the Act on the EU’s data adequacy decisions relating to the UK”.
It would oblige the Secretary of State to lay a report on the assessment’s findings, and the report must cover data risk assessments and the impact on SMEs. It must also include an estimate of the legislation’s financial impact. The noble Lord, Lord Vaux, usefully underlined the importance of this, with its critical 2025 date. The amendment also probes
“whether the Government anticipate the provisions of the Bill conflicting with the requirements that need to be made by the UK to maintain a data adequacy decision by the EU”.
There is widespread and considerable concern about data adequacy and whether the UK legislative framework diverges too far from the standards that apply under the EU GDPR. The risk that the UK runs in attempting to reduce compliance costs for the free flow of personal data is that safeguards are removed to the point where businesses and trade become excessively concerned. In summary, many sectors including manufacturing, retail, health, information technology and particularly financial services are concerned that the free flow of data between us and the EU, with minimal disruption, will simply not be able to continue.
As the noble Lord, Lord Vaux, underlined, it is important that we in the UK have a relationship of trust with the European Commission on this, although ultimately data adequacy could be tested in the Court of Justice of the European Union. Data subjects in the EU can rely on the general principle of the protection of personal data to invalidate EU secondary and domestic law conflicting with that principle. Data subjects can also rely on the Charter of Fundamental Rights to bring challenges. Both these routes were closed off when the UK left the EU and the provisions were not saved in UK law, so it can be argued that data protection rights are already at a lower standard than across the European Union.
It is worth acknowledging that adequacy does not necessarily require equivalence. We can have different, and potentially lower, standards than the EU but, as long as those standards are deemed to meet whatever criteria the Commission chooses to apply, that is all to the good.
However, while divergence is possible, the concern that we and others have is that the Bill continues chipping away at standards in too many different ways. This chipping away is also taking place in statutory instruments, changes to guidance and so on. If His Majesty’s Government are satisfied that the overall picture remains that UK regulation is adequate, that is welcome, but it would be useful to know what mechanism DSIT and the Government generally intend to use to measure where the tipping point might be reached and how close these reforms take us to it.
The Committee will need considerable reassurance on the question of data adequacy, not least because of its impact on businesses and financial services in the longer term. At various times, the Minister has made the argument that a Brexit benefit is contained within this legislation. If he is ultimately confident of that case, what would be the impact on UK businesses if that assessment is wrong in relation to data adequacy decisions taken within the EU?
We are going to need more than warm words and a recitation that “We think it’s right and that we’re in the right place on data adequacy”. We are going to need some convincing. Whatever the Minister says today, we will have to return to this issue on Report. It is that important for businesses in this country and for the protection of data subjects.
My Lords, these amendments have been spoken to so well that I do not need to spend a huge amount of time repeating those great arguments. Both Amendment 195A, put forward by the noble Baroness, Lady Bennett, and Amendment 218 have considerable merit. I do not think that they conflict; they are complementary, in many respects.
Awareness raising is important to this, especially in relation to Amendment 218. For instance, if regulators are going to have a growth duty, which looks like it is going to happen, why not have countervailing duties relating to climate change, as the noble Baroness, Lady Young, put forward so cogently as part of Amendment 218? Amendment 195A also has merit in raising awareness in the private sector, among traders and so on.
My Lords, I thank the noble Baronesses, Lady Bennett, Lady Young of Old Scone and Lady Jones, for their proposed amendments on extending the definition of business data in smart data schemes, the disclosure of climate and nature information to improve public service delivery and the publication of an EU adequacy risk assessment.
On Amendment 195A, we consider that information about the carbon and energy intensity of goods, services or digital content already falls within the scope of “business data” as information about goods, services and digital content supplied or provided by a trader. Development of smart data schemes will, where relevant, be informed by—among other things—the Government’s Environmental Principles Policy Statement, under the Environment Act 2021.
With regard to Amendment 218, I thank the noble Baroness, Lady Young of Old Scone, for her sympathies; they are gratefully received. I will do my best in what she correctly pointed out is quite a new area for me. The powers to share information under Part 5 of the Digital Economy Act 2017—the DEA—are supplemented by statutory codes of practice. These require impact assessments to be carried out, particularly for significant changes or proposals that could have wide-ranging effects on various sectors or stakeholders. These impact assessments are crucial for understanding the implications of the Digital Economy Act and ensuring that it achieves its intended objectives, while minimising any negative consequences for individuals, businesses and society as a whole. As these assessments already cover economic, social and environmental impact, significant changes in approach are already likely to be accounted for. This is in addition to the duty placed on Ministers by the Environment Act 2021 to have due regard to the Environmental Principles Policy Statement.
Lastly, turning to Amendment 296, the Government are committed to maintaining their data adequacy decisions from the EU, which we absolutely recognise play a pivotal role in enabling trade and fighting crime. As noble Lords alluded to, we maintain regular engagement with the European Commission on the Bill to ensure that our reforms are understood.
The EU adequacy assessment of the UK is, of course, a unilateral, autonomous process for the EU to undertake. However, we remain confident that our reforms deliver against UK interests and are compatible with maintaining EU adequacy. As the European Commission itself has made clear, a third country—the noble Lord, Lord Clement-Jones, alluded to this point—is not required to have the same rules as the EU to be considered adequate. Indeed, 15 countries have EU adequacy, including Japan, Israel and the Republic of Korea. All these nations pursue independent and, often, more divergent approaches to data protection.
The Government will provide both written and oral evidence to the House of Lords European Affairs Committee inquiry on UK-EU data adequacy and respond to its final report, which is expected to be published in the summer. Many expert witnesses have already provided evidence to the committee and stated that they believe the Bill is compatible with maintaining adequacy.
As noble Lords have noted, the Government have published a full impact assessment alongside the Bill, which sets out in more detail what both the costs and financial benefits of the Bill would be—including in the unlikely scenario of the EU revoking the UK’s adequacy decision. I also note that UK adequacy is good for the EU too: every EU company, from multinationals to start-ups, with customers, suppliers or operations in the UK relies on EU-UK data transfers. Leading European businesses and organisations have consistently emphasised the importance of maintaining these free flows of data to the UK.
For these reasons, I hope that the noble Baronesses will agree to withdraw or not move these amendments.
The Minister made the point at the end there that it is in the EU’s interest to agree to our data adequacy. That is an important point, but is that what the Government are relying on—the fact that it is in the EU’s interest as much as ours to continue to agree to our data adequacy provisions? If so, what the Minister has said does not make me feel more reassured. If the Government are relying on just that, it is not a particularly strong argument.
Before the Minister stands up, let me just say that I absolutely agree with what the noble Lord, Lord Bassam, said. Have the Government taken any independent advice? It is easy to get wrapped up in your own bubble. The Government seem incredibly blithe about this Bill. You only have to have gone through our days in this Committee to see the fundamental changes that are being made to data protection law, yet the Government, in this bubble, seem to think that everything is fine despite the warnings coming from Brussels. Are they taking expert advice from outside? Do they have any groups of academics, for instance, who know about this kind of thing? It is pretty worrying. The great benefit of this kind of amendment, put forward by the noble Baroness, Lady Jones, is that nothing would happen until we were sure that we were going to be data adequate. That seems a fantastic safeguard to me. If the Government are just flying blind on this, we are all in trouble, are we not?
My Lords, may I point out that, on the interests of the EU, they do not go just one way? There is a question around investment as well. For example, any large bank that is currently running a data-processing facility in this country that covers the whole of Europe may decide, if we lose data adequacy, to move it to Europe. Anyone considering setting up such a facility would probably go for Europe rather than here. There is therefore an investment draw for the EU here.
I do not know what I could possibly have said to create the impression that the Government are flying blind on this matter. We continue to engage extensively with the EU at junior official, senior official and ministerial level in order to ensure that our proposed reforms are fully understood and that there are no surprises. We engage with multiple expert stakeholders from both the EU side and the UK side. Indeed, as I mentioned earlier, a number of experts have submitted evidence to the House’s inquiry on EU-UK data adequacy and have made clear their views that the DPDI reforms set out in this Bill are compatible with EU adequacy. We continue to engage with the EU throughout. I do not want to be glib or blithe about the risks; we recognise the risks but it is vital—
Could we have a list of the people the noble Lord is talking about?
Yes. I would be happy to provide a list of the people we have spoken to about adequacy; it may be a long one. That concludes the remarks I wanted to make, I think.
Perhaps the Minister could just tweak that a bit by listing not just the people who have made positive noises but those who have their doubts.
My Lords, I thank the Minister for his answer. This has been a fairly short but fruitful debate. We can perhaps commend the Minister for his resilience, although it feels like he was pounded back on the ropes a few times along the way.
I will briefly run through the amendments. I listened carefully to the Minister, although I will have to read it back in Hansard. I think he was trying to say that my Amendment 195A, which adds energy and carbon intensity to this list, is already covered. However, I really cannot see how that can be claimed to be the case. The one that appears to be closest is sub-paragraph (iv), which refers to “performance or quality”, but surely that does not include energy and carbon intensity. I will consider whether to come back to this issue.
The noble Baroness, Lady Young of Old Scone, presented a wonderfully clear explanation of why Amendment 218 is needed. I particularly welcome the comments from the noble Lord, Lord Bassam, expressing strong Labour support for this. Even if the Government do not see the light and include it in the Bill, I hope that the noble Lord’s support can be taken as a commitment that a future Labour Government intend to follow that practice in all their approaches.
I hope that the noble Baroness does not get too carried away on that one.
I am sure that we will revisit this at some point in future. Perhaps the noble Lord will like the fact that I am saying that it is certain that we will revisit it from a different place.
These are all really serious amendments. This is a long Committee stage but, within the whole issue of data, data adequacy is absolutely crucial, as the number of interventions on the Minister indicated. The Green Party’s position is that we want to be rejoin-ready: we want to remain as close as possible to EU standards so that we can rejoin the EU as soon as possible.
Even without taking that approach, this is a crucial issue as so many businesses are reliant on this adequacy ruling. I was taken by a comment from the Minister, who said that the UK is committed to data adequacy. The issue here is not what the UK is saying but convincing the EU, which is not in our hands or under our control, as numerous noble Lords said.
I have no doubt that we will return to data adequacy and I hope that we will return to the innovative and creative intervention from the noble Baroness, Lady Young of Old Scone. In the meantime, I beg leave to withdraw Amendment 195A.
My Lords, for the convenience of the Committee and in view of the forthcoming votes, I think it would be helpful to pause here and return after the two votes have taken place. Is that agreeable?
My Lords, I would much rather not. We are due to end at 8.15 pm and I should like to hold to that. We seem to have some while before anything is going to happen. Shall we not just make progress?
All right, we shall make as much progress as we can.
Amendment 197A
My Lords, it is a pleasure to take part in today’s Committee proceedings. I declare my technology interests as an adviser to Boston Limited. It is self-evident that we have been talking about data but there could barely be a more significant piece of data than biometrics. In moving the amendment, I shall speak also to Amendments 197B and 197C, and give more than a nod to the other amendments in this group.
When we talk about data, it is always critical that we remember that it is largely our data. There could be no greater example of that than biometrics. More than data, they are parts and fragments of our very being. This is an opportune moment in the debate on the Bill to strengthen the approach to the treatment and the use of biometrics, not least because they are being increasingly used by private entities. That is what Amendments 197A to 197C are all about—the establishment of a biometrics office, a code of practice and oversight, and sanctions and fines to boot. This is of that level of significance. The Bill should have that strength when we are looking at data that is such a significant part of our very being, and at its protection.
Amendment 197B looks at reporting and regulatory requirements, and Amendment 197C at the position of entities that were already active in the biometrics space prior to the passage of the Bill. In short, it is very simple. The amendments take principles that run through many elements of data protection and ensure that we have a clear statement on the use and deployment of biometrics in the Bill. There could be no more significant pieces of data. I look forward to the Minister’s response. I thank the Ada Lovelace Institute for its help in drafting the amendments, and I look forward to the debate on this group. I beg to move.
My Lords, I have added my name in support of the stand part notices of the noble Lord, Lord Clement-Jones, to Clauses 147, 148 and 149. These clauses would abolish the office of the Biometrics and Surveillance Camera Commissioner, along with the surveillance camera code of practice. I am going to speak mainly to the surveillance camera aspect, although I was taken by the speech of the noble Lord, Lord Holmes, who made some strong points.
The UK has become one of the most surveilled countries in the democratic world. There are estimated to be over 7 million CCTV cameras in operation. I give one example: the automated number plate recognition (ANPR) system records between 70 million and 80 million readings every day. Every car is recorded on average about three times a day. The data is held for two years. The previous Surveillance Camera Commissioner, Tony Porter, said about ANPR that it,
“must surely be one of the largest data gatherers of its citizens in the world. Mining of meta-data—overlaying against other databases can be far more intrusive than communication intercept”.
Professor Sampson, the previous commissioner, said about ANPR:
“There is no ANPR legislation or act, if you like. And similarly, there is no governance body to whom you can go to ask proper questions about the extent and its proliferation, about whether it should ever be expanded to include capture of other information such as telephone data being emitted by a vehicle or how it's going to deal with the arrival of automated autonomous vehicles”.
And when it came to independent oversight and accountability, he said:
“I’m the closest thing it’s got—and that’s nothing like enough”.
I am not against the use of surveillance cameras per se—it is unarguable that they are a valuable tool in the prevention and detection of crime—but there is clearly a balance to be found. If we chose to watch everything every person does all of the time, we could eliminate crime completely, but nobody would argue that that was desirable. We can clearly see how surveillance and biometrics can be misused by states that wish to control their populations—just look at China. So there is a balance to be found between the protection of the public and intrusion into privacy.
Technology is moving incredibly rapidly, particularly with the ever-increasing capabilities of AI. As technology changes, so that balance between protection and privacy may also need to change. Yet Clause 148 will abolish the only real safeguards we have, and the only governance body that keeps an eye on that balance. This debate is not about where that balance ought to be; it is about making sure that there is some process to ensure that the balance is kept under independent review at a time when surveillance technologies and usage are developing incredibly rapidly.
I am sure that the Minister is going to argue that, as he said at Second Reading:
“Abolishing the Surveillance Camera Commissioner will not reduce data protection”.—[Official Report, 19/12/23; col. 2216.]
He is no doubt going to tell us that the roles of the commissioner will be adequately covered by the ICO. To be honest, that completely misses the point. Surveillance is not just a question of data protection; it is a much wider question of privacy. Yes, the ICO may be able to manage the pure data protection matters, but it cannot possibly be the right body to keep the whole question of surveillance and privacy intrusion, and the related technologies, under independent review.
It is also not true that all the roles of the commissioner are being transferred to other bodies. The report by the Centre for Research into Surveillance and Privacy, or CRISP, commissioned by the outgoing commissioner, is very clear that a number of important areas will be lost, particularly reviewing the police handling of DNA samples, DNA profiles and fingerprints; maintaining an up-to-date surveillance camera code of practice with standards and guidance for practitioners and encouraging compliance with that code; setting out technical and governance matters for most public body surveillance systems, including how to approach evolving technology, such as AI-driven systems including facial recognition technology; and providing guidance on technical and procurement matters to ensure that future surveillance systems are of the right standard and purchased from reliable suppliers. It is worth noting that it was the Surveillance Camera Commissioner who raised the issues around the use of Hikvision cameras, for example—not something that the ICO is likely to be able to do. Finally, we will also lose the commissioner providing reports to the Home Secretary and Parliament about public surveillance and biometrics matters.
Professor Sampson said, before he ended his time in office as commissioner:
“The lack of attention being paid to these important matters at such a crucial time is shocking, and the destruction of the surveillance camera code that we’ve all been using successfully for over a decade is tantamount to vandalism”.
He went on to say:
“It is the only legal instrument we have in this country that specifically governs public space surveillance. It is widely respected by the police, local authorities and the surveillance industry in general … It seems absolutely senseless to destroy it now”.
The security industry does not want to see these changes either, as it sees the benefits of having a clear code. The Security Systems and Alarms Inspection Board said:
“Without the Surveillance Camera Commissioner you will go back to the old days when it was like the ‘wild west’, which means you can do anything with surveillance cameras so long as you don’t annoy the Information Commissioner … so, there will not be anyone looking at new emerging technologies, looking at their technical requirements or impacts, no one thinking about ethical implications for emerging technologies like face-recognition, it will be a free-for-all”.
The British Security Industry Association said:
“We are both disappointed and concerned about the proposed abolition of the B&SCC. Given the prolific emergence of biometric technologies associated with video surveillance, now is a crucial time for government, industry, and the independent commissioner(s) to work close together to ensure video surveillance is used appropriately, proportionately, and most important, ethically”.
I do not think I can put it better than that.
While there may be better ways to achieve the appropriate safeguards than the current commissioner arrangement, this Bill simply abolishes everything that we have now and replaces the safeguards only partially, and only from a data protection perspective. I am open to discussion about how we might fill the gaps, but the abolition currently proposed by the Bill is a massively retrograde and even dangerous step, removing the only safeguards we have against the uncontrolled creep towards ever more intrusive surveillance of innocent people. As technology increases the scope for surveillance, this must be the time for greater safeguards and more independent oversight, not less. The abolition of the commissioner and code should not happen unless clear, better safeguards are established to replace them, and this Bill simply does not do that.
My Lords, I want to speak briefly in support of, first, the amendments in the name of my noble friend Lord Holmes, which would recreate the office of the Biometrics and Surveillance Camera Commissioner.
As I have done on a number of occasions, I shall tell a short story; it is about the Human Fertilisation and Embryology Authority. Noble Lords may wonder why I am starting there. I remember very clearly one of the first debates that I participated in when I was at university—far too long ago. It was at the Oxford Union, and Dame Mary Warnock came to speak about what was then a highly contentious use of new technology. In this country, we had that debate early; we established an authority to oversee what are very complex scientific and ethical issues. It has remained a settled issue in this country, and that has enabled many families to bear children, bringing life and joy to people in a safe way.
This data issue is quite similar, I think. Other countries did not have that early debate, which I remember as a teenager, and did not establish a regulator in the form of the HFEA. I point to the US, which was torn apart by those very issues. As the noble Lord, Lord Vaux, has just set out, the public are very concerned about the use of biometric data. This is an issue that many sci-fi novels and films have been made about, because it preys on our deepest fears. I think that technology can be hugely valuable to society, but only if we build and maintain trust in it. In order to do that, you need consistent, long-standing, expert regulation.
Like the noble Lord, Lord Vaux, I do not understand why the changes that this Bill brings will make things better. It narrows the scope of protection to data protection only when, actually, the issues are much broader, much subtler and much more sophisticated. For that reason and that reason alone, I think that we need to remove these clauses and reinstate the regulator that exists today.
My Lords, I find myself in a fortunate position: we have made progress fast enough to enable me to go from one end of the Room to the other and play a modest part in this debate. I do so because, at an earlier stage, I identified the amendments tabled by the noble Lord, Lord Holmes, and I very much wish to say a few words in support of them.
Reference has already been made to the briefing that we have had from CRISP. I pay tribute to the authors of that report—I do not need to read long chunks of it into the record—and am tempted to follow the noble Lord in referring to both of them. I sometimes wonder whether, had their report been officially available before the Government drafted the Bill, we would find ourselves in the position we are now in. I would like to think that that would have had an effect on the Government’s thinking.
When I first read about the Government’s intention to abolish the post of the Biometrics and Surveillance Camera Commissioner, I was concerned, but I am not technically adept enough to know about it in detail. I am grateful for the advice that I have had from CRISP and from Professor Michael Zander, a distinguished and eminent lawyer who is a Professor Emeritus at LSE. I am grateful to him for contacting me about this issue. I want to make a few points on his behalf and on CRISP’s.
In the short time available to me, this is the main thing I want to say. The Government argue that abolishing these joint roles will
“reduce duplication and simplify oversight of the police use of biometrics”.
Making that simpler and rationalising it is at the heart of the Government’s argument. It sounds as if this is merely a tidying-up exercise, but I believe that that is far from the case. It is fair to accept that the current arrangements for the oversight of public surveillance and biometric techniques are complex, but a report published on 30 October, to which noble Lords’ attention has already been drawn, makes a powerful case that what the Government intend to do will result in losses that are a great deal more significant than the problems caused by the complexity of the present arrangements. That is the paper’s argument.
The report’s authors, who produced a briefing for Members’ use today, have presented a mass of evidence and provided an impressively detailed analysis of the issues. The research underpinning the report includes a review of relevant literature, interviews with leading experts and regulators—
My Lords, there is a Division in the Chamber. There are two votes back to back so the Committee will just come back as and when.
I do not have the benefit of seeing a Hansard update to know after which word I was interrupted and we had to leave to vote, so I will just repeat, I hope not unduly, the main point I was making at the time of the Division. This was that the central conclusion of the CRISP report is that the Government’s policy
“generates significant gaps in the formal oversight of biometrics and surveillance practices in addition to erasing many positive developments aimed at raising standards and constructive engagement with technology developers, surveillance users and the public”.
The reason I am very glad to support the noble Lord, Lord Holmes, in these amendments is that, given the complexities of the current regulatory landscape and the protections offered by the BSCC in an era of increasingly intensive, advanced and intrusive surveillance, the abolition of the BSCC leaves these oversight gaps while creating additional regulatory complexity. I will be interested to see how the Minister defends the fact that this abolition is supposed to improve the situation.
I do not want to detain the Committee for very long, but I shall just read this one passage from the report into the record, because it is relevant to the debate we are having. We should not remove
“a mechanism for assuring Parliament and the public of appropriate surveillance use, affecting public trust and legitimacy at a critical moment concerning public trust in institutions, particularly law enforcement. As drafted, the Bill reduces public visibility and accountability of related police activities. The lack of independent oversight becomes amplified by other sections of the Bill that reduce the independence of the current Information Commissioner role”.
In short, I think it would be a mistake to abolish the biometrics commissioner, and on that basis, I support these amendments.
My Lords, it has been a pleasure to listen to noble Lords’ speeches in this debate. We are all very much on the same page and have very much the same considerations in mind. Everyone has mentioned both the protection of biometric data itself and the means by which we regulate and oversee its use. We may have slightly different paths to making sure we have that protection and oversight, but we all have the same intentions.
The noble Lord, Lord Holmes, pointed to the considerable attractions of, in a sense, starting afresh, but I have chosen a rather different path. I think it was the noble Lord, Lord Vaux, who mentioned Fraser Sampson, the former Biometrics and Surveillance Camera Commissioner. I must admit that I have very high regard for the work he did, and also for the work of such people as Professor Peter Fussey of Essex University. Of course, a number of noble Lords have mentioned the work of CRISP in all this, which kept us very well briefed on the consequences of these clauses.
No one has yet spoken to the stand part notices on Clauses 130 to 132; I will come on to those on Clauses 147 to 149 shortly. The Bill would drastically change the way UK law enforcement agencies can handle biometric personal data. Clauses 130 to 132 would allow for data received from overseas law enforcement agencies to be stored in a pseudonymised, traceable format indefinitely.
For instance, Clause 130 would allow UK law enforcement agencies to hold biometric data received from overseas law enforcement agencies in a pseudonymised format. In cases where the authority ceases to hold the material pseudonymously and the individual has no previous convictions or only one exempt conviction, the data may be retained in a non-pseudonymous format for up to three years. Therefore, the general rule is indefinite retention with continuous pseudonymisation, except for a specific circumstance where non-pseudonymised retention is permitted for a fixed period. I forgive noble Lords if they have to read Hansard to make total sense of that.
This is a major change in the way personal data can be handled. Permitting storage of pseudonymised or non-pseudonymised data will facilitate a vast biometric database that can be traced back to individuals. Although this does not apply to data linked to offences committed in the UK, it sets a concerning precedent for reshaping how law enforcement agencies hold data in a traceable and identifiable way. It seems that there is nothing to stop a law enforcement agency pseudonymising data just to reattach the identifying information, which it would then be permitted to hold for three years.
The clauses do not explicitly define the steps that must be taken to achieve pseudonymisation. This leaves a broad scope for interpretation and variation in practice. The only requirement is that the data be pseudonymised
“as soon as reasonably practicable”,
which is a totally subjective threshold. The collective impact of these clauses, which were a late addition to the Bill on Report in the Commons, is deeply concerning. We believe that these powers should be withdrawn to prevent a dangerous precedent being set for police retention of vast amounts of traceable biometric data.
The stand part notices on Clauses 147 to 149 have been spoken to extremely cogently by the noble Lord, Lord Vaux, the noble Viscount, Lord Stansgate, and the noble Baroness, Lady Harding. I will not repeat a great deal of what they said, but what the noble Baroness, Lady Harding, said about the Human Fertilisation and Embryology Authority really struck a chord with me. When we had our Select Committee on Artificial Intelligence, we looked at models for regulation and how to gain public trust for new technologies and concepts. The report that Baroness Warnock produced into fertilisation and embryology was an absolute classic and an example of how to gain public trust. As the noble Baroness, Lady Harding, said, it has stood the test of time. As far as I am concerned, gaining that kind of trust is the goal for all of us.
What we are doing here risks precisely the reverse by abolishing the office of the Biometrics and Surveillance Camera Commissioner. This was set up under the Protection of Freedoms Act 2012, which required a surveillance camera commissioner to be appointed and a surveillance camera code of practice to be published. Other functions of the Biometrics and Surveillance Camera Commissioner are in essence both judicial and non-judicial. They include developing and encouraging compliance with the surveillance camera code of practice; raising standards for surveillance camera developers, suppliers and users; public engagement; building legitimacy; reporting annually to Parliament via the Home Secretary; convening expertise to support these functions; and reviewing all national security determinations and other powers by which the police can retain biometric data. The Bill proposes to erase all but one—I stress that—of these activities.
The noble Lord, Lord Vaux, quoted CRISP. I will not repeat the quotes he gave but its report, which the noble Viscount, Lord Stansgate, also cited, warns that
“plans to abolish and not replace existing safeguards in this crucial area will leave the UK without proper oversight just when advances in artificial intelligence (AI) and other technologies mean they are needed more than ever”.
The Bill’s reduction of surveillance-related considerations to data protection compares unfavourably to regulatory approaches in other jurisdictions. Many have started from data protection and extended it to cover the wider rights-based implications of surveillance. Here, the Bill proposes a move in precisely the opposite direction. I am afraid this is yet another example of the Bill going entirely in the wrong direction.
My Lords, I thank all noble Lords who have contributed to what has been an excellent debate on this issue. We have all been united in raising our concerns about whether the offices of the biometrics commissioner and the surveillance camera commissioner should be abolished. We all feel the need for more independent oversight, not less, as is being proposed here.
As we know, the original plan was for the work of the biometrics commissioner to be transferred to the Information Commissioner, but when he raised concerns that this would result in the work receiving less attention, it was decided to transfer it to the Investigatory Powers Commissioner instead. Meanwhile, the office of the surveillance camera commissioner is abolished on the basis that these responsibilities are already covered elsewhere. However, like other noble Lords, we remain concerned that the transfer of this increasingly important work from both commissioners will mean that it does not retain the same level of expertise and resources as it enjoys under the current regime.
These changes have caused some alarm among civil society groups such as the Ada Lovelace Institute and the Centre for Research into Information Surveillance and Privacy, to which noble Lords have referred. They argue that we are experiencing a huge expansion in the reach of surveillance and biometric technology. The data being captured, whether faces, fingerprints, walking style, voice or the shape of the human body, are uniquely personal and part of our individual identity. Such data can enhance public safety but can also raise critical ethical concerns around privacy, free expression, bias and discrimination. As the noble Lord, Lord Vaux, said, we need a careful balance of those issues between protection and privacy.
The noble Baroness, Lady Harding, quite rightly said that there is increasing public mistrust in the use of these techniques, and that is why there is an urgent need to take people on the journey. The example the noble Baroness gave was vivid. We need a robust legal framework to underpin the use of these techniques, whether it is by the police, the wider public sector or private institutions. As it stands, the changes in the Bill do not achieve that reassurance, and we have a lot of lessons to learn.
Rather than strengthening the current powers to respond to the huge growth and reach of surveillance techniques, the Bill essentially waters down the protections. Transferring the powers from the BSCC to the new Information Commissioner brings the issue down to data protection when the issues of intrusion and the misuse of biometrics and surveillance are much wider than that. Meanwhile, the impact of AI will herald a growth of new techniques such as facial emotional appraisal and video manipulation, leading to such things as deep fakes. All these techniques threaten to undermine our sense of self and our control of our own personal privacy.
The amendment in the name of the noble Lord, Lord Holmes, takes up the suggestion, also made by the Ada Lovelace Institute, to establish a biometrics office within the ICO, overseen by three experienced commissioners. The functions would provide general oversight of biometric techniques, keep a register of biometric users and set up a process for considering complaints. Importantly, it would require all entities processing biometric data to register with the ICO prior to any use.
We believe that these amendments are a really helpful contribution to the discussion. They would place the oversight of biometric techniques in a more effective setting where the full impacts of these techniques can be properly monitored, measured and reported on. We would need more details of the types of work to be undertaken by these commissioners, and of the cost implications, but, in principle, we support these amendments because they seem to be an answer to our concerns. We thank the noble Lord for tabling them and very much hope the Minister will give the proposals serious consideration.
I thank my noble friend Lord Holmes, the noble Baroness, Lady Jones, and the noble Lord, Lord Clement-Jones, as well as the other co-signatories, for their detailed examination of the Bill through these amendments.
I begin by addressing Amendments 197A, 197B and 197C tabled by my noble friend Lord Holmes, which seek to establish a biometrics office responsible for overseeing biometric data use, and place new obligations on organisations processing such data. The Information Commissioner already has responsibility for monitoring and enforcing the processing of biometric data, and these functions will continue to sit with the new information commission, once established. For example, in March 2023 it investigated the use of live facial recognition in a retail security setting by Facewatch. In February 2024, it took action against Serco Leisure in relation to its use of biometric data to monitor attendance of leisure centre employees.
Schedule 15 to this Bill will also enable the information commission to establish committees of external experts, with key skills and expertise in any number of specialist areas, including biometrics, and to require them to provide specialist advice to the commission. Given that the Information Commissioner already has responsibility for monitoring and enforcing the processing of biometric data, the Government are of the firm view that the information commission is best placed to continue to oversee it.
The processing of biometric data for the purpose of uniquely identifying an individual is also subject to heightened safeguards, and organisations can process such data only if they meet one of the conditions of Article 9 of UK GDPR—for example, where processing is necessary to comply with employment law provisions, or for reasons of substantial public interest. Without a lawful basis and compliance with relevant conditions, such processing of biometric data is prohibited.
Amendments 197B and 197C in the name of my noble friend Lord Holmes would also impose new, prescriptive requirements on organisations processing, or intending to process, biometric data, and would set unlimited fines for non-compliance. We consider that such amendments would have significant unintended consequences. There are many everyday uses of biometric data, such as using your thumbprint to access your phone. If every organisation that launched a new product had to comply with the proposed requirements, it would introduce significant and unnecessary new burdens and would discourage innovation, undermining the aims of this Bill. For these reasons, I respectfully ask my noble friend not to move these amendments.
The Government deem Amendment 238 unnecessary, as using biometric data—
I am sorry, but I am wondering whether the Minister is going to say any more on the amendment in the name of the noble Lord, Lord Holmes. Can I be clear? The Minister said that the ICO is the best place to oversee these issues, but the noble Lord’s amendment recognises that; it just says that there should be a dedicated biometrics unit with specialists, et cetera, underneath it. I am looking towards the noble Lord—yes, he is nodding in agreement. I do not know that the Minister dismissed that idea, but I think that this would be a good compromise in terms of assuaging our concerns on this issue.
I apologise if I have misunderstood. It sounds like it would be a unit within the ICO responsible for that matter. Let me take that away if I have misunderstood—I understood it to be a separate organisation altogether.
The Government deem Amendment 238 unnecessary, as using biometric data to categorise or make inferences about people, whether using algorithms or otherwise, is already subject to the general data protection principles and the high data protection standards of the UK’s data protection framework as personal data. In line with ICO guidance, where the processing of biometric data is intended to make an inference linked to one of the special categories of data—for example, race or ethnic origin—or the biometric data is processed with the intention of treating someone differently on the basis of inferred information linked to one of the special categories of data, organisations should treat this as special category data. These protections ensure that such data, which is not used for identification purposes, is sufficiently protected.
Similarly, Amendment 286 intends to widen the scope of the Forensic Information Databases Service—FINDS—strategy board beyond oversight of biometrics databases for the purpose of identification to include “classification” purposes as well. The FINDS strategy board currently provides oversight of the national DNA database and the national fingerprint database. The Bill puts oversight of the fingerprint database on the same statutory footing as that of the DNA database and provides the flexibility to add oversight of new biometric databases, where appropriate, to provide more consistent oversight in future. The delegated power could be used in the medium term to expand the scope of the board to include a national custody image database, but no decisions have yet been taken. Of course, this will be kept under review, and other biometric databases could be added to the board’s remit in future should these be created and should this be appropriate. For the reasons I have set out, I hope that the noble Baroness, Lady Jones of Whitchurch, will therefore agree not to move Amendments 238 and 286.
Responses to the data reform public consultation in 2021 supported the simplification of the complex oversight framework for police use of biometrics and surveillance cameras. Clauses 147 and 148 of the Bill reflect that by abolishing the Biometrics and Surveillance Camera Commissioner’s roles while transferring the commissioner’s casework functions to the Investigatory Powers Commissioner’s Office.
Noble Lords referred to the CRISP report, which was commissioned by Fraser Sampson—the previous commissioner—and directly contradicts the outcome of the public consultation on data reform in 2021, including on the simplification of the oversight of biometrics and surveillance cameras. The Government took account of all the responses, including from the former commissioner, in developing the policies set out in the DPDI Bill.
There will not be a gap in the oversight of surveillance as it will remain within the statutory regulatory remit of other organisations, such as the Information Commissioner’s Office, the Equality and Human Rights Commission, the Forensic Science Regulator and the Forensic Information Databases Service strategy board.
One of the crucial aspects has been the reporting of the Biometrics and Surveillance Camera Commissioner. Where is there going to be, and who is going to produce, a comprehensive report relating to the use of surveillance cameras and the biometric data contained within them? Why have the Government decided that they are going to separate out the oversight of biometrics from, in essence, the surveillance aspects? Are not the two inextricably bound together by things such as live facial recognition?
Yes. There are indeed a number of different elements of surveillance camera oversight; those are reflected in the range of different bodies carrying it out. As to the mechanics of the production of the report, I am afraid that I do not know the answer.
Does the Minister accept that the police are one of the key agencies that will be using surveillance cameras? He now seems to be saying, “No, it’s fine. We don’t have one single oversight body; we had four at the last count”. He probably has more to say on this subject but is that not highly confusing for the police when they have so many different bodies that they need to look at in terms of oversight? Is it any wonder that people think the Bill is watering down the oversight of surveillance camera use?
No. I was saying that there was extensive consultation, including with the police, and that that has resulted in these new arrangements. As to the actual mechanics of the production of an overall report, I am afraid that I do not know but I will find out and advise noble Lords.
His Majesty’s Inspectorate of Constabulary and Fire & Rescue Services also inspects, monitors and reports on the efficiency and effectiveness of the police, including their use of surveillance cameras. All of these bodies have statutory powers to take the necessary action when required. The ICO will continue to regulate all organisations’ use of these technologies, including being able to take action against those not complying with data protection law, and a wide range of other bodies will continue to operate in this space.
On the first point made by the noble Lord, Lord Vaux, where any of the privacy concerns he raises concern information that relates to an identified or identifiable living individual, I can assure him that this information is covered by the UK’s data protection regime. This also includes another issue raised by the noble Lord—where the ANPR captures a number-plate that can be linked to an identifiable living individual—as this would be the processing of personal data and thus governed by the UK’s data protection regime and regulated by the ICO.
For the reasons I have set out, I maintain that these clauses should stand part of the Bill. I therefore hope that the noble Lord, Lord Clement-Jones, will withdraw his stand part notices on Clauses 147 and 148.
Clause 149 does not affect the office of the Biometrics and Surveillance Camera Commissioner, which the noble Lord seeks to maintain through his amendment. The clause’s purpose is to update the name of the national DNA database board and update its scope to include the national fingerprint database within its remit. It will allow the board to produce codes of practice and introduce a new delegated power to add or remove biometric databases from its remit in future via the affirmative procedure. I therefore maintain that this clause should stand part of the Bill and hope that the noble Lord will withdraw his stand part notice.
Clauses 147 and 148 will improve consistency in the guidance and oversight of biometrics and surveillance cameras by simplifying the framework. This follows public consultation, makes the most of the available expertise, improves organisational resilience, and ends confusing and inefficient duplication. The Government feel that a review, as proposed, so quickly after the Bill is enacted is unnecessary. It is for these reasons that I cannot accept Amendment 292 in the name of the noble Lord, Lord Clement-Jones.
I turn now to the amendments tabled by the noble Lord, Lord Clement-Jones, which seek to remove Clauses 130 to 132. These clauses make changes to the Counter-Terrorism Act 2008, which provides the retention regime for biometric data held on national security grounds. The changes have been made only following a formal request from Counter Terrorism Policing to the Home Office. The exploitation of biometric material, including from international partners, is a valuable tool in maintaining the UK’s national security, particularly for ensuring that there is effective tripwire coverage at the UK border. For example, where a foreign national applies for a visa to enter the UK, or enters the UK via a small boat, their biometrics can be checked against Counter Terrorism Policing’s holdings and appropriate action to mitigate risk can be taken, if needed.
My Lords, to go back to some of the surveillance points, one of the issues is the speed at which technology is changing, with artificial intelligence and all the other things we are seeing. One of the roles of the commissioner has been to keep an eye on how technology is changing and to make recommendations as to what we do about the impacts of that. I cannot hear, in anything the noble Viscount is saying, how that role is replicated in what is being proposed. Can he enlighten me?
Yes, indeed. In many ways, this is advantageous. The Information Commissioner obviously has a focus on data privacy, whereas the various other organisations, particularly the BSCC, the EHRC and the FINDS Board, have subject-specific areas of expertise in which they will be better placed to horizon-scan and identify new emerging risks from technologies most relevant to their area.
Is the noble Viscount saying that splitting it all up into multiple different places is more effective than having a single dedicated office to consider these things? I must say, I find that very hard to understand.
I do not think we are moving from a simple position. We are moving from a very complex position to a less complex position.
Can the Minister reassure the Committee that, under the Government’s proposals, there will be sufficient reporting to Parliament, every year, from all the various bodies to which he has already referred, so that Parliament can have ample opportunity to review the operation of this legislation as the Bill stands at the moment?
Yes, indeed. The Information Commission will be accountable to Parliament. It is required to produce transparency and other reports annually. For the other groups, I am afraid that many of them are quite new to me, as this is normally a Home Office area, but I will establish what their accountability is specifically to Parliament, for the BSCC and the—
Will the Minister write to the Committee, having taken advice from his Home Office colleagues?
My Lords, I thank all noble Lords who participated in the excellent debate on this set of amendments. I also thank my noble friend the Minister for part of his response; he furiously agreed with at least a substantial part of my amendments, even though he may not have appreciated it at the time. I look forward to some fruitful and positive discussions on some of those elements between Committee and Report.
When a Bill passes into statute, a Minister and the Government may wish for a number of things in terms of how it is seen and described. One thing that I do not imagine is on the list is for it to be said that this statute generates significant gaps—those words were put perfectly by the noble Viscount, Lord Stansgate. That it generates significant gaps is certainly the current position. I hope that we have conversations between Committee and Report to address at least some of those gaps and restate some of the positions that exist, before the Bill passes. That would be positive for individuals, citizens and the whole of the country. For the moment, I beg leave to withdraw my amendment and look forward to those subsequent conversations.
(7 months, 2 weeks ago)
Grand Committee
My Lords, in moving Amendment 199, I will also speak to the other amendments in this group. In so doing, I declare an interest as the principal proprietor of the Good Schools Guide; we make a lot of use of cookies on our website.
I am completely in favour of what the Government are doing in this part of the Bill as an attempt to reduce cookie consent pollution. It is a tiresome system that we all go through at the moment. The fact that it is tiresome means that, most of the time, we just click on it automatically rather than going through to the details. In a way, it is self-defeating. What the Government are trying to do will very much improve the quality of people’s response to cookies and will make them more aware, in situations where they are asked for consent, that this is important.
However—this will be the request at the end of my speech—between Committee and Report, I would really like to sit down with any noble Lords who are interested and are representatives of the relevant industry to discuss how we should deal with cookies that relate to supporting advertisement delivery. A lot of the web relies on advertisements for the revenue to support itself. By and large, for a lot of sites that you are not asked to pay but from which you get a lot of value, that value is supported by advertising. As a website, if you are going to charge someone for delivering advertising, you have to be able to prove that the advertisement has been delivered and to tell them something about the person to whom you are delivering it. In this process, you are not interested in having individual information. What you want is collective information; you want to know that you have delivered 24,000 copies of this advertisement and know what the audience looks like. You absolutely do not want to end up with personal information.
Within that envelope—absolutely excluding the sorts of cookies that chase you around the internet saying, “Do you want a deckchair?”, just because you bought one two days ago—this is a vital part of the way the internet works at the moment. In Amendments 199 to 201, I suggest ways in which the clauses could be adapted to make sure that that use of cookies—as I say, it does not involve the sharing of personal information; it very much involves collective information—is allowed to continue uninterrupted.
My apologies to the noble Lord but his microphone does not seem to be working. I wonder whether he could speak more clearly.
It is on, but I do not think it is working. I do not know whether anybody else is having problems with it.
Okay. It does not quite reach me up here; I could sit down if that would be helpful.
No, carry on.
I will try to line up with it better. Amendments 202 to 205 flag concerns with proposed new Regulation 6B, which sets out to remove cookie banners automatically when the technology is available. The concerns very much relate to that last phrase: “when the technology is available”. How will this work? How is it to be managed? There is only a thin layer of controls on the Government in the way that they will use these new powers; it is also unclear how this will affect consumers and advertisers. There could be some far-reaching effects here. We just do not know.
I am looking for, and hope the Government will agree to, wide consultation because, on something such as this, it is never true that everybody knows everything. You want to put the consultation out to a lot of different people with a lot of different experiences of how to use the net to make sure that what you are doing will have the sort of effects that you want. I want to see proper, thoroughgoing impact assessments, including of the impact on competition and on the economic health of participants in the net. I would like to see a real analysis of the readiness of the technology, not just an assumption that, because somebody likes it, it will work, but a real, critical look at whether the technology is actually up to what it is hoped it will do, and proper testing, so that, in giving the Government the carte blanche that they have asked for with these clauses, we do not end up letting ourselves in for a disaster.
As I said, most of all, I am looking for a meeting between now and Report, so that I can go through these things in detail, and we can really understand the Government’s position on these matters and, if necessary, discuss them further on Report. I beg to move.
My Lords, I will speak briefly in support of the amendments in the name of the noble Lord, Lord Lucas, to which I am pleased to have added my name. I apologise for not being able to speak at Second Reading, but I understand from other Members of the Grand Committee that an occasional guest appearance and a different voice are welcome.
I declare an interest, as set out in the register, as a director of RSMB Ltd, a company specialising in the methodology of audience measurement, cross-media measurement and data integration. More fully, I am nominated and remunerated by the advertising group Havas, which owns the company jointly with Kantar Media.
As the noble Lord, Lord Lucas, so clearly set out in his introduction, these simple and uncontroversial amendments would bring greater clarity and certainty to the key measurement of users, readers and audiences of digital websites and platforms. By including the measurement of aggregate audiences online in the list of cookies that would not require specific consent, these amendments would protect and enhance the interests of both consumers and businesses: consumers because, as the noble Lord, Lord Lucas, said, the maintenance of advertising revenue funds websites that provide news, entertainment and a wealth of other services, which would otherwise cost those consumers much more in subscriptions; and businesses because, through the quality of anonymised, aggregated data, they can build better offers to consumers and advertisers, as well as increase their financial resilience.
The Minister brings profound knowledge and understanding of this field, so he well knows how important the digital advertising market is and how innovative and respected UK companies are in the global industry. That applies not only to the websites, platforms and advertisers but to the research, quality audit and measurement companies specialising in this area. These amendments would support this growing and productive high-tech data, research and measurement sector in reinforcing its world-leading position.
As in so many industries and sectors of the economy, long-term stability is vital to rapidly evolving digital markets. Including these amendments in the Bill, rather than relying on secondary legislation and regulation to flesh out details in the future, will enhance that stability.
Likewise, the amendments relating to the implementation of centralised opt-out controls are also intended to promote that long-term stability, as well as bringing enhanced transparency and scrutiny. The interests of consumers and businesses are not in conflict with each other in relation to audience measurement and data quality; they are constructively interactive.
My Lords, Clause 109 makes changes to the regulations relating to the use of cookies, which, on the face of it, clarify and expand the PEC regulations. Some of the amendments seem benign enough, adding useful flexibility and much-needed clarity; others give the Secretary of State pretty wide-ranging powers.
Taking a look at new Regulation 6A(1)(a), for example, a future Secretary of State will be able to add new exceptions to the cookie consent requirements. The regulation will also enable variations and omissions. All the Secretary of State would need to do is “consult” the commissioner and such other persons as the Secretary of State considers appropriate—so they will be left with some fairly wide powers and opportunities.
Before turning to Amendment 202 in the name of my noble friend Lady Jones of Whitchurch, I want to quickly respond to Amendments 199, 200 and 201, from the noble Lords, Lord Lucas and Lord Clement-Jones, and very ably supported by my noble friend Lord Chandos. These seek to introduce an additional exemption for cookies used for the purposes of non-intrusive audience measurement and ad performance, both of which are obviously very important to publishers, who need to understand how their websites are used and ensure that advertising is delivering revenue. It is famously hard to predict how successful advertising is; you are never quite sure whether the adverts are hitting home, but this sort of data is critical to that activity.
As noted by others, the Bill currently contains an exemption for cookies used solely for statistical purposes. It may be that the Minister is able to provide comfort to the publishing sector that audience measurement and ad performance are both areas that fall within this new exemption. If he cannot do that today, I hope he will be able to come back to interested colleagues in writing or, as the noble Lord, Lord Lucas, suggested, hold further discussions on this ahead of Report.
We had a number of significant debates during the passage of the Digital Markets, Competition and Consumers Bill regarding the fragility of the publishing sector. Newspapers, sectoral magazines and other sources fulfil a valuable role and we should seek to nurture that as far as is practical.
Amendment 202 in the name of my noble friend is another means of trying to support publishers by probing the potential consequences of the Government’s proposals around centralised cookie controls. Some users may happily accept cookies from the websites of trusted organisations, such as news sources that they use regularly, but generally decline cookies from other websites due to privacy concerns. I would like to know from the Minister how this nuance would be reflected if automatic preferencing is rolled out.
Organisations have also raised competition concerns. The number of mainstream internet browsers is incredibly small and they are operated by firms likely to be designated as having strategic market status under the digital markets Bill. If this legislation establishes a system that makes these browsers some kind of cookie gatekeeper, does that not risk amplifying existing competition barriers in digital markets, rather than bringing them down?
Our amendment would remove provisions around automatic cookie consent. Amendment 203 in the name of the noble Lord, Lord Lucas, proposes a different option, providing a straightforward means for users to override their general preference when using specific websites. That is an interesting alternative, and we need to listen carefully to the Minister’s reply because it gets to the heart of the issue. The noble Lord’s Amendment 204 would also be important, ensuring broader consultation before statutory instruments were brought forward under new Regulation 6B.
My Lords, I do not know how unusual this is, but we are on the same page across both sides of the Committee.
First, having signed the amendments by the noble Lord, Lord Lucas, I express my support for the first batch, Amendments 199 to 201, which are strongly supported by the Advertising Association and the Interactive Advertising Bureau for obvious reasons. The noble Lords, Lord Lucas and Lord Bassam, and the noble Viscount, Lord Chandos, have expressed why they are fundamental to advertising on the internet. Audience measurement is an important function, for media owners in particular, to determine the consumption of content and to price advertising space for advertisers.
I understand that the department, DSIT, has conceded that most of the use cases for audience measurement fit within the term “statistical purposes”. It is this area of performance that is so important. As the noble Lord, Lord Bassam, seemed to indicate, we may be within touching distance of agreement on that, but the Minister needs to be explicit about it so that the industry understands what the intent behind that clause really is. As a number of noble Lords have said, this is a specific and targeted exemption for audience measurement and performance cookies that limits the consent exemption to those purposes and, as such, should definitely be supported. I very much hope that, if the Minister cannot give the necessary assurance now, then, as a number of noble Lords have said, he will engage in further discussions.
Amendments 203, which I have signed, and 205 are extremely important too. Amendment 203, picked up clearly by the noble Lord, Lord Bassam, is potentially important; it could save an awful lot of aggravation for users on the internet. It is potentially game-changing given that, when we approach the same site—even Google—we have to keep clicking through the cookie banner. I very much hope the Minister will see the sense in that because, if we are changing the PEC regulations, we need to do something sensible and useful like that. It might even give the Bill a good name.
As all noble Lords have rightly said, the Secretary of State needs to think about the implementation of the regulations and what they will affect. Amendment 202 is fundamental and badly needed. You need only look at the list of those who are absolutely concerned about the centralisation of cookies: the Internet Advertising Bureau, the Advertising Association, the Data & Marketing Association, the Market Research Society, the News Media Association, the Incorporated Society of British Advertisers, the Association of Online Publishers and the Professional Publishers Association. I hope that the Government are in listening mode and will listen to their concerns.
As the PPA says, centralising cookie consent with browsers could cause consumers far more harm than good. The Secretary of State’s powers would override cookie consent relationships between individuals and specialist publishers, which the noble Lord, Lord Bassam, talked about in particular. As the PPA says, in all likelihood a significant number of internet users would not consent to cookies from the browser but would consent to cookies on the websites of publishers that they know and trust. If the Secretary of State were to use this power to enforce cookie centralisation, many publishing businesses would be forced to present consumers with paywalls in order to be financially sustainable. As the PPA says, this would lead to consumers missing the opportunity to access high-quality publishing content without having to pay a fee.
The PPA has made an extremely good case. This would amplify existing barriers to competition in the digital market. There are provisions in the DMCC Bill that would give powers to the CMA to address any problems, such as enforced data sharing from platforms to publishers, but centralising cookie consent would completely undermine the objectives of that legislation. It is clear that this Bill should be amended to withdraw the provisions giving the Secretary of State the power to introduce these centralised cookie controls. I very much hope that the Minister will have second thoughts, given the weight of opinion and the impact that the Secretary of State’s powers would have.
My Lords, if the Committee will indulge me, I was a little late arriving for the introduction to this group of amendments by my noble friend Lord Lucas, but I heard most of what he said and I will speak briefly. I am quite sympathetic to the arguments about the exemption being too tightly drawn and the advantage that this is likely to give the likes of Google and Meta in the advertising ecosystem. As the noble Lord, Lord Clement-Jones, said, a range of different trade bodies have raised concerns about this, certainly with me.
From my perspective, the other point of interest that I want to flag is that the Communications and Digital Committee is currently doing an inquiry into the future of news. As part of the evidence that we have taken in that inquiry, one of our witnesses from the news industry raised their concerns about a lack of joined-up thinking, as they described it, within government when it comes to various different bits of legislation in which there are measures that are inadvertently detrimental to the news or publishing industry because there has been no proper understanding or recognition of how the digital news environment is now so interconnected. Something like this, on cookies, could have quite a profound effect on the news and publishing industry, which we know is reliant on advertising and is increasingly feeling the pinch because the value that it gets from digital advertising is being squeezed all the time. I just wanted to reinforce the point, for the benefit of my noble friend the Minister, that concern about this is widespread and real.
My Lords, it is a pleasure to make my first foray at the Dispatch Box on this Bill in what has been an interesting Committee stage thus far. I thank my noble friend Lord Lucas and the noble Baroness, Lady Jones of Whitchurch, for tabling these amendments and other noble Lords who have signed and spoken to them in support.
Many people are irritated by repetitive pop-ups that appear on websites seeking consent for cookies and other similar technologies. The current cookie rules apply to all organisations placing cookies on a person’s device. Rather than engaging with these banners, people will select “accept all” so that they can access the webpage as quickly as possible. We want users to be able to make more meaningful choices over their privacy. One way in which web users may be able to reduce the number of consent pop-up banners that they see is by using automated consent management technology.
New Regulation 6B, which Amendment 202 seeks to remove, is important as it will allow the Secretary of State to require relevant technologies to meet certain standards or specifications, thereby ensuring that individuals using this technology have effective control over their privacy when they are online. Amendment 203 seeks to amend Regulation 6B by making it clear that consents given on individual websites should override any prior choices made using automated technology. However, this could pre-empt the outcome of consultation with relevant sectors, civil society and regulators on the design of any new regulations. I fear that this amendment could have the effect of encouraging the continued use of consent banners, may not reduce the overall number of pop-up banners and could increase the risk of influencing consumers to give up more personal data than they intended.
We feel that Amendments 204 and 205 are unnecessary and duplicate existing requirements and standard practice. There is already a requirement in new Regulation 6B to consult. We have engaged extensively with stakeholders on this Bill and will continue to do so in the context of using any of the new regulation-making powers linked to these clauses. Our engagement so far has highlighted the complexity of the ecosystem and the range of impacts on different interest groups. We will continue to consider these impacts carefully when considering whether to use the new regulation-making powers. Impact assessments are generally required for all interventions of a regulatory nature that affect the private sector, civil society organisations and public services.
The Government have taken powers in the Bill to remove consent requirements for other purposes if the evidence supports it while recognising that this is a complex and technical market. The Government will therefore continue to engage fully with all players before introducing any new exemptions or deciding to set standards for the market.
The new power in Regulation 6B recognises that there is a range of different stakeholder interests that would need to be considered before making regulations. The Secretary of State must consult the Information Commissioner, the Competition and Markets Authority and any other person the Secretary of State considers appropriate. While browser-based or centralised consent options have been discussed as a possible solution, nothing in the Bill mandates them. The regulation-making power, which follows the affirmative resolution procedure, would allow the Secretary of State to set standards of design that will be key to ensuring that the regulations can move with technology.
Amendments 199 and 200 would permit the storage of information, or access to information stored, on a person’s connected device, including the internet of things, to enable the organisation to generate audience measurement information. This proposed new exemption does not explain what data would need to be gathered to meet the objective of the amendment and is potentially broad in its application. For example, if it permitted activities such as tracking and profiling, it may not be appropriate to permit it without the consent of web users.
I am interested in the Minister’s point about the flexibility the Government see in this clause, but I am not sure who in the end has the responsibility to lead on that flexibility. Will it come from the commissioner or be driven by the Secretary of State’s considerations? The consultation duties seem very dependent on the commissioner’s view and I am not sure at what stage the Secretary of State would want to intervene to ensure that they have got this bit right. That is very important, because the balance is quite sophisticated.
The Minister used the expression “when the evidence emerges”, as did the noble Viscount, Lord Camrose, in another context last week. I would have thought that these organisations know what they are about, and they have provided some pretty comprehensive evidence about the impact on their businesses. Is that not a pretty good reason for the Government to think that they might not have this set of provisions entirely right, quite apart from the other aspects of this group of amendments? If that evidence is not enough—I read out the list of organisations—the Government are more or less saying that they will not accept any evidence.
I thank both noble Lords for their interventions. On the point from the noble Lord, Lord Bassam, there is a trifecta of decision-making between the Secretary of State, the ICO and the organisations all working together. That is why there is a consultation requirement before using the power. On the point from the noble Lord, Lord Clement-Jones, it is a question of your point of view; we feel that we have done stakeholder engagement and believe that we have got the balance right between the needs of organisations—
Will the Minister write and unpack exactly what the balance of opinion was? We are talking about pretty crucial stuff here. It is not always a question just of numbers; it is quite often a question of weighting the arguments. The Minister should write to us and tell us how they came to that conclusion, because the case was clearly being made during the consultation, but the Government have effectively ignored it.
In this tripartite geography that the noble Lord described, the power—
I am not a gambling man. It is an interesting term. The Minister is suggesting that power rests equally among those three elements but it does not. The Secretary of State is the all-powerful being and the commissioner is there to ensure that regulation works effectively. How will this operate in practice? There is no advisory body here; it is the Secretary of State having a discussion with the commissioner and then, on the balance of some of the consultation information that comes in, making a decision. That will not enable the sector, the market and those providers to be engaged.
I thank noble Lords for those further points requesting clarification. On how we have come to this decision, I am happy to write to all noble Lords in the Committee. The noble Lord went in an interesting direction because, in the context of the rest of the Bill, so many of the amendments have been about protecting private users, but the noble Lord seems to be swaying more in favour of the advertisers here.
My Lords, it is all about the relative importance and the weighting. Maybe that is a good illustration of where the Government are not getting their weighting correct, both at the beginning of the Bill and in this part.
I take the noble Lord’s point. We are working with industry and will continue to do so. For the benefit of the Committee, we are, as I said, happy to write and explain the points of view, including those from Data: A New Direction. In response to the noble Lord, Lord Bassam, power ultimately lies with Parliament via the affirmative resolution procedure for the Secretary of State power.
I will go back to the amendments we were discussing. This regulation applies to complex and technical markets. The very reason we have taken a delegated power is so that the new exemptions can be carefully created in consultation with all affected stakeholders. As I explained, the Bill includes a requirement to consult the Information Commissioner, the Competition and Markets Authority and any other relevant stakeholders, which would include trade associations and consumers or web users.
Amendment 201 would widen the application of the “strictly necessary” exemption. Currently, it applies only to those purposes essential to provide the service requested by the user. Amendment 201 would extend this exemption so that it applies to the purposes considered essential to the website owner. We do not think this would be desirable, as it would reduce a user’s control over their privacy in a way that they might not expect.
For the reasons I have set out—and once again reaffirming the commitment to write to noble Lords on how the weighting was worked out—I hope my noble friend and the noble Baroness will not press their amendments.
My Lords, my noble friend makes a good point. I can promise all Members that there will be thematic meetings between Committee and Report.
My Lords, I am grateful for that assurance from my noble friend.
On the first amendments, clearly, we are dealing with something that is quite tricky and technical. My noble friend sees these amendments in a different light to me. It is possible that my drafting may be imperfect; that has never happened before, of course, but there is always a first time. Therefore, I seek an opportunity to look at this issue in detail. It is absolutely not my objective to do the things to which my noble friend objects; in that respect, his objections are valid. My amendment is not intended in any way to allow tracking or profiling. If I am wording things imperfectly or imagining something that just cannot be achieved in practice, the best way to deal with these matters would be to hammer them out in a technical discussion, not in Committee. I would happily look for an opportunity to do that between Committee and Report.
When it comes to new Regulation 6B and its ramifications, as the debate has gone on, I have found myself favouring more and more the amendment in the name of the noble Baroness, Lady Jones of Whitchurch. This is an uncontrolled bit of power that we are looking to give the Government, with some serious implications. It should not be done. We should wait until the technology is available and then do something when we can really take our time to look at the options. Again, this is something that we will have a chance to talk through.
It is really important that, in doing what seems to be convenient—as my noble friend put it, it is about getting rid of an irritation and making the whole process of giving permission much more effective; I am absolutely with him on that—we make sure that we are not letting ourselves in for some greater dangers. I personally want to make sure of that. The oldies among us—most of us, I suspect—will remember when Google said, “Don’t be evil”. I wish that it had kept to that.
For now, I beg leave to withdraw my amendment.
My Lords, in moving this amendment, I will also speak to the other amendments in this group in the name of my noble friend Lady Jones of Whitchurch: Amendments 209 to 211 and 215.
It is estimated that a staggering 134 million personal injury compensation calls and texts have been made and sent in the UK in the past 12 months. YouGov research shows that more than 20 million people were contacted by companies touting for business through injury compensation claims. Personally, I have had more than my fair share, so I suppose I must declare an interest in this issue.
However, unsolicited calls are more than just a modern-day nuisance. If people have suffered an accident, they can be reminded of the trauma. People’s hopes of compensation can be raised cynically and unrealistically in order to encourage them to share personal financial information that can then be used to scam them out of their money. Research shows strong emotional responses to these calls. People are left feeling angry, anxious, disgusted and upset. That is hardly a surprise when they are being pestered in their own homes or on their own phones.
My Lords, I support Amendment 208A. I declare my interest as a solicitor, but not one who has been directly involved with personal injury claims. This is an area of particular specialism that requires particular expertise and experience if it is to be carried out to the best advantage of those who seek that help.
Looking back, I am concerned that this matter has been raised, in different fora, on a number of occasions. For instance, in 2016, the Telephone Preference Service opt-out was discussed when responsibility for it was moved from Ofcom to the ICO. At that point, there was a great opportunity for this matter to be dealt with. Indeed, a number of organisations, including personal injury lawyers, the Motor Accident Solicitors Society and others, said that it was vital to carry this out and that cold calling should be ended because of the pressures it placed on an awful lot of very vulnerable people.
Since 2016, things have got worse in one respect—although, perhaps, they are a little less bad in respect of telephone calling. It is a little while now since I was last told that I had just had a major accident in my car as I was sitting enjoying a glass of wine and not having such worries in my mind. Telephone cold calling seems to have diminished but pressures through social media contact, various scams and so on have increased dramatically. I have been told this by a number of my legal colleagues.
In 2023, the Government produced the UK’s Fraud Strategy. As I am sure noble Lords will know, when it was published, it specifically pursued the question of extending the ban on cold calling to personal injury cases; that was very important and included all services. So, unless there is some relationship already in place—something where that is a defence, as it were, here—and a voluntary willingness on the part of those who suffer from personal injuries to be contacted by an organisation with which they already have a relationship, this is something that we should pursue very strongly indeed.
Although it is correct that the legal profession, and perhaps other professions, are banned from this practice on a regulatory or disciplinary basis, some of my colleagues in the profession are, in some cases, susceptible to financial and commercial challenges from these organisations, such that they can become, sometimes almost inadvertently, part of the process. Therefore, I hope that, in passing such an amendment, we would send a clear signal to the Solicitors Regulation Authority and the Law Society, underlining yet again that these practices are not acceptable for members of the profession.
My Lords, I support Amendment 208A. I am a recovering solicitor. Many moons ago, I gave public affairs advice to the Association of Personal Injury Lawyers, which is a fine organisation. I very much support its call and this amendment on that basis. I congratulate the noble Lord, Lord Leong, on his introduction to this amendment; he and the noble Lord, Lord Kirkhope, made a terrific case.
APIL took the trouble to commission research from YouGov, which showed that 38% of UK adults had received a cold call or text, and that 86% of those had a strong emotional response and were left feeling annoyed, angry, anxious, disgusted or upset. The YouGov research also reveals that almost all those who received a call supported a total ban on personal injury cold calls and text messages.
There is little for me to add but I am sorry that the noble Baroness, Lady Buscombe, is not with us—she has just exited the Room, which is unhappy timing because, in looking back at some of the discussions we have had in the House, I was about to quote her. During Report stage in the Lords on the Financial Guidance and Claims Bill, when she was a Minister, she told us:
“We know that cold calls continue and understand that more needs to be done truly to eradicate this problem. We have already committed to ban cold calls relating to pensions, and are minded to bring forward similar action in relation to the claims management industry. I have asked officials to consider the evidence for implementing a cold-calling ban in relation to claims management activities, and I am pleased to say that the Government are working through the detail of a ban on cold calling by claims management companies. There are complex issues to work through, including those relating, for example, to EU directives”;
of course, we do not have those any more. She went on to say:
“We would therefore like time to consider this important issue properly, and propose bringing forward a government amendment in the other place to meet the concerns of this House”.—[Official Report, 24/10/17; col. 861.]
How much time do the Government need? Talk about unfinished business. I know it is slightly unfair, as you can unearth almost anything in Hansard, but the fact is that this is a bull’s eye. It is absolutely spot on on the part of APIL to have found this. I thought for one delirious minute that the noble Baroness, Lady Buscombe, was going to stand up and say, “Yes, I plead guilty. We never pursued this”.
I have texted the noble Baroness asking her to return as soon as possible so that she can listen to the noble Lord’s wise words.
I am not going to carry on much longer. I know that that will be a grave disappointment but it makes the case, I think, that it is high time that the Government did something in this area. It is clearly hugely unpopular. We need to make sure that Amendment 208A is passed. If not now, when?
My Lords, I thank the noble Baroness, Lady Jones of Whitchurch, for tabling Amendment 208A and the noble Lord, Lord Leong, for moving it. This amendment would insert new Regulation 22A into the privacy and electronic communications regulations and would prohibit unsolicited approaches via email or text encouraging people to commence personal injury claims, sent by or on behalf of claims management companies.
The Government agree that people should not receive unsolicited emails and texts from claims management companies encouraging them to make personal injury claims. I assure noble Lords that this is already unlawful under the existing regulations. Regulation 22(2) prohibits the sending of all unsolicited electronic direct marketing communications—including, but not limited to, texts and emails—unless the recipient has previously consented to receiving the communication. Regulation 21A already bans live calling by claims management companies.
In the past year, the Information Commissioner has issued fines of more than £1.1 million to companies that have not adhered to the direct marketing rules. Clause 117 considerably increases the financial penalties that can be imposed for breaches of the rules, providing a further deterrent to rogue claims management and direct marketing organisations.
Amendments 211 and 215 relate to Clause 116 so I will address them together. Amendment 211 seeks to confirm that a provider of a public electronic communications service or network is not required to intercept or examine the content of any communication in order to comply with the new duty introduced by Clause 116. I assure the noble Baroness and the noble Lord that the duty is a duty to share information only. It merely requires providers to share any information that they already hold or gather through routine business activities and which may indicate suspicious unlawful direct marketing on their networks; it does not empower, authorise or compel a communications provider to intercept messages or listen to phone calls.
Should a communications provider become aware of information through its routine business activities that indicates that unlawful direct marketing activity may be taking place on its service or network, this duty simply requires it to share that information with the Information Commissioner. For example, a communications provider may receive complaints from its subscribers who have received numerous unsolicited direct marketing communications from a specific organisation. We know from the public consultation that people want action taken against nuisance calls and spam, and this duty will support that.
My Lords, I thank all noble Lords who have spoken, especially the noble Lords, Lord Kirkhope and Lord Clement-Jones, who have kindly supported this amendment.
I shall just make two points. The first is that “unlawful” is just not good enough. People are still carrying on making these cold calls. Sometimes we have to listen to the experts. The Law Society says that solicitors are banned from making cold calls, and the Association of Personal Injury Lawyers is asking for a ban. Sometimes, as politicians, we need to listen to people who perhaps know more than we do. If they are asking for it, it is basically because they need this clarified. I hope that the Minister will look at this again.
As for Amendments 211 and 215, perhaps the Minister could share with me the detail of the various points just made about the sharing of data with various other stakeholders. If he could write to us or share it with us, that would satisfy our position.
On that basis, I beg leave to withdraw the amendment.
My Lords, in moving Amendment 209, I will also speak to Amendment 210, and I thank the noble Lord, Lord Clement-Jones, for adding his support.
These amendments return to the major debate that we had on day 2 in Committee regarding direct marketing for the purposes of democratic engagement. It is fair to say that no-one was convinced by the Minister’s arguments about why that relaxation of the rules for political parties was necessary. We will no doubt return to that issue on Report, so I shall not repeat the arguments here. Meanwhile, Clause 113 leads into the democratic engagement provisions in the Bill and provides a soft opt-in for the use of electronic mail for direct marketing for charitable, political or other non-commercial activities when the data has been collected for other purposes.
As we made clear in the previous debate, we have not asked for these more relaxed rules about political electronic marketing. We believe that these provisions take us fundamentally in the wrong direction, acting against the interests of the electorate and risking damaging the already fragile level of trust between politicians and voters. However, we support extending the soft opt-in for charities and other non-commercial organisations. This is a measure that many charities have supported.
Of course, we want to encourage campaigning by charitable organisations to raise awareness of the critical issues of the day and encourage healthy debate, so extending their opportunities to use electronic marketing for this purpose could produce a healthy boost for civic engagement. This is what our amendments are hoping to achieve.
Therefore, our Amendments 209 and 210 would amend the wording of Clause 113 to remove the relaxation of the rules specifically for political parties and close the loophole by which some political parties may try to negate the provisions by describing themselves as non-commercial entities. We believe that this is the right way forward. Ideally, these amendments would be combined with the removal of the democratic engagement provisions in Clause 114 that we have already debated.
I hope noble Lords will see the sense of these proposals and that the Minister will agree to take these amendments away and rethink the whole proposition of Clauses 113 and 114. I beg to move.
My Lords, tracking the provenance of Clause 113 has been a very interesting exercise. If we think that Clause 114 is pretty politically motivated, Clause 113 is likewise. The rule that political parties cannot avail themselves of the soft opt-in provision has been in place since 2005. The Information Commissioner issued guidance on political campaigning, and it was brought within the rules. Subsequently, a ruling in a tribunal case confirmed that position: the SNP was issued with an enforcement notice, and the information tribunal dismissed its appeal.
The Conservative Party was fined in 2021 for sending emails to people who did not ask for them. Then, lo and behold, there was a Conservative Party submission to the House of Lords Democracy and Digital Technologies Committee in 2020, and that submission has been repeated on a number of occasions. I have been trying to track how many times the submission has been made by the Conservative Party. The submission makes it quite clear that there is frustration in the Conservative Party. I have the written evidence here. It says:
“We have a number of concerns about the Information Commissioner’s draft code”—
as it then was: it is now a full code—
“on the use of data for political campaigning. In the interests of transparency, I enclose a copy of the response that the Conservative Party sent to the consultation. I … particularly flag the potential chilling effect on long-standing practices of MPs and councillors from engaging with their local constituents”.
Now, exactly as the noble Baroness has said, I do not think there is any call from other political parties to change the rules. I have not seen any submissions from any other political party, so I would very much like to know why the Government have decided to favour the Conservative Party in these circumstances by changing the rules. It seems rather peculiar.
The guidance for personal data in political campaigning, which I read while preparing for this debate, seems to be admirably clear. It is quite long, but it is admirably clear, and I congratulate the ICO on tiptoeing through the tulips rather successfully. However, the fact is that we have very clear guidance and a very clear situation, and I entirely agree with the noble Baroness that we are wholly in favour of charities being able to avail themselves of the new provisions, but allowing political parties to do so is a bridge too far and, on that basis, I very much support the amendment.
My Lords, I thank the noble Baroness, Lady Jones, for Amendments 209 and 210, which would amend Clause 113 by removing electronic communications sent by political parties from the scope of the soft opt-in direct marketing rule. A similar rule to this already exists for commercial organisations so that they can message customers who have previously purchased goods or services about similar products without their express consent. However, the rule does not apply if a customer has opted out of receiving direct marketing material.
The Government consider that similar rules should apply to non-commercial organisations. Clause 113 therefore allows political parties, charities and other non-commercial organisations that have collected contact details from people who have expressed an interest in their objectives to send them direct marketing material without their express consent. If people do not want to receive political messaging, we have included several privacy safeguards around the soft opt-in measure that allow people to easily opt out of receiving further communications.
Support for a political party’s objectives could be demonstrated, for example, through a person’s attendance at a party conference or other event, or via a donation made to the party. In these circumstances, it seems perfectly reasonable for the party to reach out to that person again with direct marketing material, provided that the individual has not objected to receiving it. I reassure the Committee that no partisan advantage is intended via these measures.
My Lords, perhaps the Minister could elucidate exactly what is meant by “supporting the party’s objectives”. For instance, if we had a high street petition, would that be sufficient to grab a signatory’s email address and start communicating with them?
I suppose it would depend on the petition and who was raising it. If it were a petition raised or an activity supported by a particular party, that would indicate grounds for a soft opt-in, but of course anyone choosing not to receive these things could opt out either at the time or later, on receipt of the first item of material.
So what the Minister is saying is that the solicitor, if you like, who is asking you to sign this petition does not have to say, “Do you mind if I use your email address or if we communicate with you in future?” The person who is signing has to say, “By the way, I may support this local campaign or petition, but you’re not going to send me any emails”. People need to beware, do they not?
Indeed. Many such petitions are of course initiated by charitable organisations or other not-for-profits and they would equally benefit from the soft opt-in rule, but anyone under any of those circumstances who wished not to receive those communications could opt out either at the time or on receipt of the first communication on becoming aware that they were due to receive these. For those reasons, I hope that the noble Baroness will not press her amendments in relation to these provisions.
My Lords, I thank the noble Lord, Lord Clement-Jones, for digging and delving into the background of all this. That is helpful because, all the way through our previous debate, we kept saying, “We don’t understand why these provisions are here”. When the Minister in the Commons was challenged, he said, “We have no intention of using this; it’s just a general power that might be there for anyone to use”, but the noble Lord has given the lie to all that. It is clear that only one party wants to pursue this issue: the Conservative Party.
The Minister said that there is no partisan objective or reason for this but, to be honest, I do not know how he can say that. If only one party wants it and no one else does, then only one party is going to implement it. Without going over the whole of the previous debate, I think a lot of people felt that we as political parties have a lot to do to improve our relationships with the electorate and be seen to represent them on an honest and authentic basis.
This goes in the opposite direction. It is almost collecting data for one purpose and using it for a different one. The noble Lord, Lord Clement-Jones, and the Minister discussed the example of collecting information on a street stall; we have all done that a bit, in that you can put very generalised questions on a questionnaire which could then be used for all sorts of purposes.
My Lords, I also submit that Schedule 11 should not stand part of the Bill. I note the amendments from the noble Baroness, Lady Sherlock, which seek to temper the impact of these powers, but they do not go far enough. To have these clauses in a Bill labelled “data protection” contradicts its very title. I thank the noble Baroness, Lady Chakrabarti, and the noble Lords, Lord Clement-Jones and Lord Kamall, for their support. The noble Lord, Lord Anderson, is detained elsewhere but he asked that I raise a number of his concerns. I am grateful for his experience, as I am for the legal opinion provided by Dan Squires KC and Aidan Wills of Matrix Chambers.
The provisions create new powers for the DWP to obtain information about the bank accounts of people who receive benefit payments by requiring financial institutions to monitor customers’ accounts, to identify cases that merit further consideration and to establish whether the relevant benefits are being, or have been, paid in accordance with the law. Paragraph 2(1) of proposed new Schedule 3B makes it clear that the information that can be requested is very wide indeed, although it is not specified.
Schedule 11 also sets out provisions that would allow the DWP to issue account information notices; those AINs would apply to any account into which benefits will be, are being or have been paid within the past year, as well as to any account linked to such an account. The account holder may be a person who is entitled to the benefit or a person who receives the payment on their behalf, such as a parent, partner or carer. It may also include a joint account holder or, where housing benefit is paid direct, a landlord and all their related accounts.
All benefits, both those that are means tested and those that are not—child tax credit, the state pension, personal independence payments, the disability living allowance, working tax credit, universal credit and the employment and support allowance—are in scope. Counsel’s advice is that it is
“reasonable to assume that AINs will be issued on a rolling basis to most financial institutions which provide banking services and, in order to comply, financial institutions would need to subject most, if not all, of their account holders to algorithmic surveillance”.
Counsel also found that an AIN being issued to a particular financial institution would almost certainly be secret, to avoid tipping off account holders, and that the criteria triggering a search would also be kept confidential.
The Social Security Administration Act 1992 already contains powers for the Secretary of State to compel banks and others to provide information in order to ascertain whether a benefit is being paid correctly, as well as to prevent, detect and secure evidence of benefit fraud—that is to say, the DWP already has these powers if it has reasonable grounds to suspect that fraud is taking place. What is proposed is that the DWP no longer has to have a suspicion of wrongdoing but can survey vast swathes of the UK population without their knowledge in order proactively to surface cases that may or may not merit further consideration.
The legal opinion is also pretty damning on the question of whether the powers contravene Article 8, given the possibility of extremely private information—such as political allegiance and sexuality—being accessed, and it is equally damning on both the practicalities and the lack of oversight. If the noble Lord, Lord Anderson, had been with us, he would have made the following points. First, this is a power to collect highly sensitive personal information in bulk. Such powers exist under the Investigatory Powers Act but are attended by an array of statutory safeguards, ranging from authorisation of the original warrant, which must be approved by an independent judicial commissioner, and checks on the level of material requested to other matters such as record keeping, retention, dissemination and destruction, error reporting and a right to apply to the Investigatory Powers Tribunal. Few, if any, of these safeguards exist in the Schedule 11 power.
Secondly, the full extent and significance of the power will be apparent only once there is a code of practice. However, there is no draft code of practice and no commitment to produce one; there is merely a discretion. This is in sharp contrast to the Investigatory Powers Act, where key excerpts were made available in advance of Committee in both Houses. The impact of Schedule 11 on privacy is arguably much greater, yet we have seen no draft code of practice—indeed, we cannot be sure that a code of practice will be issued at all.
Finally, Schedule 11 contrasts with HMRC’s much more limited power to access information and documents for the purpose of checking a taxpayer’s tax position or collecting a tax debt. Under paragraph 4A of Schedule 36 to the Finance Act 2008, HMRC has been able to authorise a financial institution notice in respect of an individual, but not on a bulk basis. An FIN, unlike an AIN, must name the taxpayer to whom it relates. The most recent corporate report records that only 647 FINs were issued in the year to March 2023—an insignificant number in relation to the proposals in front of us. I hope that, when he responds, the Minister will be able to explain why the power to investigate tax fraud is so carefully and narrowly constructed, whereas the DWP measures, which will impact many more millions of people, a significant proportion of whom do not even receive benefits, are so broad.
On the day I tabled my amendments, I received an email from a woman who cares for her adult son with complex needs. She has a bank account to receive his benefits, from which she pays for his care. Under the terms of the Government’s proposal, all her bank accounts would be connected to his payments and therefore open to monitoring. Caring for an adult child is a heavy burden for a parent. Many parents do it with a love-filled grace that is humbling to witness, but it is a task that is out of season with the life that most of us live and all of us expect, in which children grow up, leave home and, as our strength wanes, come to our aid. It is also a service that the Government—and, by extension, all the rest of us—rely on.
In 2023, the University of Sheffield and Carers UK estimated that unpaid care, largely from family members, saved UK plc a whopping £162 billion a year, dwarfing the £120 million the Government expect to retrieve by these measures. It is nothing less than cruel to make a claimant or carer anxious, let alone homeless. But, if I cannot appeal to the Government’s compassion, I hope they will consider this: some who have contacted me suggested that they would no longer be prepared to continue to hold accounts on behalf of others; others suggested that their landlords would not be prepared to let them rent; and one said that their mental health had already suffered at the prospect. How many families need to put caring responsibilities back on the state, how many landlords need to make people on benefits homeless and how many people need to seek support from mental health services before the advertised gains are eroded?
For the life of me, I cannot work out whether these measures are intended to hurt or whether a focus on the shiny prospect of AI to sort out the DWP’s problems led incrementally to this place. Whichever it is, the measures are cruel to a degree that should worry us all. In a later group of amendments, we will discuss the capacity for technological systems to malfunction. Horizon might be top of mind, but Nationwide, McDonald’s, Tesco, Sainsbury’s, Greggs, 999, air traffic control and public bodies, including the NHS and DWP, have all experienced technology failures where service provision suffered.
I am not against technology—we live in a world organised by technological systems—but introducing a system that may impact the finances of up to 40% of the UK’s population, including the most vulnerable, the poorest and the oldest, without checks and balances and, indeed, while downgrading the protections on automated decision-making, is dangerous.
Can the Minister tell the Committee what plans the DWP has for when things go wrong, when people have their benefits stopped and their children go hungry because the computer says no? Can he tell us how it will prevent a repeat of the hounding of people over so-called fraudulent payments, as is currently being reported in relation to the carer’s allowance, until they lose their homes, their jobs and their mental health as a result of overpayments? In many cases, the overpayments were the department’s own fault and, in one case, involved as little as 30p a week. What has the department learned from a similar Australian scheme that, over 12 months, resulted in 1 million additional welfare payments being stopped, often without warning and notified by text with no human to complain to? That scheme was abandoned as it became unworkable.
My Lords, it is a pleasure and a privilege to support that tour de force from the noble Baroness, Lady Kidron. I do not need to repeat it but, to summarise, I completely agree with the opinion from Matrix Chambers that, in addition to its immorality, this provision is in contravention of Article 8 of the European Convention on Human Rights on respect for private and family life—relating to correspondence in particular. It is not necessary or proportionate, as we have heard. It is discriminatory and, for the purposes of the convention, is not in accordance with law. Once more, as we have heard, promising the possibility of guidance in future is no substitute for properly confining a power of this kind. Instead, the power is breathtaking in its scope and in its intrusive nature over the most sensitive financial and other personal information that could be gleaned this way.
It is an intrusion and an indignity, because the breaches of privacy are not confined to vulnerable people who are on benefits—not only non-means-tested benefits but means-tested benefits too. They are also an intrusion on the financial privacy of those who have linked accounts, whether a family member who is helping out by paying carers, landlords and so on, or a family member who gives a small gift to a vulnerable person on benefits. Perhaps that is the Government’s intention—I do not know—but it is breathtaking in its sweep and in the number of citizens and people in this country who will be caught up in it. That is what makes it disproportionate and not in accordance with law, resting as it does on hypothetical guidance.
The discriminatory aspect cannot be emphasised enough. There are, broadly speaking, two categories of people for these purposes in these islands: those who earn, have inherited or otherwise have enough wealth to come within the scope of HMRC, and who should pay tax and not evade it—that is, not defraud other taxpayers and the country as a whole; and those who are on benefits, whether means-tested or universal. Neither category of humanity should be exempt from fraud investigation, but nor should there be a discriminatory approach to policing any potential fraud. Why is it that, as we heard from the noble Baroness, Lady Kidron, we have this breathtaking snooper’s charter for those on benefits but a much more targeted approach to those who should be paying taxes? That discrimination cannot be justified.
What is the difference between a trawl looking at people who are seeking to avoid tax, which is not a crime, and one looking at those who are possibly mis-stating the extent of their assets? In the noble Baroness’s view, how is the surveillance different in terms of this Bill?
I am grateful to the noble Baroness. It is not just my view. It was put very well by the noble Baroness, Lady Kidron, and, as I recall, is outlined in the legal opinion. HMRC’s powers are more targeted and have more safeguards.
When the noble Baroness says “more targeted”, in what way are they more targeted? That is what I would like to know.
They relate to individual people by name, not whole sweeps of people who have done nothing wrong but get a particular benefit.
What I am advocating to the Committee is that, in terms of our approach in this country to everyone in either category—or to people who are sometimes in both categories because they are, for example, entitled to some universal benefits but none the less must pay tax on their earnings, inheritance or whatever—the appropriate approach is a targeted approach beginning with at least some reasonable suspicion that a person’s financial matters are a cause for concern. Once there is reasonable suspicion—not even hard proof—because of their activities, that should be the trigger for an intrusion into their affairs. We have had that approach to privacy in this country for a very long time; it is the approach that, broadly speaking, is entrenched in Article 8 of the convention. Even if one does not like human rights conventions, it is none the less a tradition that people in this country—not just lawyers—have long understood.
Further, and in reference to the remarks attributed to the noble Lord, Lord Anderson of Ipswich—who is not in his place, which is the reason why I am also risking being sensible—it is absolutely flabbergasting that there are greater checks and balances for investigating matters of national security than for investigating what could be minor benefit fraud. Take, for example, the person giving a Christmas present to their pensioner relative, or to a relative who is not able to work: that could trigger a response in the algorithm suggesting that this is somebody who is no longer worthy of the benefit or who, worse still, should face criminal charges or even potential incarceration.
I cannot say how horrified I am that the Government should have proceeded with a measure of this kind even as we still learn about the extent of the injustice perpetrated on the postmasters. After what we are just beginning to understand about the postmasters, I cannot understand why the Government would allow this kind of discriminatory intrusion to be turbocharged by AI and inflict the potential for the same type of injustice—not just for a limited cohort of people who were unfortunate enough to be serving their communities by working as postmasters—on millions of people in the United Kingdom.
This is what Committee on a Bill is for. I will therefore calm myself in the knowledge and belief—and certainly the hope—that, in his response, the Minister will at least offer to meet with Members of the Committee who have put their names to the clause stand part notice from the noble Baroness, Lady Kidron, and with campaigners and experts to hear a little about the detail of the concerns and to compare this provision with the other provisions, as the noble Baroness, Lady Buscombe, suggested in relation to national security, or indeed for tax fraud. Nobody is suggesting that fraud should be perpetrated with impunity, but we must learn from the mistakes of injustices already perpetrated. They are perpetrated because of blanket trust in the authorities and AI and a lack of checks and balances. There were plenty of humans in the loop at the Post Office, but that is not enough. This is a sweeping power that will lead only to intrusion, discrimination and the worst kind of injustice. In the meantime, before that moment even comes, millions of people will live in fear.
My Lords, I will address, first, the exclusion of Clause 128 and, secondly, Amendment 219 in my name.
I spoke at Second Reading to oppose Clause 128. I was a little too late to put my name to the clause stand part notice in the names of the noble Baronesses, Lady Kidron and Lady Chakrabarti, and the noble Lords, Lord Clement-Jones and Lord Anderson. I would therefore like to address a few things relating to that before I move on.
This clause creates two kinds of citizen: those who are entitled to financial privacy and others who are not entitled to any privacy, just because they happen to be poor, old, sick, disabled, infirm and unfortunate. Hopefully, the Minister can explain the rationale for creating this form of discrimination. This discrimination will particularly affect women, because a lot of women receive social security benefits, and people of colour, who are generally paid poorly and often have to rely upon universal credit and other benefits to make ends meet. Hopefully the Minister will also be able to tell us how this squares with the levelling-up agenda. Certainly this clause does not really provide any fairness at all.
I have received lots of emails and letters and met individuals who are very concerned, as earlier speakers articulated, that they will be made homeless because their landlords will not want their bank accounts to be put under surveillance. What assessment have the Government made of the impact that this clause may have on future homelessness?
My Lords, I will speak in favour of the amendment to which I have added my name, with other noble Lords here today, and also to some of the other amendments in the group. I find it interesting having to follow the noble Lord, Lord Sikka. Quite often we disagree on issues, and we are probably coming at this from different angles, but actually we have come to the same conclusion.
Noble Lords will know of my concerns raised at earlier stages about automated decision-making. We have to ensure that there is always human intervention but, even when there is human intervention, things can go seriously wrong. When I first saw this proposal for the mass trawling of bank accounts, I have to say that the first thought that came into my mind was, “This is Big Brother”, so I was not surprised when I received an email and a briefing from Big Brother Watch. I thank Big Brother Watch for that briefing; I will quickly dip into some of the points it makes, though there are many more.
People may find it interesting that the noble Lord, Lord Sikka, and I are speaking on this amendment from different angles. Let me be quite clear: I am a classical liberal. Some people call me a libertarian. I believe in a smaller state and government doing less, but there has to be a state to help those who cannot help themselves and people who have fallen on hard times. For some people, that is all they have. They have only the state benefit. There are no local community organisations or civil society organisations to help them, and therefore you have to accept that role for the state.
Those people are quite often the most vulnerable, the least represented and unable to speak up for themselves. I do not want to patronise them, but quite often you find that. When I saw this, I thought, “First of all, this is going to force third-party organisations to trawl”—I use the term advisedly—“customers’ accounts in search of matching accounts”. When we talk about those third-party organisations, we are talking about banks, landlords and a number of other organisations that have some financial relationship with those individuals. Some estimates suggest that approximately 40% of the population could be vulnerable to being trawled.
I am also worried about the precedent that this sets. I know that the noble Lord, Lord Sikka, talked about this in a different way. He would perhaps like this power to be extended. I do not want this power at all. I do not want it to be extended to others. I just do not want it at all.
I also worry about what this surveillance power does to the presumption of innocence. Are we just trawling everyone’s accounts in the hope that they will be found guilty? While I do not always agree with the Information Commissioner’s Office, we should note that it does not view these powers as proportionate.
One general concern that a number of noble Lords have is about AI and, in particular, the transparency of datasets and algorithms. We would want to know, even if we do not understand the algorithm itself, as the noble Lord, Lord Clement-Jones, and I discussed in a debate on earlier amendments, what these algorithms are supposed to be doing and what they are looking for in trawling people’s bank accounts.
There are some precedents for this. We see from financial institutions’ suspicious activity reports that they have a very high false-hit rate. I have a friend who is a magistrate, who told me that she heard a case about a family who wanted to get back access to their bank account. She felt that they were under suspicion because of their ethnicity or faith, and said to the bank, “You have not made a clear case for why we should freeze this account. This family has suffered because they are not able to access their bank account”. Think about that mistake being repeated over and over with false positives from this data. The noble Baroness, Lady Kidron, was right to remind us that this is all against the background of the Horizon scandal. Even when people intervene, do they speak up enough to make sure that the victims are heard, or does it need an ITV drama to raise these issues?
My Lords, it is a great pleasure to follow the noble Lord, Lord Kamall. Although we probably come from very different positions on the role of the state, I agree with virtually everything that he said. I apologise for popping up at this late stage of proceedings on the Bill but, as someone with a long-standing concern about social security matters, I was shocked by the inclusion of these powers and want to add my support to those opposing them and, should this opposition prove unsuccessful, to the very sensible set of recommendations made by my noble friend Lady Sherlock.
The Child Poverty Action Group, of which I am honorary president, and Z2K warn that the stakes are high for claimants, as getting caught up in an error and fraud investigation can lead to the wrongful suspension and/or termination of their benefits. They give some horrendous examples of where this has happened. I will read just one: “A claimant with severe mental health problems whose main carer had recently passed away had his UC suspended in October 2023 by the UC case review when he was unable to obtain and upload bank statements on request. The suspension continued for four months and he was unable to pay for food, electricity or heating. When he was referred for benefits advice and his welfare rights adviser contacted the UC case review team, she was told that claims under review are randomly chosen and they are not targeted in any way”. This is someone with mental health problems left without any money; this could become the norm under this proposal.
The briefing from the CPAG and Z2K also cites the perspective of Changing Realities—families with experience in claiming low-income benefits. One warns that
“it will put folk off claiming altogether”.
I always remember, when I worked at the CPAG, getting a phone call from a woman who started by saying, “Please don’t think I’m a scrounger”. I am afraid that is still very much how people often feel about claiming benefits. Treating all social security recipients as potentially fraudulent can only increase the stigma associated with claiming. Amendment 219 in the name of my noble friend Lord Sikka is highly pertinent here. The point has already been made, but how would we feel if we knew that our bank accounts could well be scrutinised for potential tax evasion? I realise that I should declare an interest: as a pensioner, ultimately my bank account will be trawled, but that is down the line. Underlying this is a double standard that has operated year after year in the treatment of social security fraud and tax fraud.
The CPAG and Z2K also warn that some of the most marginalised people in our society could get caught up in these speculative searches. Given this, can the Minister explain why—I believe this is still the case—there is no equalities impact assessment for these provisions? Disabled people’s organisations are very worried about the likely implications for their members, such as in the case of disabled people who set up bank accounts to pay for their social care. They warn of the potential mental health impact as existing mental distress and trauma could be exacerbated by the knowledge that they are under surveillance—a point made by the noble Baroness, Lady Kidron.
The Government state that they
“are confident that the power is proportionate and would operate in a way that it only brings in data on DWP claimants, and specifically those claimants where there is a reasonable suspicion that something is wrong within their claim”.
Given the evidence of people already being wrongfully targeted for fraud and the strongly expressed view of organisations such as Justice, as well as the Information Commissioner, that the measures are disproportionate and therefore arguably unlawful, can the Minister say on what evidence that confidence is based? Given this confidence, I hope that the Government will accept without demur Amendments 220 to 222 in the next group from my noble friend Lady Sherlock.
Picking up what my noble friend Lord Sikka said, what is the breakdown between suspected fraud and error? It is not helpful that they are always talked about as though they are one and the same thing. The Government have argued that one reason the power is necessary is to provide the tools to enable the DWP to
“minimise the impact of genuine mistakes that can lead to debt”.
Try telling that to recipients of carer’s allowance who have been charged with fraud as a result of genuine mistakes relating to the earnings threshold. The fact that the DWP already has the information and power it needs to act to ensure that debts do not accrue in this situation, yet in countless cases has not used it until the point where very large sums may be owing, does not instil confidence, as mentioned by the noble Baroness, Lady Kidron.
On Amendment 303, which relates to Amendment 230, one of the criticisms of these provisions has been the lack of consultation. Has the Social Security Advisory Committee been consulted? If so, what was its response; if not, why not?
In conclusion, I support the opposition to Clause 128 and Schedule 11 standing part of the Bill, but so long as they do stand part, I hope very much that the Minister will take seriously the amendments in the name of my noble friend in this group and the next two.
My Lords, I was also too late to put my name to these stand part notices for Clause 128 and Schedule 11. There must have been a stampede towards the Public Bill Office, meaning that some of us failed to make it.
At Second Reading, I described Clause 128 as “draconian”. Having dug into the subject further, I think that was an understatement. Data protection is a rather dry subject and, as the debates throughout this Committee stage have shown, it does not generate a lot of excitement. We data protection enthusiasts are a fairly select group, but it is nice to see a few new faces here today.
The Bill runs to 289 pages and is called the Data Protection and Digital Information Bill. Nothing in that name suggests that around 20 pages of it relate, in effect, to giving the Government unlimited access to the bank accounts of large swathes of the population without suspicion of any wrongdoing—20 pages is longer than many Bills. I wonder what the reaction in this Committee and the other place might have been if those 20 pages had been introduced as a stand-alone Bill—called, perhaps, the government right to access bank account information Bill. I suspect that we might have had a few more people in this Room. It feels as if this draconian clause is being hidden in the depths of a Bill that the Government perhaps felt would not generate much interest. It is particularly concerning that it was dropped into the Bill at the last minute in the other place and has not, therefore, received scrutiny there either. This sort of draconian power deserves much more scrutiny than on day 6 in Committee in the Moses Room.
I hope that my desire to stamp out fraud is well known—indeed, I think I can probably describe myself as rather boring on the subject—so I have a lot of sympathy for the Government’s underlying intention here. However, a right to require banks to carry out suspicionless surveillance over the bank accounts of anybody who receives pretty much any kind of benefit, directly or indirectly, is a huge intrusion into privacy and feels completely disproportionate. Others have covered the detail eloquently, so I just want to ask a number of questions of the Minister—I see that we have had a viscount swap at this stage.
I have been trying to work out exactly which accounts could be covered by this requirement. Schedule 11 is not the easiest document to read. It seems clear that if, for example, I am a landlord receiving rent directly from the benefit system on behalf of a tenant, the account of mine that receives the money would be covered, as would any other account in my name. However, would it also catch, for example, a joint account with my wife? I think it would. Would it catch a business account, or an account for a charity where I am a signatory, a director or a trustee? I am not sure from reading it, I am afraid. Can the Minister clarify that?
Once received, the information provided by the banks may be used
“for the purposes of, or for any purposes connected with, the exercise of departmental functions”.
That seems extremely broad, and I cannot find anything at all setting out for how long the information can be retained. Again, can the Minister clarify that?
As well as being a data protection enthusiast, I am also an impact assessment nerd. I have been trying to work out from the impact assessment that accompanies the Bill—without much success—how much money the Government anticipate recovering as a result of these proposed rights, as well as the cost to the banks, the department and any other parties in carrying out these orders. The impact assessment is rather impenetrable—I cannot find anything in it that covers these costs—so I would be grateful if the Minister could say what they are and on what assumptions those numbers are based.
The noble Lord, Lord Kamall, mentioned unintended consequences. I echo his points: this is really important. Putting additional onerous obligations on banks may make them decide that it is too difficult to provide accounts to those in receipt of benefits. Access to bank accounts for vulnerable people is already an issue, and any incentive to make that worse is a real problem. As the noble Lord pointed out, we have a good example of that with PEPs. All of us, I suspect, have found it at least difficult to open an account; some of us have had accounts refused or even closed simply because we have made it difficult for the banks to act for us. The same risk applies to landlords. Why would a landlord want to receive money from housing benefits directly when it will mean that all of his bank accounts and linked accounts will be looked at? He will simply say no. We are therefore reducing the pool of potential accommodation available to housing benefit claimants.
Most of what needed to be said has been said excellently and clearly by the other speakers. I have just three specific questions that I urge the Minister to answer. However, an important point of context needs to be made first on the opposition of the finance industry to these proposals, which is clear and unambiguous. It could be thought that the finance industry just does not want to be bothered and does not care about fraud, but in fact it is making the point that the Government have failed to come up with an overall fraud strategy. This is just a one-off idea thrown up. Some bright spark thought, “Well, we could put this into the Bill. We’ve always wanted to have this sort of overweening power. Let’s shove it in here and hope no one notices”. We need a proper fraud strategy, as other speakers have said. We lose a lot of money to fraud, so none of us is against appropriate measures to deal with it, but this is a one-off, completely ill-timed and ill-thought-out addition to the state’s powers.
I turn to my three questions. First, I have no doubt that the Minister has a predisposition to oppose the state being able to interfere in our private information—I do not doubt that that is his starting point in these discussions. The problem with this proposal is that there is no way of ring-fencing the information required for the purposes of the DWP from all the other information that is disclosed by looking at someone’s bank account. Their whole life can be laid out in their bank account and other statements. This is a widespread, total intrusion into people’s privacy. Does the Minister accept that the information required for the purposes of the DWP cannot be ring-fenced from everything else that someone’s bank account reveals?
Secondly, I have several times heard the Minister discuss improving take-up of pension credit. Does he believe that this will encourage people to claim the pension credit to which they are entitled? It will clearly discourage them. Has this been properly assessed? We know that one big reason why people do not claim pension credit is the state’s intrusion into their private affairs. People do not like it. For some people, seeing an extension of the state’s ability to intrude into their private affairs will discourage them from applying. As I say, the Minister has rhetorically encouraged people to claim their pension credit; in practice, this proposal will discourage people. Does he accept that?
Thirdly, we are having three debates on this issue, and I think this question may arise more in the next group, but I will ask it now so that I can come back and ask it again later. People have referred to claimants, but this also covers the state pension. It is possible to defraud the state pension, but it is nevertheless an income. Pension or income—whatever you call it; I do not think we should get too hung up on the vocabulary—it is paid as of right, and people are entitled to these benefits.
Another oddity of our state system concerns identical benefits. Some people, like me, who have never been contracted out of the state scheme, have a full state pension, but a lot of people were contracted out into private schemes and personal pensions. Now, because I have that state pension, the state can intrude into my bank account. The state is paying me the pension; it can look at my bank account under these provisions.
However, if my pension were payable by Legal & General Assurance Society or the BP pension fund, they would not have the right to demand access to my bank accounts. I am just pointing out that we would react in horror if this Act gave power to the BP pension fund to trawl through my bank accounts. We would react in horror if we were giving power to Legal & General Assurance Society to go through my bank accounts, yet the Government believe that the state should have this overweening power. Does the Minister accept that and does he think that it is wrong?
My Lords, I speak as someone who was a Minister at the Department for Work and Pensions back in 2017. I well remember when fraud and error became a new addition to my brief. I felt very strongly about this area because, when I first started as a Minister there, I was incredibly shocked by the level of fraud. Someone talked about having a fraud strategy, but this area is very complex. In the years since then, we have learned that the greatest incidence of fraud is people misstating their assets. Everybody in the Room will know that you may hold only a certain amount of assets to claim benefits, whatever your situation, unless the benefits are non-means-tested or disability benefits.
In 2017, the Treasury ran a controlled pilot. I do not know the details of how it was run, but I saw the results and they were extraordinary. The pilot was at one bank, using the powers the Treasury already had in relation to those who may be avoiding tax—which of course is not a crime—to see whether there was an issue with benefit claimants misstating the extent of their assets when claiming. The extraordinary thing was that they found that between 25,000 and 30,000 claimants at that one bank alone were misstating their assets.
So we know that there is a real problem here, and we know that fraud itself has gone up and up. We are unable to calculate all the fraud in the system because, under the legacy system, we found it difficult to check the degree of housing benefit fraud and so on. Maybe it is easier now under universal credit—I hope my noble friend the Minister will be able to tell us that it is—to check on people in receipt of benefits who claim to be living alone when they are not.
This is a very nuanced area, but all I can say is that we knew we had a major problem with people misstating their assets. We had to deal with that, but we could not do so without working out how to do it with care, bearing in mind all the issues that noble Lords have raised today: doing it in a proportionate way, in a way that does not conflict with human rights and in a way that does not become mass surveillance of everyone. We should bear in mind that since 2011 taxpayers, the people actually funding the benefits system, including some benefit claimants themselves, have had their bank accounts checked to make sure that they are not avoiding tax, which is not a crime—I am talking not about evasion but about avoidance—while fraud in the benefits system is a crime.
We need to be quite careful. Some of the things that have been said today conflating this issue with Horizon are wrong. I have been reading the so-called facts that some of these lobbyists have written about how the clause is disproportionate and unfair and goes too far in terms of people’s privacy. The Department for Work and Pensions works tirelessly to try to do the right thing in the right way. This has not been thrown into the Bill at the last minute as if we have just dreamed it up. That discovery was seven years ago. The noble Lord, Lord Sikka, may laugh, but I do not see the relevance of an awful lot of what he was saying—about the noble Baroness, Lady Mone, and so on—to what we are discussing now.
The reality is that benefit fraud is a serious offence, depriving those who need it most of vital support. A lot of people have come up with cases of very difficult situations that people have to live through. Those are the people we want to support but, frankly, the bill at DWP for this one year is £290 billion. When I was there in 2019, it was £190 billion. We cannot afford to put up with benefit fraud, so we have developed this carefully constructed measure, which needs to be thought through with care. I am sure my noble friend will be able to answer a lot of the questions that have quite rightly been asked today in Committee.
The noble Baroness mentioned lobby groups that say the clause is disproportionate. The Information Commissioner has questioned the proportionality of this measure. Does she consider the Information Commissioner a lobby group?
No. With respect, I am talking about Justice, which I think referenced 40 organisations. There was no list of what those organisations are in the information it sent me. There is also Big Brother Watch and many others.
I just think that everyone needs to take, if I may use the word, a proportionate approach to this. We are talking about tackling a really serious offence. I think all noble Lords agree that we have to tackle fraud but I am sure, and hope, that my noble friend can reassure everybody. The current powers that the DWP has to ensure benefit correctness are mostly over 20 years old. Over that time, fraud has evolved and become increasingly sophisticated. The system currently relies on self-verification for many factors, and that is one of the issues. I know it would sound so much better if people could find another way to check whether someone is being honest about their assets, but the problem is that a lot of this is to do with self-verification.
The suggestion was made that this was carefully thought out and part of a long-term plan. Can the noble Baroness therefore explain why it was introduced into the Bill at such a late stage of its passage through the Commons, such that it did not receive any worthwhile consideration at all there?
I am sure my noble friend the Minister can talk about the particular timing of why it went into this Bill. Certainly in my time at DWP, the difficulty we had was finding the right Bill that we could add it to. This is one of the things that is really hard about being a Minister: you cannot just say, “This is something we have to do”. You have to find a route—like finding a route to market—to include a measure in a Bill that is relevant. This Bill is entirely relevant in terms of where we are now on data collection. The Minister and his team were right to choose this particular Bill.
I could go on.
I am sorry; I have spent a lot of time listening to others, and a lot of it has been slightly interesting to listen to, I have to say.
The measure will not enable the DWP to access any accounts, and the DWP will not be able to use this measure to check what claimants are spending. The DWP can request information only where there is a link between the DWP, the third party and the benefit claimant or recipient of a payment, and it will receive only minimum information on those cases where potential fraud and error are signalled. Once received, the DWP will look at each case individually through its business-as-usual processes and by using existing powers. That work will be undertaken carefully by a human and no automated decisions will be made. That is a really interesting and important point in terms of this measure. I now turn to my noble friend.
I am grateful to the noble Baroness, but could she point out where those restrictions actually are in the Bill? It says that an account information notice can include
“the names of the holders … other specified information relating to the holders … and … such further information in connection with those accounts as may be specified”.
It basically allows the DWP to ask for any information relating to those accounts. I do not see the restrictions that she has just spoken about.
It is important that my noble friend answers that question. The point is that if we find—I am sorry, I still speak as if I am involved with it, which I am not, but I promise noble Lords that I have spent so much time in this area. If the DWP finds that there is a link that needs pursuing then that obviously has to be opened up to some degree to find what is going on. Remember, the most important thing about this is that the right people get the right benefits. That is what the Government are trying to achieve.
My Lords, I note that the DWP has been passed a parcel by the Department for Science, Innovation and Technology—and I am not at all surprised. I am sure it will be extremely grateful to have the noble Baroness, Lady Buscombe, riding to its defence today as well. Also, attendance at this debate demonstrates the sheer importance of this clause.
We on these Benches have made no secret that this is a bad Bill—but this is the worst clause in it, and that is saying something. It has caused civil society organisations and disability and welfare charities to rise as one against it, joined by organisations as disparate as UK Finance, mentioned by the noble Lord, Lord Davies, and the ICO itself. The ICO has gone into print to say that, for this measure to be deemed a necessary and proportionate interference in people’s private lives, to be in accordance with the law and to satisfy relevant data protection requirements, legislative measures must be drafted sufficiently tightly—et cetera. It has issued a number of warnings about this. For a regulator to go into print is extremely unusual.
Of course, we also have Big Brother Watch and the Child Poverty Action Group—I pay tribute to the noble Baroness, Lady Lister—the National Survivor User Network, Disability Rights UK, the Greater Manchester Coalition of Disabled People and the Equality and Human Rights Commission. We have all received a huge number of briefings on this. This demonstrates the strong feelings, and the speeches today have demonstrated the strong feelings on this subject as well.
There have been a number of memorable phrases that noble Lords have used during their speeches. The noble Baroness, Lady Kidron, referred to a “government fishing expedition”. The noble Baroness, Lady Chakrabarti, called it “breathtaking in its scope”. I particularly appreciated the speech of the noble Lord, Lord Kamall, who said, “What happened to innocence?” In answer to the noble Baroness, Lady Buscombe, this is not “nuanced”: this is “Do you require suspicion or do you not?” That seems to me to be the essence of this.
I was in two minds about what the noble Lord, Lord Sikka, said. I absolutely agree with him that we need to attack the fat cats as much as we attack those who are much less advantaged. He said, more or less, “What is sauce for the goose is sauce for the gander”. The trouble is that I do not like the sauce. That was the problem with that particular argument. The noble Baroness, Lady Lister, talked about stigma. I absolutely agree. The noble Lord, Lord Vaux, more or less apologised for using the word “draconian” at Second Reading, but I thought the word “overreach” was extremely appropriate.
We have heard some powerful speeches against Clause 128. It is absolutely clear that it was slipped into the Bill alongside 239 other amendments on Report in the Commons. I apologise to the Committee, but clearly I need to add a number of points as well, simply to put on record what these Benches feel about this particular clause. It would introduce new powers, as we have heard, to force banks to monitor all bank accounts to find welfare recipients and people linked to those payments. We have heard that that potentially includes landlords and anyone who triggers potential fraud indicators, such as frequent travel or savings over a certain amount. We have seen that the impact assessment indicates that the Government’s intention is to “initially”—that is a weasel word—use the power in relation to universal credit, pension credit and the employment and support allowance. We have also heard that it could be applied to a much wider range of benefits, including pensions. The Government’s stated intent is to use the power in relation to banks in the first instance, but the drafting is not limited to those organisations.
Of course, everyone shares the intent to make sure that fraudulent uses of public money are dealt with, but the point made throughout this debate is that the Government already have power to review the bank statements of welfare fraud suspects. Under current rules, the DWP is able to request bank account holders’ bank transaction details on a case-by-case basis if there are reasonable grounds to suspect fraud. That is the whole point. There are already multiple powers for this purpose, but I will not go through them because they were mentioned by other noble Lords.
This power would obviously amend the Social Security Administration Act to allow the DWP to access the personal data of welfare recipients by requiring the third party served with a notice, such as a bank or building society, to conduct mass monitoring without suspicion of fraudulent activity, as noble Lords have pointed out. Once issued, an account information notice requires the recipient to give the Secretary of State the names of the holders of the accounts. In order to do this, the bank would have to process the data of all bank account holders and run automated surveillance scanning for benefit recipients, as we have heard.
New paragraph 2(1)(b) states that an account information notice requires,
“other specified information relating to the holders of those accounts”,
and new paragraph 2(1)(c) refers to other connected information, “as may be specified”. This vague definition would allow an incredibly broad scope of information to be requested. The point is that the Government already have the power to investigate where there is suspicion of fraud. Indeed, the recently trumpeted prosecution of a number of individuals in respect of fraud amounting to £53.9 million demonstrates that. The headlines are in the Government’s own press release:
“Fraudsters behind £53.9 million benefits scam brought to justice in country’s largest benefit fraud case”.
So what is the DWP doing? It is not saying, “We’ve got the powers. We’ve found this amount of fraud”. No, it is saying, “We need far more power”. Why? There is absolutely no justification for that. No explanation is provided for how these new surveillance powers will be able to differentiate between different kinds of intentional fraud and accidental error.
We have heard about the possibility and probability of automated decision-making being needed here. I do not know what the Minister will say about that, but automated decision-making would be concerning enough; if, instead, the DWP chooses to make these decisions through human intervention, the scale of the operation will require a team so large that this will be an incredibly expensive endeavour, defeating the money-saving mandate underpinning this proposed new power—although, as a number of noble Lords have pointed out, we do not know from any impact assessment what the Government expect to gain from this power.
It is wholly inappropriate for the Government to order private banks, building societies and other financial services providers to conduct mass algorithmic suspicionless surveillance and reporting of their account holders on behalf of the state in pursuit of these policy aims. It would be dangerous for everyone if the Government reversed the presumption of innocence. This level of financial intrusion and monitoring, affecting millions of people, is highly likely to result in serious mistakes and sets an incredibly dangerous precedent.
This level of auditing and insight into people’s private lives is a frightening level of government overreach, in the words of the noble Lord, Lord Vaux, more so for some of the most marginalised in society. This will allow disproportionate and intrusive surveillance of people in the welfare system. In its impact statement, the DWP says it will ensure that data will be
“transferred, received and stored safely”.
That is in contrast to the department’s track record of data security, particularly considering that it was recently reprimanded by the ICO for data leaks so serious that they were reported to risk the lives of survivors of domestic abuse. With no limitations set around the type of data the DWP can access, the impact could be even more obscure.
We have heard about the legal advice obtained by Big Brother Watch. It is clear that, on the basis that,
“the purpose of the new proposed powers is to carry out monitoring of bank accounts”
and that an account information notice can be issued
“where there are no ‘reasonable grounds’ for believing a particular individual has engaged in benefit fraud or has made any mistake in claiming benefits”,
this clause is defective. It also says that
“financial institutions would need to subject most if not all of their accountholders to algorithmic surveillance”;
that this measure
“will be used not just in relation to detection of fraud but also error”;
and that this measure
“would not be anchored in or constrained by anything like the same legal and regulatory framework”
as the Investigatory Powers Act. It concludes:
“The exercise of the financial surveillance/monitoring powers contained in the DPDIB, as currently envisaged, is likely to breach the Article 8 rights of the holders of bank accounts subject to such monitoring”
It is clear that we should scrap this clause in its entirety.
My Lords, I thank the noble Baroness, Lady Kidron, and my noble friend Lord Sikka for introducing their amendments. I also thank all noble Lords who have spoken. I will speak to Amendments 223, 299, 302 and 303 in my name. I should probably say at this point that I am late to this party but, unlike the noble Lord, Lord Vaux, I am not a data protection specialist, I am afraid. However, I am a social security nerd, so I am here for this bit right now.
Since this is the first part of the Bill on DWP powers to tackle fraud, I need to add my little statement on the “fraud is bad” move. Fraud is a problem and has been getting worse across government. There have been scandals in procurement, of which the infamous PPE contracts are just one example. There is tax due that goes unpaid at scale and, in social security, the percentage of benefit expenditure lost to fraud has been rising under this Government. However, as my honourable friends made clear in the Commons, a Labour Government would take fraud seriously and pursue all those who seek to take money fraudulently or illegally from the state. They would also focus on helping people to avoid inadvertent overpayments, rather than just waiting for them to make mistakes and then coming down hard on them at that point. This should not need saying but, in some of the discussions on this Bill elsewhere, there has been a tendency to frame the debates rather along the lines of a classical fallacy: “Fraud is really bad. This will tackle fraud. Therefore, this must be really good”. I know that we are fortunate that in the Minister we have someone who is able to have a much more nuanced debate. I look forward to having exchanges in a way that recognises the important role of this House in scrutinising the powers that the Executive want to take unto themselves, which is exactly what Committees in the House of Lords do so well.
Scrutiny particularly matters here because, as the noble Lord, Lord Vaux, and my noble friend Lord Davies pointed out, all these amendments—more than 200 amendments, 38 new clauses and two new schedules—were introduced on Report in the Commons. My honourable friend Chris Bryant tried to recommit the Bill so that the Commons could discuss it, but the Government refused. The interesting thing is that in their anti-fraud plan back in May 2022, the Government announced that they planned to boost the DWP’s powers to get information from third parties when parliamentary time allowed. The noble Baroness, Lady Buscombe, made a fair point that departments have to wait for the right Bill to come along in order to use it, but the Government have known about this since 2022. They have had two years to draft the amendments, so although they might have had to wait for the Bill to come along, that does not seem a good enough reason for them to have waited until Report in the Commons to deposit them into the process. I hope the Minister will be able to explain the reasons for that.
My noble friend Lady Chakrabarti and others have asked some important questions about the scale on which these powers will be used; I am going to come back to that in our debate on the next group. It is hard to know the scale from the information we have so far, but DWP clearly does know, or has a sense of it, because paragraph 85 of the impact assessment states:
“Using our model to estimate volumes of hits for this measure, over the 10-year appraisal period, internal analysis has estimated that in total there will be an additional 74,000 prosecution cases, 2,500 custodial sentences and 23,000 applications for legal aid”.
It has modelled the volume of matching hits that would require investigation. Can the Minister tell the Committee what that number is? Also, what assurance can he give us that DWP has the resources to investigate that number of hits in a timely manner?
Paragraph 2 of new Schedule 3B says that the account information notices can only cover data going back a year and that they must be done in the week before they are given to DWP. Is there any time limit on how long DWP has to act on the results that have been handed over to it?
I turn now to the amendments in my name. Some of them are quite detailed because these powers are astonishingly wide and it is not at all clear how they could be used. I have deliberately tabled a series of amendments—in three groups in order to make sure that we have a chance to go into detail—to try to get information out of the Government and find out what this is about.
Amendment 223 is a minor probing amendment that would delete paragraph 3(1) of new Schedule 3B, which Schedule 11 to the Bill would insert into the 1992 Act. I will not rehearse it here but can the Minister explain what that provision is for and what its limits are? Neither I nor the people I have spoken to in financial services can understand why it is needed.
The noble Baroness, Lady Kidron, and others mentioned the fact that the Information Commissioner said he could not provide to Parliament his assurance that this measure is proportionate. My other amendments in this group are therefore designed to try to understand the impacts better. Amendment 302 would prevent these new powers coming into force automatically, while Amendment 303 would require the Secretary of State to fulfil several requirements before laying regulations to commence the powers. Amendment 299 is a minor consequential amendment. The effect of this is that the Secretary of State would have to issue a call for evidence, to inform the creation of the first code of practice, and consult relevant bodies. They would also have to lay before Parliament statements on key issues, of which I will highlight two.
The first would say whether and how AI will be used in exercising these powers, as well as how those proposals will take account of protected characteristics; this was touched on by my noble friend Lady Lister and others. That benefits often engage protected characteristics is in the nature of social security. Sickness and disability benefits engage disability, obviously; pensions engage age; benefits relating to children may engage age and also indirectly engage sex; and so on. The National Audit Office has warned that machine learning risks bias towards certain vulnerable groups and people with protected characteristics. So, what external governance or oversight is there to ensure that, once data are collected on the scale envisaged here, we do not end up with a mass breach of equality law?
The second issue I want to highlight concerns the provision that will be made to ensure that individuals subject to investigation do not experience hardship during it or lasting detriment afterwards. Given the comments of my noble friend Lady Lister about the cases from CPAG, can the Minister say whether a claimant’s benefits will be kept in payment while they are investigated following the data that are surfaced as a result of these trawls?
I am concerned that, given the potential scale of hits, a claimant who had, say, inadvertently breached the capital limit but then found themselves at the back of a long queue to be investigated could end up paying back very large sums. The Minister will be aware of the recent media coverage, which others have mentioned, of how the DWP is treating people who were overpaid carer’s allowance, a benefit that pays £81.90 a week to people providing at least 35 hours a week of unpaid care. It is a cliff-edge benefit—if your net earnings are under £150 a week, you get the lot; if they are over it, you get nothing—so a small rise in the minimum wage or a change in tax thresholds or rates can be enough to make someone entirely ineligible overnight, even if nothing in their own circumstances changes.
As my noble friend Lady Lister said, DWP’s IT systems can apparently flag when a carer’s income breaches the threshold, but the department does not always act on those flags, allowing carers to rack up potentially thousands of pounds’ worth of overpayments. The Guardian has investigated this issue; I shall mention two of the cases that it reported. First, an unpaid carer with a part-time charity job unknowingly breached the threshold by an average of £4.40 a week—£58 in total—because of the automatic uprating of the national minimum wage. Because that left her not eligible for anything, she ended up being told to repay £1,715, including a civil penalty.
In the second example, a woman caring for her husband, who has dementia and Parkinson’s, was told to repay nearly £4,000 for inadvertently exceeding the earnings threshold, having calculated the earnings from her zero-hours job on a monthly basis, as she thought the rules required, rather than on the four-weekly basis that they actually require; the rules around allowable costs and earnings are quite complicated. Crucially, according to the Guardian, she was told that, if she appealed, it could cost her even more. The Guardian quotes from a DWP letter telling her that, if she challenged the repayment order,
“the entire claim from the date it started will be looked at, which could potentially result in the overpayment increasing”.
Is that standard practice? Is DWP currently acting on all the alerts it receives of overpayments? If these powers are switched on, what safeguards will there be when that happens to protect millions of people from ending up paying back years of overpayments that DWP could have prevented?
Before embarking on investigations on this scale, we need to understand more about how this measure will work. We have had some excellent questions in Committee from the noble Lord, Lord Vaux, and others; I look forward to the Minister’s reply.
My Lords, I thank all those who have spoken today. I have been made well aware of the strong views expressed about this measure in Committee. I thank the noble Baroness, Lady Sherlock, for her kind remarks. She is right: I take all these matters extremely seriously. I have listened carefully to all the speeches, although I might not agree with them. Many questions have been asked. I will attempt to cover them all, of course; I doubt that I will be able to but I assure noble Lords that it is likely that a long letter will be required after this. Obviously, I will reflect on all the speeches made in Committee today.
I start by talking about the timing of the introduction of this measure. The noble Baroness, Lady Sherlock, said that the measure was introduced, in her words, “on the late side”. As she alluded to, the DWP published the Fraud Plan in May 2022, where it outlined a number of new powers that it would seek to secure when parliamentary time allowed. In answer to her question and others, in the parliamentary time available, the DWP has prioritised our key third-party data-gathering measure, which will help it tackle one of the largest causes of fraud and error in the welfare system. We will not sit back and ignore an opportunity to bring down these unacceptable losses and better protect taxpayers’ money. I will expand on all of that later in my remarks.
Before attending to the themes raised and addressing the amendments, it is important to set out the context for the power for which we are legislating. Fraud is a serious and damaging UK-wide issue, accounting for more than 40% of all crime. To be fair, many speeches alluded to that. The welfare system is also a target for fraudsters, and we are seeing increasingly sophisticated attacks on a scale not seen in the past. We all have our own experiences at home of fraudsters who use entirely different methods, not linked to the benefits system at all, to try to gain money through illicit means.
In 2022-23, the DWP paid out more than £230 billion in benefits and payments to people across Great Britain. I very much took note of the figure that my noble friend Lady Buscombe raised. I say to the Committee that this figure is forecast to rise to nearly £300 billion by 2024-25, in quite short order, so this is a really serious issue to address. However, more than £8 billion has been overpaid in each of the past three years because of deliberate fraud against the state or because genuine errors have been made.
To assist the noble Baroness, Lady Lister, to whose speech I listened carefully: fraud, not error, is the biggest cause of welfare overpayments, totalling £6.4 billion of the £8.3 billion overpaid last year. The noble Lord, Lord Vaux, also asked about the figures. These losses arise largely because people are intentionally and knowingly taking money that they are not entitled to. Nor is this solely organised fraud; the vast majority comes from individual claimants. We should not underestimate the lengths to which some will go to take money they are not entitled to, or to promote ways of defrauding us to a wider audience. This new legislation is not just about protecting the taxpayer; it will also help those who make genuine mistakes in their claims, and swift action will prevent them building up large overpayments.
Some people have said that the department has the powers that it needs to fight fraud and error—I think that was alluded to even today. However, some of the current powers that we have to ensure benefit correctness are over 20 years old—a point that I think my noble friend Lady Buscombe made. In that time, fraud has evolved and become increasingly sophisticated, and we must keep pace with the fraudsters. It is for this reason that the Government are bringing forward these new third-party data powers, as set out, as I said earlier, in the Fraud Plan.
I apologise for interrupting, but can the Minister show us where in the Bill those restrictions on the information that can be requested reside? As I read it, as I mentioned to the noble Baroness, Lady Buscombe, paragraph 2(1) of new Schedule 3B, as inserted by Schedule 11 to the Bill, is pretty wide when it refers to
“names of holders … other specified information relating to the holders … and … such further information in connection with those accounts as may be specified”.
So it appears that the DWP can ask for whatever it wants, rather than what the Minister just described.
That is a fair challenge and I will certainly be coming on to that. Later in my speech I have some remarks on the much more limited nature of this power, which I hope will reassure the noble Lord.
It is only when there is a signal of potential fraud or error that the DWP may undertake a further review, using our business-as-usual processes and existing powers—an important point. DWP will not share any personal information with third parties under this power, and only very limited data on accounts that indicate a potential risk of fraud or error will be shared with DWP in order to identify a claimant on our system. As I said earlier, I will say more about the limited aspects of this later in my remarks.
I am sorry to interrupt the Minister, but will he be coming on to explain what these signals are? He is almost coming to a mid-point between innocence and suspicion called “signals”—is this a new concept in law? What are we talking about and where in all of Schedule 11 is the word “signal”?
If the noble Lord will allow me, I would like to make some progress and I hope that this will come out in terms of what we may be seeking on a limited basis.
The first third parties that we will designate will be banks and other financial institutions, as the Committee is aware. We know that they hold existing data that will help to independently verify key eligibility factors for benefits.
This clause does not give DWP access to any bank accounts—a very important point—nor will it allow DWP to monitor how people spend their money or to receive sensitive information, such as medical records or data on opinions or beliefs.
As the noble Baroness, Lady Sherlock, mentioned—I want to try to answer one of her questions—this power cannot be used to suspend someone’s benefit. Cases that are flagged must be reviewed under existing processes and powers—business as usual, which I mentioned earlier—to determine whether incorrect payments are being made.
Our approach is not new. HMRC has long been using powers to request data at scale from banks on all taxpayers under Schedule 23 to the Finance Act 2011. Our approach carries similar safeguards. Tax fraud is no different from welfare fraud and should be treated similarly. This was a key point that the Prime Minister made only on Friday when he committed to bring DWP’s fraud and error powers more in line with those of HMRC. This is one clear area where we are seeking to do this.
This allows me to go on to very important points about safeguards. Not all the cases found through this power will be fraud. Some will be errors which the power will help to correct, preventing overpayment debt building up. Some cases may also have legitimate reasons for seemingly not meeting eligibility requirements, for example where claimants have certain compensation payments that are disregarded for benefit eligibility rules. In those cases, no further action will be taken. Our robust business-as-usual processes will ensure that all cases are dealt with appropriately.
Another question raised by the noble Lord, Lord Vaux, on safeguards was to do with the legislation. A key safeguard is that we cannot approach just any third party; there must be a three-way relationship between the department, the claimant and the third party. This safeguard will narrow the use of this power substantially and ensure that it is used proportionately, as these three-way relationships are limited, meaning that data cannot be gathered at scale from just any source for any purpose. Any third party that we want to get data from will need to be designated in affirmative regulations, which noble Lords will have an opportunity to scrutinise. These regulations will be accompanied by a code of practice. We will bring that forward, and we will consult on the code before presenting it to Parliament—which answers a question raised by, I think, the noble Baroness, Lady Kidron.
The power also ensures that we can request only very limited data on benefit recipients. I think this addresses a point raised by the noble Lord, Lord Vaux. We must work with key third parties to define what is shared, but our expectation is that this would be a name and date of birth or a unique payment number, along with the eligibility criteria someone has matched against: for example, a benefit claimant who has more savings than the benefit rules would normally allow.
Outside controls will apply here, too. DWP already handles vast amounts of data, including personal data, and must adhere to the UK GDPR and the Data Protection Act 2018.
On the point, which again was raised during this debate, about the remarks made by the Information Commissioner’s Office and its updated report on this measure, published as Committee started and which the Committee may be aware of, I was pleased to see that the commissioner now acknowledges that the third-party data measure is in pursuit of a legitimate aim, stating:
“I understand and recognise the scale of the problem with benefit fraud and error that government is seeking to address and accept that the measure is in pursuit of a legitimate aim. I am not aware of any alternative, less intrusive, means of achieving the government’s stated policy intent based on their analysis”.
I think that is a significant point to make, and it is a point with which I very strongly agree.
It is also worth pointing out that the paragraph I quoted follows immediately on that. That is the qualification that I quoted.
Yes, I am aware of that. I think the noble Lord was alluding to the point about proportionality. I listened carefully and took note of that, but do not entirely agree with it. I hope that I can provide further reassurances, if not now then in the coming days and weeks. The point is that there is no other reasonable way to independently verify claimants’ eligibility for the payment that they are receiving.
I turn to the amendments raised, starting with the stand part notice from the noble Baronesses, Lady Kidron and Lady Chakrabarti, the noble Lord, Lord Anderson of Ipswich, who is not in his place, and the noble Lord, Lord Clement-Jones. They and my noble friend Lord Kamall, who is not in his place, interestingly, all made their case for removing the clause, of which I am well aware. However, for the reasons that I just set out, this clause should stand part of the Bill.
In raising her questions, the noble Baroness, Lady Kidron, made some comparisons with HMRC. There are appropriate safeguards in place for this data-gathering power, which will be included in the code of practice. The safeguards for this measure will be equivalent to those in place for the similar HMRC power which Parliament approved in the Finance Act 2011.
When might we see the code of practice? It would be extremely helpful to see it before Report, as that might short-cut some of these discussions.
I will need to get back to the noble Lord on that, but perhaps I can reassure him that it is already being worked on. Noble Lords can imagine that, because of the sensitivity of these powers, we are working very carefully on this and making sure that it will be fit for purpose.
Can we see the draft code of practice before Report?
That is part of the answer that I gave to the noble Lord, Lord Vaux, which I think is a fair point.
The noble Baroness, Lady Kidron, asked about the code of practice and what steps my department will take to ensure transparency and accountability in the exercise of these powers if they are implemented. In the primary legislation, we will make provision to publish the code of practice, which will set out general guidance on how the third-party data power will work, as I have mentioned. We will develop the code of practice with relevant third parties, and it will be consulted on publicly before being laid before Parliament. We will explain what is expected of data holders and ensure full compliance by the DWP. This will provide assurance that we will operate transparently and will mirror the approach that we have taken with other DWP powers. Any changes to the code of practice, other than minor ones, will also be made in consultation with stakeholders.
The noble Baroness, Lady Kidron, stated that the power was too broad and the gist of one of her questions was that there is no need for all these benefits to be in scope. As the noble Baroness has demonstrated, there is a wide range of benefits and therefore potential avenues for fraudsters to seek to exploit or for error to creep in. That is why it is important that the power enables the department to respond proactively, as new fraud risks emerge.
That said, as the noble Baroness knows, the power will not be exercisable in relation to all the benefits that she listed, such as child benefit, because the legislation is drafted in such a way that it could reasonably be exercised only in relation to benefits for which the Secretary of State is responsible. I reassure the Committee that using Section 121DA of the Social Security Administration Act 1992 is the consistent approach that we take to defining benefits, safeguarding all existing legislation and accounting for a benefit being, for example, renamed or amended. It should be stressed that the listing of a benefit does not mean that this power can or will be exercised in relation to it. The conditions in the third-party data legislation must still apply, and therefore not all benefits will be subject to this measure. That is a very important point.
I would be convinced about the Government’s intentions, and would not press this amendment at the next stage, if the Minister can name just one big accounting firm which since 2010, as a result of a court judgment that said it was selling unlawful tax avoidance schemes, has been investigated, fined or prosecuted. If he can give me such an example then I will be convinced that the Government are seriously tackling tax fraud and its enablers.
The noble Lord has set me quite a challenge at the Dispatch Box. It is out of scope of today’s session but, having said that, I will reflect on his question afterwards.
I am aware that time is marching on. My noble friend Lord Kamall asked about burdens on banks. We believe that the burdens on banks will be relatively low.
The noble Baroness, Lady Sherlock, made a number of points; I may have to write to her to expand on what I am about to say. Removing the requirement for third parties to provide legible copies of information means that DWP could receive information that it cannot use; that is my answer to her points. This could limit the data that DWP receives and prevent us utilising the power in full, which could in turn reduce the savings due to be realised from this important measure.
I turn to the final amendments in this group, which were raised by the noble Baroness. They would place requirements on the Secretary of State to issue statements in the House and consult on the code of practice. We will talk more about the code of practice later on in this debate, and I have already made clear my firm opinions on it: we will take it forward and are already working on it. There will be a consultation that will, of course, allow anybody with an interest in this to give their views.
I turn to the requirement for a number of statements to be made in the House regarding the practical use of the measures before the powers can commence, such as on the role that artificial intelligence will play or assurances on any outsourcing of subsequent investigations. This is an important point to make and was raised by other Peers. I want to make it clear that this measure will be rolled out carefully and slowly through a “test and learn” approach from 2025, in conjunction with key third parties. To make these statements in the House would pre-empt the crucial “test and learn” period. I say again that discussions with the third parties are deep and detailed and we are already making progress; this point was made by the noble Lord, Lord Clement-Jones, on the link with banks and third parties.
Importantly, I assure the noble Baroness, Lady Sherlock, that we will not make any automated decisions off the back of this power; this was also raised by the noble Baroness, Lady Kidron. The final decision must and will always involve a human being—a human agent in these cases—and any signals of potential fraud or error will be looked at comprehensively. I am grateful for the remarks of my noble friend Lady Buscombe on this matter.
I know that I have not answered a number of questions. Perhaps I can do so in our debate on another group; otherwise, I certainly wish to answer them fully in a letter. I hope that I have explained clearly, from our perspective, why this power is so important; why it is the right power to take; and how we have carefully designed it, and continue to design it, with the key safeguards in mind. I strongly value the input from all those who have contributed today, but I remain unconvinced that the proposed amendments are necessary or that they would strengthen the measure beyond the clear safeguards I have set out. With that, I hope that the noble Baroness will not press her opposition to Clause 128.
I may have missed something, but can I just check that the Minister will deal with the matter of signals, which he mentioned at the beginning of his response? Will he deal with where that phrase comes from, what they are, whether they will be in the code, et cetera? There are a lot of questions around that. Does it amount to actual suspicion?
Absolutely; I am keen to make sure that I answer on that. It may be possible to do so in the next group but, if not, I will certainly do so in the form of a precise letter—added to the larger letter that I suspect is coming the noble Lord’s way.
A number of pensioner groups are watching these proceedings. I have received some messages. They are asking, “When is the Minister going to answer the questions asked about the operation of the surveillance of recipients of the state pension, especially those who have foreign accounts?” I assume that the Minister will clarify that in any subsequent letter to me.
Absolutely; the noble Lord will know that I have not managed to answer all the questions. I have tried to bring in everybody on this important and serious debate. The answers will be forthcoming.
I thank my noble friend very much for all the explanation that he has given thus far. I just want to add a word that has not been mentioned: deterrent. One of the reasons why the Government have sought to introduce this in the Bill, I believe, is that it is hugely important that we are much more thoughtful about what will stop people doing the wrong thing. It has become an old-fashioned word but, from a legal, practical and moral standpoint, does my noble friend agree that this is a practical deterrent to make sure that people do the right thing?
Is it not one of the dangers that this is a deterrent to people claiming these benefits?
I have a response to the question from the noble Lord, Lord Clement-Jones, about signals. The signal is where the criteria or rules for benefit eligibility appear not to be met, and Parliament will have agreed those rules.
My Lords, the Committee will be grateful to hear, I hope, that I will not try to capture such a rich conversation. I thank the Minister for his careful listening and consideration. I will read carefully what was said at the Dispatch Box and what is about to be said during our discussion on the next two groupings because, without seeing all that in the round, I cannot truthfully say whether the questions asked by noble Lords have been answered.
I share a little of the concern that I can see agitating the noble Lord, Lord Clement-Jones, about the words “signals”, “criteria” and “codes”, which are not promised in the Bill but are suddenly appearing. Indeed, the Minister will remember that, in a private meeting, we talked about how those criteria might be gamed and, therefore, how detailed they could possibly be. There may still be some differences of opinion, and possibly differences of practice, that need to be worked out.
Of course, for now, I will not press my opposition to Clause 128 standing part. I welcome further conversation between now and Report but, I have to say, I lost count of the number of times noble Lords have said “proportionate” in this debate and how many times the issues of scope, sweeping powers and so on were stated by some very expert people—both in and outside of this Room, not simply noble Lords.
The noble Baroness, Lady Buscombe, mentioned a pilot but I seem to remember that some of the outcomes on equality in that pilot got lost in translation. Perhaps it would be good to find out exactly what the pilot did and did not reveal—that is, not just the things that the department would like to reveal but some of the things that were not tested.
I do not doubt the personal integrity of the Minister in the slightest but I am unsure about the idea that the “test and learn” approach has no boundaries around it in the Bill. It is like saying, “Trust us. We test and learn, and all those powers exist”. With that, I will withdraw my stand part notice on Clause 128, but we have quite a lot of questions still to answer in our discussions on the next group of amendments and beyond.
My Lords, I will also speak to the other amendments in my name, which are designed to dig further into exactly what the Government plan to do with these powers. Amendments 220 to 222 are probing amendments which seek to establish what would happen if the powers to give account information notices were used only where there is suspicion that benefits are not being paid as the law intends. I will try to use this to find out exactly what will happen with the signal that the noble Lord, Lord Clement-Jones, has been referring to.
My Lords, I intervene very briefly. I thank my noble friend who, with her usual forensic clarity, identified some really important points. The last one in particular is very worrying. I have a question. It may be that I misheard what the Minister said in response to the last set of amendments. I thought I heard him say that child benefit would not be included, but it appears to have been on the list that was given to my noble friend. Of course, the point is partly that it is administered by HMRC, but it has replaced child tax allowances, so it should be treated in the same way as a tax allowance when it comes to this purpose—so I hope that I heard the Minister correctly and that child benefit will not be included.
My Lords, in relation to the excellent speech of the noble Baroness, she mentioned “personal” accounts. I would like to double-check that business accounts, charitable accounts and other accounts that have one’s name or one’s partner’s name on them, or are otherwise connected, are not drawn in ad infinitum.
Because of the way the amendments are grouped, I have the opportunity to repeat my questions. The first one is relatively straightforward. Does the Minister accept that introducing these provisions—obviously we are talking about Amendment 234 on pensions—will discourage people from claiming pension credit? Despite all the efforts of the Government to encourage people to claim pension credit, clearly this will discourage them. Have the Government made any effort to estimate what impact this will have? Obviously, it is a very difficult task, but have they thought about it, and does the Minister accept that it will have a deterrent effect?
My second question relates to the issue I have already raised. The state pension or its equivalent is paid by the state, by a pension fund or by a personal pension provider. Does the Minister think it odd that there is a difference in treatment? Everyone is receiving a pension, but for a person who receives theirs from a private pension scheme or personal pension provider there is not the same right to look at their bank accounts in relation to those benefits. I am not advocating that as a solution. The question is: does this not indicate the illogicality of the Government having powers over some people’s incomes that they do not have over other types of income? To me, particularly when it comes to the payment of a pension—a benefit paid as of right—this discontinuity points to the extent of the Government’s overreach.
My Lords, I must begin by joining the general applause for the characteristic tour de force from the noble Baroness, Lady Sherlock. I was having a flashback, because it was the noble Baroness who, in debate on what is now the Pension Schemes Act 2021, very kindly taught me how to cope with Committee stage a long time ago—and we are very used to that. I rise briefly to address this group, but I start by saying in relation to the last group that I entirely agree with the proposition that Clause 128 should not stand part: the spying clause should not be part of the Bill.
I have a couple of points to make on the amendments in this group, one of which was raised by the noble Lord, Lord Clement-Jones, on the last group and is about protecting the Government from themselves. The amendments put down by the noble Baroness, Lady Sherlock, are probing. However, if we were to restrict the Government’s use of these powers, the powers might end up at something approaching a manageable scale. It is worth raising that point when we look at these groups.
My Lords, I was not intending to speak on this group, but another question occurs to me. We have been assuming throughout this that we are talking about requests for information to banks, but the Bill actually says:
“The Secretary of State may give an account information notice to a person of a prescribed description”.
Could the Minister explain what that is?
My Lords, I would of course much prefer Clause 128 not to stand part, but we were just privileged by a master class from the noble Baroness, Lady Sherlock. She talked about these being probing amendments, but I do not think that I have seen a schedule so expertly sliced and diced before. If those are probing, they are pretty lethal. I agree with so many of those elements. If we are to have these provisions, those are the kinds of additions that we would want and the questions that we would want to ask about them. I very much hope that the Minister has lots of answers, especially for the noble Baroness, Lady Sherlock, but also for the other noble Lords who have spoken.
My Lords, the debate on this group has focused largely on the amendments from the noble Baroness, Lady Sherlock, regarding using powers only where there is a suspicion of fraud, making provisions so that information collected can be used only for the narrow purpose of determining overpayment, removing pension-age benefits from the scope of the powers and requiring approval from Parliament before the power can be used on specific working-age benefits.
I was going to go over the reason behind these measures once again, but I will not delay the Committee on why we are bringing them forward. I believe I did that at some length in the previous group, so I am going to turn to the amendments raised.
Narrowing these powers as suggested by the noble Baroness, through Amendments 220, 221, 222 and 222A, would leave us exposed to those who are deliberately aiming to defraud the welfare system and would undermine the policy intent of this measure. In fact, taken together, these amendments would render the power unworkable and ineffective.
To restrict the power to cases where DWP already has a suspicion of fraud, as suggested by the noble Baroness, would defeat the purpose of this measure. The intent is to enable us to use data from third parties to independently check that benefit eligibility rules are being complied with. We use data from other sources to do this already. For example, we use data from HMRC to verify earnings in UC and check that the benefit eligibility rules are being complied with. Parliament has determined that, to be eligible for a benefit, certain rules and requirements must be met, and the Government have a responsibility to ensure that taxpayers’ money is spent responsibly. Therefore, the DWP should be able to utilise information from third parties to discharge that duty. This is an appropriate and proportionate response to a significant fraud and error challenge.
The noble Baroness, Lady Sherlock, also proposed that the power should be restricted such that it would not apply to persons who hold an account into which a benefit is paid on behalf of someone who cannot manage their own financial affairs—such persons are referred to as “appointees”. An appointee is a person who may be appointed by the Secretary of State to act on behalf of the benefit customer. Usually, the appointee becomes legally responsible for acting on the customer’s behalf in all matters related to the claim. It is also made clear to the appointee, in the documents that they sign, that we may get information about them or the person they are acting for from other parties, or for any other purposes that the law allows, to check the information they provide.
Under our proposed legislation, it is right to say that there may be some people who are not themselves benefit claimants but who have given a person permission to pay benefits into their bank account, who may be picked up in the data returned by third parties. Under the noble Baroness’s amendment, we would not be able to gather data on appointees, which would make the power unworkable, because third parties would not be able to distinguish between an individual managing their own benefit and an appointee. It also assumes that no fraud or error can occur in these cases, which is definitely wrong. I assure the noble Baroness that we handle such cases regularly and have robust existing processes for identifying appointees on our own database and for carefully handling cases of this nature.
The noble Baroness would also like to see the power—
Rather than asking all my questions at the end—I have only four—I will try to get answers as we go. On appointees, I think that the Minister has just said that the reason the Government need these powers is that some appointees will have the benefit money paid into their own account, not into a separate second account, so the power needs to cover that case. I am very happy to reword this amendment to make that clear. I was talking specifically about the linking arrangements; the amendment does not talk about excluding appointee accounts. It specifically says that accounts that are merely linked to an account into which the benefit is paid are not in scope. I am happy to reframe that in a way that defines it—I am sure we can find a way around this—but does the Minister accept the principle behind it: that, if there is a separate account that, say, I hold for a child, this should not give a reason to look into my own accounts? Or is he saying that the Government want to look into my own accounts, or business accounts, or family accounts as well? Which is it?
The Government do wish to have that power. I should make it clear that an appointee could be a claimant as well, so there is a dual issue. It is important that we retain that power, to be sure that we cover the whole ground. But I will reflect on the noble Baroness’s point.
There were a number of questions on the other group that related specifically to people’s willingness to take these roles on and what the unintended consequence of putting appointees and carers in this position might be for the DWP, with people saying, “Actually, not me, then”.
The noble Baroness makes a very good point. I may be able to give her further reassurances in a letter because, on the one hand, we do want the power to be able to cover the ground. On the other hand, there are necessary protections that we must put in place. So further reassurances probably need to be given. There is that balance to be struck, but I hope I can continue to do that.
If I may pursue this, I am not sure I heard the Minister’s answer to the question of the noble Baroness, Lady Kidron—or maybe I did. If it was a charitable bank account, a business account or anything else, I think the Minister said that it would be subject to that scrutiny as well. Once someone acts for a carer, all of their bank accounts could be scrutinised—surely that is ridiculously unfair.
I am not sure I agree with that. I hope I can reassure the noble Baroness, as I tried to on the previous group. Using our test and learn process, which is already under way working closely with the banks, bringing them along with us and them bringing us along with them—there is a good relationship there—we are working through these important matters.
The point made by the noble Baroness, Lady Kidron, is important, as is that of the noble Baroness, Lady Jones. Again, it is important to give those reassurances. They will be forthcoming, and that is all part of our test and learn process, which I hope provides some reassurance.
I want to be absolutely clear on this point, because I am still not totally sure I am—I raised this the first time around on the last group. If I, as a landlord, have been paid rent as housing benefit directly, my accounts are caught. If I am a trustee of a charity and a cosignatory on a bank account, is the Minister saying that that charity’s account will be caught or not? I want to be absolutely crystal clear on that.
This is part of the filtering discussions that are already taking place.
Under the terms of the Bill, would this allow that to be caught?
Yes it would. Landlords are in scope. We will filter this through in terms of the business as usual. If we receive any information—
Given that, has the department done an assessment of the likely impact on landlords being willing to take people on housing benefit? It is already an issue that landlords are reluctant to take housing benefit recipients, but, with this, I could see the market completely freezing for people on benefit.
I clearly cannot go far enough today, but, because this is important and we are in Committee, I need to give some further reassurances on where we are in the process in terms of filtering. If I may conclude my remarks, I will finish this particular point. This is all part of the test and learn, and I give some reassurance that we are working through these important issues in relation to appointees and landlords.
It is precisely as the noble Baroness, Lady Kidron, said on the last group—this is a massive net. It feels as though this is so experimental that there is no certainty about how it will operate, and the powers are so broad that anything could be subject to it. It sounds extremely dangerous, and it is no wonder that everybody is so concerned.
I do not agree with that. We have done quite a lot of business together across the Chamber, and that is a slightly sweeping statement, because I have given some reassurance that we are already working with the third parties to make sure that we have robust processes in place. For instance, when we are talking about landlords, while it is possible that a landlord’s account may be matched under the measure, only minimal information will be provided by the third parties—enough to enable my department to identify an individual within our own database. With all the data received, we will make further inquiries only where appropriate and where the information is relevant to the benefit claim. This is already part of our business-as-usual processes.
My Lords, I am sorry to interrupt the Minister but, throughout these two groups, he has, in a sense, introduced wholly new concepts. We have “test and learn”, “filtering”—which sounds extraordinary—and “signals” but none seem to be in the black letter of the schedule, nor in the rest of the Bill. We have a set of intentions and we are meant to trust what the DWP is doing with these powers. Does the Minister not recognise that the Committee is clearly concerned about this? It needs tying down, whether we need to start from scratch and get rid of the clause or take on board the amendments put forward by the noble Baroness, Lady Sherlock. The uncertainty around this is massive.
My Lords, I ask the Minister for clarification. The noble Baroness, Lady Sherlock, asked about the number of individuals; I guess it may be 24 million or 25 million. However, from what the Minister has said, the number of bank accounts subject to surveillance would be far greater than that. For example, I receive a state pension and am also a trustee of a small not-for-profit organisation; from what the Minister said, I would be caught, as would that organisation. Landlords and many others could possibly be added. It seems that the number of bank accounts would be far greater than the number of individuals. When he provides the data, can the Minister estimate how many bank accounts and transactions there might be?
I will add to that the issue of overseas bank accounts. I cannot see how the British Government can apply this measure to them. Will this not push people to go to overseas bank accounts? Or will the Government try to pursue them through challenger banks—including multiple accounts from one person who may have one original, normal current account here?
How many “signals” already exist in the current backlog under the business-as-usual approach? What kind of investment will it take when you supercharge these powers and get many more tens of thousands of signals?
I will add to the Minister’s grief. He has talked a number of times about the limited information that will be provided to the DWP, but that is not what the Bill says. The Bill refers to
“such further information in connection with those accounts as may be specified”.
There is no limitation in the Bill on the information that the DWP can request from the bank—assuming that it is a bank, after my previous question. I am struggling to understand how we get from that to “limited”.
Right. A number of questions have been asked. I am not sure that I can give too much more clarity—only that I will go back to what I said on the first group in terms of the limited nature of what we are trying to do. I was very clear about its limited nature, I think.
This leads on to the numbers that noble Lords are asking me about. Of course, I cannot give that figure, as we do not honestly know it. Until we move forward on bringing the measure in, we will not know it. What is certain is that we need this power to be able to gain the limited data that we need. When we receive the data, it may be the case that we need to follow up. I am sure that we will not need to follow up in the vast majority of cases but we must have this power.
To the noble Lord, Lord Vaux, I say this: this measure is for UK accounts only. I hope that that is also helpful to the noble Baroness, Lady Bennett.
This is the problem. We have been talking about limited information, a limited nature and the limited things that we will look at, but that is not what the Bill says. We need to think seriously about how we should limit the rights in the Bill to match the requirements of the DWP. At the moment, there seems to be a huge gap.
That point is very much noted. I will certainly take it back. Clearly, we need to provide greater reassurance on the limits and scope, as well as on what we are trying to do. I regret that I am not able to give those answers in full to the Committee now but I hope that, today, I have already taken us further forward than we were before we started. That is quite an important point to make.
I shall touch on the benefits that are in scope of this measure, a point that was raised by the noble Baroness, Lady Sherlock. I think the noble Baroness wishes to restrict the power to working-age benefits, but pension-age benefits are not immune to fraud and error—I wanted to address that—and it is our duty to ensure that these benefits are paid correctly and in line with the benefit eligibility rules that Parliament has previously agreed. Every payment that the DWP makes has eligibility criteria attached to it. Parliament has considered these criteria in the passage of the relevant social security legislation, and the Government have a responsibility to check that payments are being made in line with those rules so that taxpayers’ money is spent responsibly.
Pension benefits other than pension credit have eligibility criteria attached, but I do not know of any eligibility criteria applying to pensions that you could discover from someone’s bank account.
The example that the noble Lord will be aware of links to what the noble Lord, Lord Sikka, was saying about some pensioners who have moved abroad but, for whatever reason, have not told us that they have done so and continue to receive the uprating. The figure for the fraud aspect—or it could be error—linked to state pensions is £100 million.
Presumably the DWP already knows the address of the bank account to which an overseas pension is being paid. Why does it need to know any more?
My understanding is that it needs to have these powers to be able to cover the ground properly. I say again that these powers are limited, and whatever comes from the data that is requested from the third parties will end up being, we hope, limited. Even then, it may not be used by us because there is no need to do so.
The power covers all relevant benefits, grants and other payments set out in paragraph 16 of new Schedule 3B to the Social Security Administration Act 1992, as inserted by Schedule 11 to the Bill. To remove pension-age payments from the scope of the power would significantly undermine our power to tackle fraud and error where it occurs. Pension-age payments are not immune to fraud and error, as I have mentioned. I will give an example of that. The noble Baroness, Lady Sherlock, asked whether people would be notified of their bank accounts being accessed.
Before the Minister moves on, I asked specifically about child benefit. Could he please answer that?
I know that I said earlier that child benefit was not included. I will clarify that child benefit is not a benefit for which the DWP is responsible or in relation to which it has any function. This measure will be exercised by the DWP Secretary of State, and we cannot use this power for that benefit.
I was in the middle of answering a question from the noble Baroness, Lady Sherlock.
I will finish this answer, if I may. The DWP personal information charter lists banks and financial institutions, and other parties, among the parties with which DWP may share data and from which we may receive data. It also lists checking accuracy and preventing and detecting fraud among the purposes for which we may share or receive information.
A claimant will not be notified if their account details have been returned to DWP by a third party as that could alert fraudsters to the criteria, enabling them to evade detection—I think that is a valid point—but they will be notified if a DWP agent determines that a review is required as a result of the information provided by the third party. That notification will be done through the business-as-usual processes.
I move on to defining working-age payments in legislation, which relates to the final amendment in this group: Amendment 235, tabled by the noble Baroness, Lady Sherlock. It would require the Government to specify in regulations the working-age benefits with which this power could be used. As she demonstrated, there is a wide range of benefits and therefore potential avenues for fraudsters to seek to exploit or for error to creep in. That is why it is important that the power enables the department to respond proactively as new fraud risks emerge.
That said, as the noble Baroness knows, the power will not be exercisable in relation to all the benefits she listed—I took note of her long list—such as child benefit, which we have just mentioned, because the legislation is drafted in such a way that it could reasonably be exercised only in relation to benefits for which the Secretary of State is responsible. I reassure the noble Baroness, Lady Sherlock, and the Committee that, in the first instance, we plan to use this power with universal credit, employment and support allowance (ESA), pension credit and housing benefit. That is the way forward.
There may be a number of questions that I have not addressed, but I hope that I have continued to make the case for why this measure is so important to our aim of tackling fraud and error. I continue to make the case that it is proportionate and that appropriate safeguards are in place. With that, I hope the noble Baroness will agree to withdraw her amendment.
Will people with power of attorney over the account of someone who receives a benefit also be caught up in all this? That is another vulnerable group, so this could be extensive and quite worrying. Secondly, I am concerned by the Minister’s answers on this group. They have made me feel somewhat more strongly than I did when giving my response on the previous group, so I feel I should put that on the record.
That is understood. I know that I need to provide further reassurances. Attorneys are included for the reasons that I set out for appointees.
My Lords, I thank the Minister for taking the time to try to answer the questions. I know that we have given him a hard time, but I thank him for responding so graciously.
He did not take the opportunity to explain the process simply to the Committee. It may be that it is too difficult to explain simply, or it may be that he can explain what the Government intend to do but the powers allow them to do something much wider than that. It would be helpful if he could reflect, before he writes, on how best to frame this. I think I heard him trying to say to the Committee that people think that more information is being handed over than will in fact be handed over. If that is the case, it would be helpful if he could spell that out, because that would at least begin to help people understand better what is going on.
Secondly, in responding to me, the Minister focused, understandably, on the content of the amendments. I was trying to explain that the reason they are probing is that it is quite hard to get a handle on this. It is a big, sprawling thing, and I am trying to find a way of nailing some jelly to the table; I am trying to find ways of containing it. I still do not know which benefits the Government can use the powers over and which ones they intend to. It is a great step forward to know where they are going to start; that is really helpful. I am also grateful for the clarity, whether people are happy about it or not, that the Government intend to use the powers on the state pension, because that was not the impression given in the House of Commons when the matter was debated there. That is a helpful piece of clarity for the Committee and the wider community.
I know this is hard; fraud is difficult. A case was mentioned where an organised fraud gang stole more than £50 million in social security benefits. I know it is hard, and I know it is hard for the DWP to understand precisely where these things will lead when you begin to go there. I understand that if it is too boxed in, it makes it difficult to be able to follow where the fraudsters go, who are often one step ahead of the Government. I get all of that, but there is a risk that when it has spread so widely, the level of concern gets to the point that it will not be as publicly acceptable as the Minister thinks it is. I ask him to take the opportunity, when he goes back to the department, to talk to colleagues and think about what kind of assurances the Government could try to find a way of giving to people, either staging processes or government oversight. I ask him to think about that because the kinds of concerns he has heard here will only increase as the powers start to unfold.
In the next group of amendments, which I think will now be discussed on Wednesday, I want to dig further into the question of who the data and account notice can be given to and what criteria will be used. That will be another chance to flush out some things, so I give notice now that I would like the Minister to look into those areas next. I am grateful for his efforts and to all Members of the Committee who have explored this matter. I beg leave to withdraw my amendment.
(7 months, 1 week ago)
Grand Committee
My Lords, in moving Amendment 225, I will speak to the other amendments in this group. They cover two issues: first, the code of practice, which features in Part 2 of new Schedule 3B, inserted by the Bill into the Social Security Administration Act 1992. Paragraph 6(1) of new Schedule 3B says:
“The Secretary of State may issue a code of practice in connection with account information notices”.
Amendment 225 would change “may” to “must”. Paragraph 6(2) mentions some matters that a code “may” include and Amendment 226 would change that “may” to “must”.
Amendment 227 would ensure that a code of practice includes the criteria to be used by the Secretary of State in determining whether to issue account information notices—I will come back to criteria shortly. Amendment 230 would require the Government to consult on the draft code of practice with consultees including the Social Security Advisory Committee and organisations that would have to comply with account information notices. Amendment 231 would require the code of practice and any revisions to it to be approved by both Houses of Parliament. The Secretary of State would still be able to withdraw a code of practice, but the ability to issue notices would lapse if no code were in force. Amendments 228, 229 and 232 are consequential.
The other matter covered in this group is how the Government report to Parliament on these notices. Amendment 233 amends new Schedule 3B to provide for annual reporting to Parliament on the use of account information notices. As well as requiring the provision of statistics around the use of such notices during the previous financial year, the amendment would compel the Secretary of State to outline his or her views on the proportionality and effectiveness of notices. I hope that the need for these amendments is self-evident. Ministers are proposing to take new powers of astonishing breadth, which will involve the ability to search the bank accounts of tens of millions of our citizens, most of whom will have done nothing wrong. There is still very little detail about how these powers could be, or will be, used.
I will address two particular sets of issues. The first is criteria. Paragraph 2 of new Schedule 3B explains that banks have to return information about matching accounts: accounts that, as well as being held by the specified account holders, meet certain risk criteria. The Bill, the Explanatory Memorandum and briefings always talk in terms of examples of those criteria, usually around capital limits or time abroad. But my understanding, which may be wrong—I invite the Minister to correct me if I am—is that the criteria could be anything related to eligibility for the benefits in question.
For example, the eligibility for some benefits includes being a single parent. Paragraph 2(2)(a) of new Schedule 3B says that an account information notice
“may require information relating to a person who holds a matching account even if the person does not claim a relevant benefit”.
On our last day in Committee, we established that that directly related to appointees, but that made me wonder whether it could apply to anybody else. For example, we also established that a notice could cover a joint account where one of the holders is the person to whom the benefit is paid and the other is not. Would this power allow DWP to ask banks to search for any accounts linked to any single parent and to examine those accounts for evidence that they and the other holder of a joint account might be living together? Would these powers allow DWP to devise any criteria designed to identify whether a claimant was living with another adult? To be clear, I am not asking whether it intends to do that or whether it knows how to do that. I am just asking whether it would be permissible. Is this a category of thing that it could do under the powers in the Bill?
Related to that, could DWP issue notices to a bank other than that into which the benefit is paid? Again, we have heard that the intention is to go only to the bank into which the benefit is paid, but I want to know specifically: does this Bill give DWP the power to do that, or would it need additional primary legislation to do it?
Secondly, the Bill does not say that notices can be given only to banks. It says that they can be given only to a “person of a prescribed description”. The Information Commissioner said:
“I have been unable to identify where such persons are prescribed and the provision itself is silent on the matter”.
It is therefore unclear which organisations will be in scope of the power or how this will be determined. Can the Minister tell us any more about who will be covered and how that will be determined? Who could be subject to a notice? A bank or a building society could be, clearly, but could a credit union, a Christmas club savings scheme or any other financial body?
Paragraph 58 of the impact assessment on this part of the Bill says:
“This measure is drafted broadly to ensure it is future-proofed against future changes and innovation, particularly in the financial services sector, i.e. in Fintech and Crypto, and enable DWP to apply this measure to non-financial organisations in future if it is deemed appropriate and proportionate”.
Can the Minister give the Committee an example of a non-financial organisation that could be appropriate? Specifically, could this apply to, for example, phone companies? Given the open-ended nature of the powers being taken, one way for Ministers to give reassurance to both the Committee and the wider public would be to ensure that DWP is constrained by a clear and transparent code of practice over which Parliament has oversight and that it reports to Parliament on the way it is using these powers. If the Minister does not like the approach in this amendment, perhaps he could offer the Committee other forms of assurance in this area. I beg to move.
My Lords, I apologise to the Committee that duties elsewhere in the House prevented me from attending the last two debates on Monday and so from speaking to the amendments that I had tabled and signed. However, I have read the Official Report with care.
I cannot pretend to be a data protection nerd, or even a social security nerd, like some speakers in those debates, but I hope that I pass muster as a surveillance nerd, having written for the Home Secretary two of the reports that informed the Investigatory Powers Act 2016 and, more recently, a report that informed the Investigatory Powers (Amendment) Bill, which I see is to be given Royal Assent tomorrow.
I support all the amendments in the name of the noble Baroness, Lady Sherlock, in this group. Of course there must be a code of practice. Of course it must be consulted on and scrutinised. I would add that of course we could not contemplate passing this schedule into law until we have seen and studied it. An annual report of the sort that accompanies the reasonable suspicion power to issue financial institution notices, exercised by HMRC under Schedule 36 to the Finance Act 2008, would also be useful. For example, it is from the last of those reports, dated January 2024, that I learned that these reasonable suspicion tax information powers were now being used to obtain location data—something that it had previously been said would not be done.
Dan Squires, one of the authors of the legal opinion that I know was referred to on Monday, is not only a King’s Counsel but a deputy High Court judge and a genuine expert in this area. He and his junior, Aidan Wills, point in that opinion to the personal nature of some of the data that could be harvested under the proposed power and advise that Schedule 11 does not come close to the safeguards required for compliance with Article 8. They refer in particular to the striking lack of clarity about the grounds on which and the circumstances in which the proposed power can be used, as well as to the absence of both independent authorisation and independent oversight. They point out that, although saving up to £600 million over five years is a very important objective, it weighs no more heavily—indeed, probably less heavily—than the normal justifications for obtaining information in bulk: protecting national security and the prevention and detection of serious crime. Their opinion is well referenced, persuasive and consistent with the view on proportionality expressed by both the Information Commissioner and the Constitution Committee, on which I sit.
On Monday, the Minister referred to the power in Schedule 23 to the Finance Act 2011 to obtain certain data items from particular classes of data holder—for example, employers and land agents. So I had a look at that schedule and the data-gathering regulations under its paragraph 1. The power would appear to apply only to certain tightly defined items, such as payments made by the employer or arising from use of land. There would appear to be a noticeable contrast with location data, personal spending habits and so on, which fall within the scope of the powers in this schedule, as they are written in the Bill. Both HMRC and the Home Office operate under powers tightly defined in legislation. Assurances that those powers will be used in a restrained way, as Justice has commented in its useful briefing on the Bill, simply do not cut it. I am afraid that the law requires the DWP to be subject to the same constraints.
I am concerned: concerned that this important new power was not subject to detailed consultation or even to scrutiny by a Commons Bill Committee, where useful evidence could have been heard; concerned that it could even have been contemplated that so vague a power might be in the Bill and not accompanied by a code of practice; concerned about the absence of an independent approval and oversight mechanism, equivalent to the Office for Communications Data Authorisations and the Investigatory Powers Commissioner’s Office; and concerned that, if we do not get this potentially valuable power right from the start, it will immediately be subject to legal challenges, which will swiftly render it unusable.
If, as I believe, Schedule 11 is currently unfit for purpose, is there time to rescue it? I have a couple of practical suggestions. First, I saw the investigatory powers unit from the Home Office when it happened to be in the House yesterday, and I wondered if there might be utility in it comparing notes with the Bill team about these types of powers and their attendant safeguards.
Secondly, I hope the Government appreciate the significance—at least to us nerds in the Committee—of the legal analysis of Dan Squires KC and Aidan Wills. If we are to be told that it is mistaken, which would certainly be unusual, I for one would like to see that backed up by an opinion from a lawyer of equivalent stature, whether at the GLD or independent counsel, explaining precisely and persuasively why Mr Squires and Mr Wills are wrong. Otherwise, and without significant change of the type identified in the opinion, I am afraid I am not inclined to give this schedule the benefit of the doubt.
I signed up to the stand part notice of the noble Baroness, Lady Kidron, thinking it would at least be a platform to think about what amendments to the schedule might be needed. The more I read the schedule and the more I hear about it, the more I am driven to the conclusion that, if we do not see substantial change, opposing the schedule may be the way that we have to go at the next stage.
In the two previous groups, I raised pension credit, and it is notable that the noble Viscount the Minister has not responded on that point, so my automatic assumption is that he believes that the implementation of these powers will deter people from seeking pension credit, which is contrary to the Government’s declared policy of encouraging people to claim it. I mention that in passing, given this opportunity.
My other moan is about the impact assessment, or rather the lack of a meaningful one. I do not like the impact assessment that we have. It is a totally impenetrable and meaningless document, which is clearly there just as a matter of form rather than as a serious attempt to inform participants in these debates about what is in the Bill and what impact it will have on people and organisations.
My specific points are broadly in line with the points raised by UK Finance, the umbrella organisation for financial firms, including banks and insurance companies, which continues to have serious concerns about these provisions. I think we should listen carefully to what it says. In particular, if we are going to have these powers then, in line with the amendments tabled by my noble friend Lady Sherlock, we have to make sure that they are introduced in an effective way that appreciates the vulnerabilities of customers.
My Lords, it has been a privilege to be at the ringside during these three groups. I think the noble Baroness, Lady Sherlock, is well ahead on points and that, when we last left the Minister, he was on the ropes, so I hope that, to avoid the knock-out, he comes up with some pretty good responses today, especially as we have been lucky enough to have the pleasure of reading Hansard between the second and third groups. I think the best phrase the noble Baroness had was the “astonishing breadth” of Clause 128 and Schedule 11, which we explored with horror last time. I very much support what she says.
The current provisions seem to make the code non-mandatory, and we discovered that the powers come without “reasonable suspicion”, the words that appear in the national security legislation—fancy having the Home Office as our model in these circumstances. Does that not put the DWP to shame? If we have to base best practice on the Home Office, we are in deep trouble.
That aside, we talked about “filtering” and “signals” last time. The Minister used those words twice, I think, and we learned about “test and learn”. Will all that be included in the code?
All this points to the fragility and breadth of this schedule. It has been dreamt up in an extraordinarily expansive way without considering all the points that the noble Lord, Lord Anderson, has mentioned, including the KC’s opinion, all of which point to the fact that this schedule is going to infringe Article 8 of the European Convention on Human Rights. I hope the Minister comes up with some pretty good arguments.
My final question relates to the impact assessment—or non-impact assessment. The Minister talked about the estimate of DWP fraud, which is £6.4 billion. What does the DWP estimate it will be after these powers are implemented, if they are ever implemented? Should we not have an idea of the DWP’s ambitions in this respect?
My Lords, this has been a somewhat shorter debate than we have been used to, bearing in mind Monday’s experience. As with the first two groups debated then, many contributions have been made today and I will of course aim to answer as many questions as I can. I should say that, on this group, the Committee is primarily focusing on the amendments brought forward by the noble Baroness, Lady Sherlock, and I will certainly do my very best to answer her questions.
From the debate that we have had on this measure, I believe that there is agreement in the Committee that we must do more to clamp down on benefit fraud. That is surely something on which we can agree. In 2022-23, £8.3 billion was overpaid due to fraud and error in the benefit system. We must tackle fraud and error and ensure that benefits are paid to those genuinely entitled to the help. These powers are key to ensuring that we can do this.
I will start by answering a question raised by the noble Lord, Lord Anderson—I welcome him to the Committee for the first time today. He described himself as a “surveillance nerd”, but perhaps I can entreat him to rename himself a “data-gathering nerd”. As I said on Monday, this is not a surveillance power, and suggesting that it is one simply causes unnecessary worry. This is a power that enables better data gathering; it is not a surveillance or investigation power.
The third-party data measure does not allow the DWP to see how claimants spend their money, nor does it give the DWP access to millions of people’s bank accounts, as has been inaccurately presented. When the DWP examines the data that it receives from third parties, this data may suggest that there is fraud or error and require a further review. This will be done through our normal, regular, business-as-usual processes to determine whether incorrect payments are indeed being made. This approach is not new. As alluded to in this debate, through the Finance Act 2011, Parliament has already determined that this type of power is proportionate and appropriate, as HMRC already has similar powers regarding banking institutions and third parties in relation to all taxpayers.
I listened very carefully to the noble Lord and will, however, take back his points and refer again to our own legal team. I think the point was made about the legality of all this. It is a very important point that he has made with all his experience, and I will take it back and reflect on it.
I take the Minister’s point and I will settle for the appellation “investigatory powers nerd”; I am quite happy with that. Does the Minister agree with me, however, that the legal difficulty—we see this with the other bulk powers already in our law—is that Article 8 of the European convention is engaged not when a human eye gets stuck into the detail, but as soon as a machine harvests the data in bulk? Most of that data relates to people in respect of whom there could be no possible suspicion. The requirements of necessity and proportionality must be satisfied even at that stage. I understand that that is awkward and I am sure a lot of people would prefer that it were otherwise, but that is, as I understand it, the law. That renders the distinction that the Minister seeks to draw between data gathering and surveillance perhaps slightly difficult to maintain.
If I may, I will just answer that question from the noble Lord, Lord Anderson; I think it is important to take one question at a time.
I have every sympathy with what the noble Lord has said. As I mentioned on Monday, points could easily be raised about that—I think it may have been the noble Baroness, Lady Kidron, who raised concerns about computers and their robustness. This is the very point that we agree with. It is incredibly important, and we have already started to draw up a proper code of practice, working with the banks on how this will actually work. We need continued time to work these issues through. I also made the point on Monday that, at the end of the day, a human being will be there—must be there—to determine where we go from there.
In relation to the code of practice, which I am glad the Minister mentioned, we have just seen the Investigatory Powers (Amendment) Bill through this place. It makes some relatively minor changes to the powers of the intelligence agencies to harvest data in bulk and, to ensure the orderly passage of that Bill through both Houses of Parliament, the key excerpts of the draft code of practice were made available before Committee in either House to enable it to be properly scrutinised. We seem to have left it terribly late in the day still to be talking about a draft code of practice on this Bill, which we have not even seen. Can the Minister assure us that before we come to Report, that code of practice will be available in draft?
Indeed, I was going to come on to that later in my remarks, particularly to address the points raised by the noble Baroness, Lady Sherlock. We need the necessary time to continue to develop this code of practice, and that is particularly important in respect of this measure. The answer is no, I cannot guarantee to have the code of practice ready by Report. Indeed, I am saying that it will be ready sometime in the summer. It is important to make that point but also a further one, which is that there are many instances, as the noble Lord will know, when a code of practice is finalised and brought forward after the primary legislation is brought through, and this is one of those cases. That is not abnormal but normal. The noble Lord may not like it but there is considerable precedent for that to happen.
I have a question. I am slightly puzzled about the difference between data collection and surveillance. Surely the collection and gathering of data would be to enable officials to survey someone’s bank account. If that is not the case, what is the purpose of collecting the data if not to interrogate the behaviour of an individual to understand how their money is being brought in and spent, so that the department can exercise some judgment over whether the individual is revealing the truth about their income and outgoings?
Indeed, I think we are going back to the debates that we had on Monday. However, this chimes with a question from the noble Lord, Lord Clement-Jones, so it might be helpful briefly to rehearse what we are doing here and to be clear about the limitations and the checks and balances on the power that we are bringing forward.
As per paragraph 1(2) of Schedule 11 to the draft legislation, the DWP can use this power only for the purposes of checking whether someone is eligible for the benefit that they are receiving. In practice, this means that the DWP will request information only on specific criteria, which I laid out on Monday, linked to benefit eligibility rules, which, if met, may—I emphasise “may”—indicate fraud or error. If accounts do not match these criteria, no data will be shared with the DWP. The effect of paragraphs 1 and 2 of the draft legislation is that the DWP can ask for data only where there is this three-way relationship between the DWP, the third party and the recipient of the payment. In addition, the DWP can ask for data only from third parties designated in secondary legislation, subject to the affirmative procedure. There are debates to come on those regulations, as further reassurance to your Lordships.
As per paragraph 4(2) of Schedule 11 to the draft legislation, the power does not allow the DWP to share personal information with third parties, which means that the power can be used only with third parties who are able to identify benefit recipients independently. Just to add further to this, we are obliged, under Article 5(1)(c) of the UK GDPR, to ask only for the minimum of information to serve our purposes. In accordance with the DWP’s existing commitments on the use of automation, no automatic benefit decisions will be taken based on any information supplied by third parties to the DWP. As I said earlier and on Monday, a human will always be involved in decision-making. I hope that helps.
I am sorry to interrupt the noble Viscount, but I just want to be clear about what he is saying in relation to the code of practice, which obviously is at the heart of this section of the debate, although there will be other things to come. Am I right that he said—obviously he has to cover himself—that there is a chance that the Report stage of this Bill might be entered into before we have sight of the draft code of practice? He makes the point that that is not an unusual occasion. I understand that—we have both served in Parliament long enough to know that that is the case—but this is clearly an issue on which the Committee has made very strong representations to the Government. Will he do what is in his power to make sure that we do not enter Report without seeing at least an early draft, if that is possible, of the code of practice?
I will certainly take that back. I do not want to make any commitments today. I have already set out our stall as to where we are. I make the further point—I am perhaps repeating myself—that given the sensitivities that there clearly are, which I have been listening to carefully, it is important that this code of practice is developed at a pace that is right for what is needed, in bringing those involved along and making sure that it is right, secure, safe and with all the safeguards involved. It is quite a serious piece of work, as noble Lords would expect me to say. I will take that back. I will certainly not be able to guarantee to produce anything before Report, which may disappoint the noble Lord, but at least I have gone as far as I can. I hope that that is helpful.
I am grateful to the noble Viscount. This is just a thought, but we are happy to help, as we often have done in the past on other Bills. If there is any opportunity for us to be shown early drafts, to give some help and assurance to the noble Viscount that he is on the right track, I am sure that that would be accepted.
I appreciate the tone of the noble Lord’s offer and, if anything helpful comes from behind me before I conclude my remarks, I will certainly do that.
Our debates on this measure have covered many issues. This group, as mentioned earlier, focuses primarily on the operational delivery of the power, so it would be quite good to move on. Just before I do, for the benefit of the noble Lord, Lord Anderson, in terms of the late introduction—his words—of this measure, as mentioned on Monday the DWP published a fraud plan in May 2022, where it outlined a number of new powers that it would seek to secure when parliamentary time allowed. In the parliamentary time available, DWP has prioritised our key third-party data-gathering measure, which will help it to tackle one of the largest causes of fraud and error in the welfare system. That is a short version of what I said on Monday, but I hope that it might be helpful.
Before I turn to the amendments, it might be helpful to set out how the legislation will frame the delivery of this measure. When we issue a request for data to a third party, or an account information notice or AIN as it is set out in the Bill, we can ask it to provide data only where it may help the DWP to establish whether benefits have been properly paid in accordance with the rules relating to those benefits. As mentioned earlier, this is defined clearly at paragraph 1(2) of the new schedule. This is where the data that DWP receives may signal—to use the word raised by the noble Lord, Lord Clement-Jones—potential fraud and error. The noble Lord asked for further clarification on that point. To be clear, a signal of fraud and error is where the rules of benefit eligibility appear not to be met. For example, this might be where a claimant has more capital than the benefit rules allow. As I made clear on Monday, all benefits and payments have rules that determine eligibility, which Parliament has agreed are the right rules in its consideration of other social security legislation. To issue an AIN, we must also have designated a third party in affirmative regulations, which need to be passed by both Houses.
As has been covered, we can also request data from third parties only where there is this relationship, which I will not repeat again and which I think the Committee will be familiar with. Our intention is to designate banks and financial institutions as the first third parties that we can approach, enabling us to request information only on accounts held in the UK. Just to clarify that point, we will not be able to request information on overseas accounts.
On the question raised by the noble Baroness, Lady Sherlock, on examples of non-financial organisations that the power could appropriately be used on, we will bring forward regulations to specify the data holders in scope. I hope that this is helpful. In the first instance, this will be, as mentioned, banks and financial institutions. The power also has potential use cases with other third parties, such as housing or childcare providers, but, just to reassure the Committee, this would be subject to further parliamentary approval.
I am grateful to the Minister—I am just trying to catch up. On the point that he made about regulations, I imagine that the power to prescribe the descriptions of persons to whom an account information notice may be sent comes under paragraph 1(1) of the schedule. I think that that is what he was saying. In paragraph 2, on the content of the account information notices, there is a reference to
“other specified information relating to the holders of those accounts, and … such further information in connection with those accounts as may be specified”.
Does that simply mean anything specified in the account information notice or is there a power to make regulations that will limit the types of information that can be specified in an AIN?
Again, I hope that I might have covered this earlier. If I read the noble Lord’s question correctly, the definitions will need to be debated by both Houses. I have made clear what we are bringing in at the moment for banks and financial institutions, but this will need to be looked at by both Houses in future. I hope that that is clear.
I apologise; I did not make myself clear. I think that we are on entirely the same wavelength on the persons to whom an information notice can be given; the Minister has reassured us that they will be specified in regulations and considered by both Houses. My question relates to the content of an account information notice under paragraph 2 and the very broad references to “other specified information”, “such further information” and so on. I did not read that as a regulation-making power. I rather assume that the discretion over the choice of information that is specified remains entirely at large. If the Minister is saying that there will be regulations that will specify the information that an AIN can include, hence mitigating the breadth of paragraph 2, I would be glad if he could make that clear.
My understanding—with his experience, I am sure that the noble Lord will be ahead of me on this—is that this is defined. We define it pretty clearly in paragraph 1(2). In the interests of time, I will reflect on what he has asked and will be absolutely sure to add this to the letter that I pledged to write on Monday—it is getting bigger by the moment, as I fully expected.
My Lords, as I asked only four questions, I want to try to nail each one as we go. I am grateful to the Minister. Before we leave the matter of the kind of organisations to which this applies, I think that he is saying that the Bill would allow the DWP to request information from any kind of organisation, including phone companies, which I asked about specifically. The kinds of organisations are to be specified in regulations, which the Government will bring forward, initially naming financial institutions. By virtue of further regulations, could they extend that to anything—to Garmin, the people who monitor your runs, to gyms and to anyone else? Is that correct?
That is correct. I hope it provides some reassurance that extending the power initially only to banks and financial institutions is a deliberately narrow design. It would be for both Houses to debate other areas beyond those. I am coming on to address that. The noble Baroness asked about phone companies. Simply put, we will be able to designate the third parties that fit within the provisions of this legislation where they hold information that would help us to verify whether someone meets the eligibility criteria for the benefit that they are receiving. However, ultimately, it would be for Parliament to decide whether a third party can be designated under this power, as we must bring affirmative regulations forward to do this. We have that power.
To be clear, they already have some information about claimants or recipients. Does this Bill make any difference to that information? Can they already use the information that they have for these purposes, for example the name and address of a claimant’s bank account, or does this Bill extend the use of information to other information that they already have?
Indeed, that is correct. I hope that is helpful and gives the noble Lord reassurance. To clarify, we have our normal business-as-usual processes, so we already have those powers, with the restriction that we cannot at present use the banks and financial institutions as a conduit. However, obviously, as has been made clear by the ICO, there is no alternative to needing the help of banks and financial institutions to go further in tackling the ever-greater sophistication of fraud.
The noble Baroness, Lady Sherlock, asked whether we could issue an AIN to a bank other than that into which the benefit is paid. The answer is no. The power is exercisable only in respect of a matching account that meets the criteria in an AIN and receives a benefit payment. If this is not the case, the Secretary of State cannot require them to supply that information.
When it comes to issuing an AIN, DWP will be able to exercise these powers only for payments for which it is responsible. This means that DWP cannot exercise this power in respect of some benefits that fall under the legislation, such as child benefit, as was mentioned on Monday. I know that the noble Baroness, Lady Sherlock, raised this issue. As I committed to do on Monday, I will provide in writing more detail on the scope of the measure and on these limitations, which will require more time.
I will also ensure that my letter is clear on how the measure will impact appointees, joint claims and other such accounts. I am well aware that a number of questions were asked about this matter on Monday but, in the interests of time, I will move on.
I turn to proofs of concept. I also want to speak about our approach to delivery, in particular how we plan to test the measure before we gradually scale up operational delivery; I am aware of the time, but I hope that the Committee will indulge me. Our planned period of “test and learn” will build on our learning from our two previous proofs of concept, which we conducted in 2017 and 2022. These demonstrated the effectiveness of this approach and contributed to the OBR’s certification that the measure will save up to £600 million over the next five years.
The two proofs of concept that I mention are important. I hope that the Committee will be interested to read the results, which demonstrate why we need to do this. Without further ado, let me say that I will set out the details of these two examples in the letter as well, which will, I hope, be helpful.
The noble Lord, Lord Vaux, who is in his place, the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Sherlock, spoke about the regulatory impact assessment on Monday. I just want to use this time to reassure them on that. More information on these proofs is contained within the RIA, which was, as noble Lords will know, green-rated by the RPC.
On “test and learn”, we have a clear view on how this power may work. We are already working with third parties in readiness to commence the formal “test and learn” period in early 2025 and preparing the code of practice in advance of that. I will come on to that in just a second—in fact, I will come on to it right now, given the time. I shall refer to Amendments 225 to 232 in the name of the noble Baroness, Lady Sherlock.
To support the delivery of this measure, we will produce the code of practice to help define how the measure will work, with explanations. I assure the noble Baroness and the Committee that the code of practice is already in development; we are working positively with around eight leading financial institutions through an established working group that meets regularly to shape the code. We are fully committed to continuing that work; I think I covered the timing of that earlier in my remarks. Accepting Amendments 225 and 226 in the name of the noble Baroness would therefore, we believe, have minimal effect. I am clear that DWP will produce a code of practice, which will be consulted on; I have also set out the sort of detail that it will contain. Accepting them may also restrict our ability to develop the code of practice further as we understand more from “test and learn”.
Because we are developing this collaboratively with banks, I am not yet in a position to share the draft code, as I mentioned; I have given certain reassurances on that. However, I can say that it will provide guidance on issues such as the nature of the power and to whom it will apply. It will also provide information on safeguards, cover data security responsibilities and provide information on the appeals processes should a third party wish to dispute a request. We will engage with SSAC, to help the noble Baroness, Lady Sherlock, as we bring forward the affirmative regulations. On balance, I believe that the best course is to consult on the code of practice rather than rushing to define it now.
I am most grateful to the Minister. There is one question, so I apologise if he answered it and I did not quite pick it up. I specifically asked whether these powers would allow the DWP to devise criteria designed to identify whether a claimant was in fact living with another adult. With the appropriate regulation, would the powers allow it to do that?
That is one of the questions that I can now answer. The power will allow this, in so far as it pertains to helping the Secretary of State establish whether the benefits are being paid properly, as with paragraph 1(2) of new Schedule 3B. Rules around living together are relevant only to some benefits. That is a very short answer, but I could expand on it.
May I add to the very long letter? I have been sitting here worrying about this idea that one of the “signals” will be excess capital and then there are matching accounts. If the matching account has more capital—for example, the person who has a connected account is breaching the £16,000 or £6,000 capital limit—does that signal trigger some sort of investigation?
That is a very fair question, and I hope that I understand it correctly. I can say that the limit for the DWP is that it can obtain only what the third party produces. Whatever goes on behind the doors of the third party is for them and not us. Whether there is a related account and how best to operate is a matter for the bank to decide. We may therefore end up getting very limited information, given the limits of our powers. I hope that helps, but I will add some more detail in the letter.
My Lords, the Minister extolled the green-rated nature of this impact assessment. In the midst of all that, did he answer my question?
I asked about the amount of fraud that the Government plan to detect, on top of the £6.4 billion in welfare overpayments that was detected last year.
The figure that we have is £600 million but, again, I will reflect on the actual question that we are looking to address—the actual amount of fraud in the system.
The Minister is saying that that figure is not to be found in this green-rated impact assessment, which most of us find to be completely opaque.
I will certainly take that back, but it is green rated.
My Lords, we have talked about proportionality and disproportionality throughout the debate on this Bill. Is it not extraordinary that that figure is not on the table, given the extent of these powers?
My Lords, the Minister was kind enough to mention me a little earlier. Can I just follow up on that? In the impact assessment, which I have here, nowhere can I find the £600 million figure, nor can I find anywhere the costs related to this. There will be a burden on the banks and clearly quite a burden on the DWP, actually, if it has got to trawl through this information, as the noble Viscount says, using people rather than machines. The costs are going to be enormous to save, it would appear, up to £120 million per year out of £6.4 billion per year of fraud. It does seem odd. It would be really helpful to have those cost numbers and to understand in what document they are, because I cannot find in the impact assessment where these numbers are.
I hope I can help both noble Lords. Although I must admit that I have not read every single page, I understand that the figure of £500 million is in the IA.
Yes, £500 million. I mentioned £600 million altogether; that was mentioned by the OBR, which had certified this, and by the way, that figure was in the Autumn Statement.
My Lords, has not that demonstrated the disproportionality of these measures?
The noble Viscount explained in response to the noble Lord, Lord Anderson, that at every stage at which the powers are to be expanded, the matter would come back as an affirmative regulation. I might have been a bit slow about this, but I have been having a look and I cannot see where it says that. Perhaps he could point that out to me, because that would provide some reassurance that each stage of this is coming back to us.
I understand, very quickly, that it is in paragraph 1(1), but again, in the interests of time, maybe we could talk about that outside the Room.
Could the Minister clarify: was that paragraph 1(1)?
I can reassure the noble Lord that that is the case, yes.
I do not know whether I can help. I agree with the noble Baroness: I do not think it is very clear from paragraph 1(1) that there is a regulation-making power. However, if you look at paragraph 5 of the new schedule, there is a reference there to regulations under paragraph 1(1) as well as two other paragraphs of the schedule. That is the rather tortuous route by which I came to the conclusion that the Minister is quite right.
I reassure noble Lords that that is correct—it is paragraph 1(1). It may be rather complex, but it is in there.
I am sorry to keep coming back, but did the Minister give us the paragraph in the impact assessment that referred to £500 million?
No, I did not, but that is something which surely we can deal with outside the Room. However, I can assure noble Lords that it is in there.
My Lords, I thank the Minister for his attempts to answer my questions and those of many noble Lords. I will not detain the Committee for very long at all.
I am grateful to know that there will be a code and that it will be consulted on. Given that, it would have saved an awful lot of trouble if the Government had simply not put “may” in the Bill in the first place—that would have cut out a whole loop of this. I am very grateful to know that that is there. I agree with the Minister that we all want to know about and to clamp down on fraud and error; the question is one of proportionality.
When the Minister comes to write—I realise that this letter is turning into “War and Peace”, but it will make us all come to Report in a much better place if we can get a clearer answer to many of these questions—I still wonder whether he properly answered the question from the noble Lord, Lord Anderson, about the legality of these powers, because the point about when they engage is crucial. The Minister is still coming back to a distinction between the gathering of the data and what the DWP will then do to investigate, using its existing “business as usual” powers. I think the point the noble Lord was making is that the question of legality engages at the point of data gathering, not at the point at which the data is used, if I am correct. I am not sure that the Minister answered that—I am not inviting him to do it now—but I specifically suggest that he takes advice on that point before we come back on Report.
The other issue is that, since the Government came in so late in the day in introducing these powers into the Bill, it would have been better to have draft regulations before Report in the first instance. The Minister thinks the code can be available in the summer, but the summer is fast approaching, so I see no reason why the usual channels could not set the date for Report to fall after the date for producing a draft code, if the Government wished to do so. I realise that they may not wish to, but it must be perfectly possible—unless the Minister knows something I do not about a likely date of a general election, presumably we should still have time to do that. So I commend that thought to him.
However, we also know that a lot of the constraints he has described will happen solely in regulations. Everybody in this Committee is aware of the limitations of the capacity of both Houses to do anything about regulations. We cannot amend them here. The Government will bring them forward, but our capacity to do anything about them is small, so that is not as much of an assurance as it would be in other circumstances.
Finally, what I am left with is that these powers could do anything from something that might sound very proportionate to something that might sound entirely disproportionate, and we simply have not heard anything early enough to enable us to judge where within that range they will fall. I therefore ask the Government to think again before Report about ways in which they might provide assurance about a more contained and proportionate approach to these measures.
Since we are in Committee, in the meantime, I thank all noble Lords for their work on this and the Minister for his response. Before I beg leave to withdraw, I see that the Minister is intervening on me now, which is a joyful change.
Before the noble Baroness sits down, I want to say one very important thing. As ever with Bills, there is an opportunity to engage, and I pledge right now to engage on these particular measures with all noble Lords who wish to, as we on this side would also like to do, in order to provide, I hope, further reassurance beyond that which I have already given. I hope there is some acceptance that I have given some reassurances.
My Lords, I am sure that on behalf of the Committee I can thank the Minister for that generous offer, and we look forward to taking it up. In the meantime, I beg leave to withdraw the amendment.
My Lords, having listened carefully to representations from across the House at Second Reading, I am introducing this amendment to address concerns about the data preservation powers established in the Bill. The amendment provides for coroners, and procurators fiscal in Scotland, to initiate the data preservation process when they decide it is necessary and appropriate to support their investigations into a child’s death, irrespective of the suspected cause of death.
This amendment demonstrates our commitment to ensuring that coroners and procurators fiscal can access the online data they may need to support their investigation into a child’s death. It is important to emphasise that coroners and procurators fiscal, as independent judges, have discretion about whether to trigger the data preservation process. We are grateful to the families, Peers and coroners whom we spoke to in developing these measures. In particular, I thank the noble Baroness, Lady Kidron, who is in her place. I beg to move.
My Lords, it is an unusual pleasure to support the Minister and to say that this is a very welcome amendment to address a terrible error of judgment made when the Government first added the measure to the Bill in the other place and excluded data access for coroners in respect of children who died by means other than suicide. I shall not replay here the reasons why it was wrong, but I am extremely glad that the Government have put it right. I wish to take this opportunity to pay tribute to those past and present at 5Rights and the NSPCC for their support and to those journalists who understood why data access for coroners is a central plank of online safety.
I too recognise the role of the Bereaved Families for Online Safety. They bear the pain of losing a child and, as their testimony has repeatedly attested, not knowing the circumstances surrounding that death is a particularly cruel revictimisation for families, who never lose their grief but simply learn to live with it. We owe them a debt of gratitude for putting their grief to work for the benefit of other families and other children.
My Lords, I want briefly to contribute to this debate, which I think is somewhat less contentious than the previous group of amendments. As somebody, again, who was working on the Online Safety Act all the way through, I really just pay tribute to the tenacity of the noble Baroness, Lady Kidron, for pursuing this detail—it is a really important detail. We otherwise risk, having passed the legislation, ending up in scenarios where everyone would know that it was correct for the data-gathering powers to be implemented but, just because of the wording of the law, they would not kick in when it was necessary. I therefore really want to thank the noble Baroness, Lady Kidron, for being persistent with it, and I congratulate the Government on recognising that, when there is an irresistible force, it is better to be a movable object than an immovable one.
I credit the noble Viscount the Minister for tabling these amendments today. As I say, I think that this is something that can pass more quickly because there is broad agreement around the Committee that this is necessary. It will not take away the pain of families who are in those circumstances, but it will certainly help coroners get to the truth when a tragic incident has occurred, whatever the nature of that tragic incident.
My Lords, having been involved in and seen the campaigning of the bereaved families and the noble Baroness, Lady Kidron, in particular in the Joint Committee on the Draft Online Safety Bill onwards, I associate myself entirely with the noble Baroness’s statement and with my noble friend Lord Allan’s remarks.
My Lords, I thank the Minister for setting out the amendment and all noble Lords who spoke. I am sure the Minister will be pleased to hear that we support his Amendment 236 and his Amendment 237, to which the noble Baroness, Lady Kidron, has added her name.
Amendment 236 is a technical amendment. It seeks the straightforward deletion of words from a clause, accounting for the fact that investigations by a coroner, or procurator fiscal in Scotland, must start upon them being notified of the death of a child. The words
“or are due to conduct an investigation”
are indeed superfluous.
We also support Amendment 237. The deletion of this part of the clause would bring into effect a material change. It would empower Ofcom to issue a notice to an internet service provider to retain information in all cases of a child’s death, not just cases of suspected suicide. Sadly, as many of us have discovered in the course of our work on this Bill, there is an increasing number of ways in which communication online can be directly or indirectly linked to a child’s death. These include areas of material that is appropriate for adults only; the inability to filter harmful information, which may adversely affect mental health and decision-making; and, of course, the deliberate targeting of children by adults and, in some cases, by other children.
There are adults who use the internet with the intention of doing harm to children through coercion, grooming or abuse. What initially starts online can lead to contact in person. Often, this will lead to a criminal investigation, but, even if it does not, the changes proposed by this amendment could help prevent additional tragic deaths of children, not just those caused by suspected child suicides. If the investigating authorities have access to online communications that may have been a contributing factor in a child’s death, additional areas of concern can be identified by organisations and individuals with responsibility for children’s welfare and action taken to save many other young lives.
Before I sit down, I want to take this opportunity to say a big thank you to the noble Baroness, Lady Kidron, the noble Lord, Lord Kennedy, and all those who have campaigned on this issue relentlessly and brought it to our attention.
Let me begin by reiterating my thanks to the noble Baroness, Peers, families and coroners for their help in developing these measures. My momentary pleasure in being supported on these amendments is, of course, tempered by the desperate sadness of the situations that they are designed to address.
I acknowledge the powerful advocacy that has taken place on this issue. I am glad that we have been able to address the concerns with the amendment to the Online Safety Act, which takes a zero-tolerance approach to protecting children by making sure that the buck stops with social media platforms for the content they host. I sincerely hope that this demonstrates our commitment to ensuring that coroners can fully access the online data needed to provide answers for grieving families.
On the point raised by the noble Baroness, Lady Kidron, guidance from the Chief Coroner is likely to be necessary to ensure both that this provision works effectively and that coroners feel supported in their decisions on whether to trigger the data preservation process. Decisions on how and when to issue guidance are a matter for the Chief Coroner, of course, but we understand that he is very likely to issue guidance to coroners on this matter. His office is working with my department and Ofcom to ensure that our processes are aligned. The Government will also work with the regulators and interested parties to see whether any guidance is required to support parents in understanding the data preservation process. Needless to say, I would be more than happy to arrange a meeting with the noble Baroness to discuss the development of the guidance; other Members may wish to join that as well.
Once again, I thank noble Lords for their support on this matter.
My Lords, I rise to move Amendment 239 and to speak to Amendment 250 in my name. I am grateful to the right reverend Prelate the Bishop of London and the noble Lord, Lord Clement-Jones, for their support for Amendment 250.
These amendments tackle the sensitive but vital process of registering births and deaths. We are pleased that, in Clauses 133 to 137, the Government have set about modernising the Births and Deaths Registration Act 1953. That Act created a huge paper trail of registrations, with local registrars being required to hold paper copies of every live birth, stillbirth and death, as well as providing certified paper copies of the register entries. Since 2009, registrars have also recorded this information electronically, so there is a huge duplication of effort. The clauses now proposed allow registrars to decide the best form in which to record this information, with an expectation that we will largely move to an online database.
These proposals make sense and will be widely welcomed. They make the functioning of the registrar more efficient. More importantly, they will make it easier for families, particularly those that have been bereaved, to inform authorities at what is often a difficult and distressing time. However, we believe that the Bill could go even further to simplify the process, tackle fraud and support bereaved families.
Our Amendment 239 would move away from individual registrars deciding how to record the information and would instead take the first steps towards creating a single digital register of births and deaths. Our proposal is that the Secretary of State should commission a review to consider the viability of such a proposal, its potential impact on tackling fraud and the protection of personal data, and whether such a scheme would simplify registration procedures on a national level. It would require the conclusions of the review to be laid before Parliament within six months of the section coming into force.
We believe that this standardisation would make it easier for law enforcement agencies to check whether identities are being stolen and whether patterns of identity theft are emerging. It would also enable regulators to set national standards as to how this information should be protected and accessed by, for example, those with commercial interests. It should also make it easier for individuals living in one part of the country to register a death in another part of the country. I hope that the Minister sees the sense of these modest proposals.
Amendment 250 addresses the further potential for the Tell Us Once service. This has been a welcome initiative, which enables bereaved families to inform a large number of government and public sector bodies that a death has occurred without repeating the details over and over again. This considerably reduces the administrative burden at a time of distress and complexity while dealing with the consequences of a bereavement. However, private organisations are not included and loved ones are still tasked with contacting organisations such as employers, banks, energy and telephone companies and so on. Inevitably, the response from these organisations is variable and can be unwittingly insensitive.
A number of charities, including Marie Curie, came together to establish the UK Commission on Bereavement, which was chaired by the right reverend Prelate the Bishop of London. Its 2022 report found that 61% of adult respondents had experienced practical challenges when notifying an organisation of the death of a loved one. The report made a number of recommendations, with the extension of Tell Us Once being a key issue raised. The report recommended a review of the scheme.
We believe that the time has come to roll out the benefits of the Tell Us Once scheme more widely, so we propose a review of the effectiveness of the current legislation, including any gaps in its provision. Recommendations should then be drawn up to assess whether the scheme could be expanded to include non-public sector, voluntary and private sector holders of personal data. Our proposal is that the Secretary of State should lay a report before Parliament within six months.
This is a common-sense set of proposals, which could bring positive benefits to bereaved families, making best use of digital services to ease the distress and pain of trying to manage a complex web of administrative tasks. I hope that noble Lords and the Minister will see the sense of these proposals and agree to take them forward. I beg to move.
My Lords, I will be brief because we very much support these amendments. Interestingly, Amendment 239 from the noble Baroness, Lady Jones, follows closely on from a Private Member’s Bill presented in November 2021 by the Minister’s colleague, Minister Saqib Bhatti, and before that by the right honourable Andrew Mitchell, who is also currently a Minister. The provenance of this is impeccable, so I hope that the Minister will accept Amendment 239 with alacrity.
We very much support Amendment 250. The UK Commission on Bereavement’s Bereavement is Everyone’s Business is a terrific report. We welcome Clause 133 but we think that improvements can be made. The amendment from the noble Baroness, which I have signed, will address two of the three recommendations that the report made on the Tell Us Once service. It said that there should be a review, which this amendment reflects. It also said that
“regulators must make sure bereaved customers are treated fairly and sensitively”
by developing minimum standards. We very much support that. It is fundamentally a useful service but, as the report shows, it can clearly be improved. I congratulate the noble Baroness, Lady Jones, on picking up the recommendations of the commission and putting them forward as amendments to this Bill.
My Lords, I declare an interest as someone who has been through the paper death registration process and grant of probate, which has something to do with why I am in your Lordships’ House, so I absolutely understand where the noble Baroness, Lady Jones of Whitchurch, is coming from. I thank her for tabling these amendments to Clauses 133 and 142. They would require the Secretary of State to commission a review with a view to creating a single digital register for the registration of births and deaths and to conduct a review of the Government’s Tell Us Once scheme.
Clause 133 reforms how births and deaths are registered in England and Wales by enabling a move from a paper-based system of birth and death registration to registration in a single electronic register. An electronic register is already in use alongside the paper registers and has been since 2009. Well-established safety and security measures and processes are already in place with regard to the electronic infrastructure, which have proven extremely secure in practice. I assure noble Lords that an impact assessment has been completed to consider all the impacts relating to the move to an electronic register, although it should be noted that marriages and civil partnerships are already registered electronically.
The strategic direction is to progressively reduce the reliance on paper and the amount of paper in use, as it is insecure and capable of being tampered with or forged. The creation of a single electronic register will remove the risk of registrars having to transmit loose-leaf register pages back to the register office when they are registering births and deaths at service points across the district. It will also minimise the risk of open paper registers being stolen from register offices.
The Covid-19 pandemic had unprecedented impacts on the delivery of registration services across England and Wales, and it highlighted the need to offer more choice in how births and deaths are registered in the future. The provisions in the Bill will allow for more flexibility in how births and deaths are registered—for example, registering deaths by telephone, as was the case during the pandemic. Over 1 million deaths were successfully registered under provisions in the Coronavirus Act 2020. This service was well received by the public, registrars and funeral services.
Measures will be put in place to ensure that the identity of an informant is established in line with Cabinet Office good practice guidance. This will ensure that information provided by informants can be verified or validated for the purposes of registering by telephone. For example, a medical certificate of cause of death issued by a registered medical practitioner would need to have been received by the registrar before an informant could register a death by telephone. Having to conduct a review, as was proposed by the noble Baroness, Lady Jones, would delay moving to digital ways of working and the benefits this would introduce.
Can I just be clear? The noble Lord was quite rightly saying that there is going to be a move to digital, rather than paper, and we all support that. However, our amendment went one stage further and said that there should be one national digital scheme. In the impact assessment and the strategic direction, to which the noble Lord referred, is one national scheme intended so that registrars do not have the flexibility to do their own thing, with their own computer? Is that now being proposed?
The noble Baroness asks a fair question. A major change is being proposed, so it is best that we work with our DWP colleagues, and I commit to writing to the noble Baroness and the Committee on that point.
On the amendment to Clause 142, while we agree with the aim of improving the Tell Us Once service, our view is that the only way to achieve this is by upgrading its technology. This work is under way and expected to take up to two years to complete. It will ensure that Tell Us Once continues to operate into the future, providing us with the ability to build on opportunities to improve its speed and efficiency.
Going back to what I said earlier, it would not be right to commit to undertake a review of the service while this upgrading work is ongoing, especially as any extension of the service would require a fundamental change in how it operates, placing additional burdens on registrars and citizens, and undermining that simplicity-of-service principle. For those who still wish to use a paper process, that option will remain. For the reasons that I have set out, I am not able to accept these amendments and I hope that the noble Baroness is happy not to press them.
My Lords, I am grateful to hear that there is some work ongoing on the registrar process and that the noble Lord will write with further details. Obviously, if this work is already happening and we have the same intent, we would accept that our amendment is superfluous, but I need to be a little more assured that that is the case.
I was a bit more disappointed with what the Minister was saying on Tell Us Once. I suspect that the technology upgrade to which he referred covers only the current scheme, which applies only to the public sector. However, our proposal, like the very well-argued Marie Curie proposal, is that there is now a need to extend the scheme to the private sector—to banks, telephone companies and so on.
I did not really hear the Minister saying that that was going to be the case but, if he is going to write, maybe he could embrace that as well. As I said, Tell Us Once is a hugely popular scheme and if we can extend it further to a wider group of organisations, that would be a very popular thing for the Government to do.
In the meantime, I beg leave to withdraw the amendment.
My Lords, I now turn to the national underground asset register, which I will refer to as NUAR. It is a new digital map of buried pipes and cables that is revolutionising the way that we install, maintain, operate and repair our buried infrastructure. The provisions contained in the Bill will ensure that workers have access to complete and up-to-date data when they need it, through the new register. NUAR is estimated to deliver more than £400 million per year of economic growth through increased efficiency, reduced accidental damage and fewer disruptions for citizens and businesses. I am therefore introducing several government amendments, which are minor in nature and aim to improve the clarity of the Bill. I hope that the Committee will be content if I address these together.
Amendment 244 clarifies responsibilities in relation to the licensing of NUAR data. As NUAR includes data from across public and private sector organisations, it involves both Crown and third-party intellectual property rights, including database rights. This amendment clarifies that the role of the Keeper of the National Archives in determining the licence terms for Crown IP remains unchanged. This will require the Secretary of State to work through the National Archives to determine licence terms for Crown data, as was always intended. Amendments 243 and 245 are consequential to this change.
Similarly, Amendment 241 moves the provision relating to the initial upload of data to the register under new Part 3A to make the Bill clearer, with Amendments 248 and 249 consequential to this change.
Amendment 242 is a minor and technical amendment that clarifies that regulations made under new Section 106B(1) can be made “for or in connection with”—rather than solely “in connection with”—the making of information kept in NUAR available, with or without a licence.
Amendment 247 is another minor and technical amendment to ensure that consistent language is used throughout Schedule 13 and so further improve the clarity of these provisions. These amendments provide clarity to the Bill; they do not change the underlying policy.
Although Amendment 298 is not solely focused on NUAR, this might perhaps be a convenient point for me to briefly explain it to your Lordships. Amendment 298 makes a minor and technical amendment to Clause 154, the clause which sets out the extent of the Bill. Subsection (4) of that clause currently provides that an amendment, repeal or revocation made by the Bill
“has the same extent as the enactment amended, repealed or revoked”.
Subsection (4) also makes clear that this approach is subject to subsection (3), which provides for certain provisions to extend only to England and Wales and Northern Ireland. Upon further reviewing the Bill, we have identified that subsection (4) should, of course, also be subject to subsection (2), which provides for certain provisions to extend only to England and Wales. Amendment 298 therefore makes provision to ensure that the various subsections of Clause 154 operate effectively together as a coherent package.
I now turn to a series of amendments raised by the noble Lord, Lord Clement-Jones. Amendments 240A and 240B relate to new Section 106A, which places a duty on the Secretary of State to keep a register of information relating to apparatus in streets in England and Wales. Section 106A allows for the Secretary of State to make regulations that establish the form and manner in which the register is kept. The Bill as currently drafted provides for these regulations to be subject to the negative procedure. Amendment 240A calls for this to be changed to the affirmative procedure, while Amendment 240B would require the publication of draft regulations, a call for evidence and the subsequent laying before Parliament of a statement by the Secretary of State before such regulations can be made.
My Lords, I thank the Minister for his exposition. He explained the purposes of Clauses 138 to 141 and extolled their virtues, and helpfully explained what my amendments are trying to do—not that he has shot any foxes in the process.
The purpose of my amendments is much more fundamental, and that is to question the methodology of the Government in all of this. The purpose of NUAR is to prevent accidental strikes where building works damage underground infrastructure. However, the Government seem to have ignored the fact that an equivalent service—LinesearchbeforeUdig, or LSBUD—already achieves these aims, is much more widely used than NUAR and is much more cost effective. The existing system has been in place for more than 20 years and now includes data from more than 150 asset owners. It is used by 270,000 UK digging contractors and individuals—and more every day. The fact is that, without further consultation and greater alignment with current industry best practice, NUAR risks becoming a white elephant, undermining the safe working practices that have kept critical national infrastructure in the UK safe for more than two decades.
However, the essence of these amendments is not to cancel NUAR but to get NUAR and the Government to work much more closely with the services that already exist and those who wish to help. They are designed to ensure that proper consultation and democratic scrutiny is conducted before NUAR is implemented in statutory form. Essentially, the industry says that NUAR could be made much better and much quicker if it worked more closely with the private sector services that already exist. Those who are already involved with LinesearchbeforeUdig say, first of all, that NUAR will create uncertainty and reduce safety, failing in its key aims.
The Government have been developing NUAR since 2018. Claiming that it would drive a reduction in accidental damage to underground assets during roadworks, the impact assessment incorrectly states:
“No businesses currently provide a service that is the same or similar to the service that NUAR would provide”.
In fact, as I said, LSBUD has been providing a safe digging service in the UK for 20 years and has grown significantly over that time. Without a plan to work more closely with LSBUD as the key industry representative, NUAR risks creating more accidental strikes on key network infrastructure, increasing risks to workers’ safety through electrical fires, gas leaks, pollution and so on. The public, at home or at work, would also suffer more service outages and disruption.
Secondly, NUAR will add costs and stifle competition. The Government claim that NUAR will deliver significant benefits to taxpayers, reduce disruption and prevent damage to underground assets, but the impact assessment ignores the fact that NUAR’s core functions are already provided through the current system—so its expected benefits are vastly overstated. While asset owners, many of whom have not been consulted, will face costs of more than £200 million over the first 10 years, the wholesale publication of asset owners’ entire networks creates risks around commercially sensitive information, damaging innovation and competition. Combined with the uncertainties about how quickly NUAR can gain a critical mass of users and data, this again calls into question why NUAR does not properly align with and build on the current system but instead smothers competition and harms a successful, growing UK business.
Thirdly, NUAR risks undermining control over sensitive CNI data. Underground assets are integral to critical national infrastructure; protecting them is vital to the UK’s economic and national security. LSBUD deliberately keeps data separate and ensures that data owners remain in full control over who can access their data via a secure exchange platform. NUAR, however, in aiming to provide a single view of all assets, removes providers’ control over their own data—an essential security fail-safe. It would also expand opportunities for malicious actors to target sectors in a variety of ways—for instance, the theft of copper wires from telecom networks.
NUAR shifts control over data access to a centralised government body, with no clear plan for how the data is to be protected from unauthorised access, leading to serious concerns about security and theft. Safe digging is paramount; mandating NUAR will lead to uncertainty, present more health and safety dangers to workers and the public and put critical national infrastructure at risk. These plans require further review. There needs to be, as I have said, greater alignment with industry best practice. Without further consultation, NUAR risks becoming a white elephant that undermines safe digging in the UK and increases risk to infrastructure workers and the public.
I will not go through the amendments individually as the Minister has mentioned what their effect would be, but I will dispel a few myths. The Government have claimed that NUAR has the overwhelming support of asset owners. In the view of those who briefed me, that is not an accurate reflection of the broadband and telecoms sector in particular; ISPA members have raised a number of concerns with the NUAR team around cost and security that have yet to be addressed. This is borne out by the notable gaps among the major telecoms asset owners signed up to NUAR at this time.
Clearly, the noble Viscount is resisting changing the procedure by which these changes are made from negative to affirmative, but I hope I have gone some way to persuade the Committee of the importance of this change to how the NUAR system is put on a statutory footing. He talked about a “handful” of data; the comprehensive nature of the existing system is pretty impressive, and it is a free service, updated on a regular basis, which covers more than 150 asset owners and 98% of high-risk assets. NUAR currently covers only one-third of asset owners. The comparisons are already not to the advantage of NUAR.
I hope the Government will at least, even if they do not agree with these amendments, think twice before proceeding at their current speed, and without taking on board the concerns of those who are already heavily engaged with LinesearchbeforeUdig and who find it pretty satisfactory for their purposes.
My Lords, the Minister really did big up this section of the Bill. He said that it would revolutionise this information service, that it would bring many benefits, that it has a green rating, that it would be the Formula 1 of data transfer in mapping, and so on. We were led to expect quite a lot from this part of the legislation. It is an important part of the Bill, because it signifies some government progress towards the goal of creating a comprehensive national underground asset register, as he put it, or NUAR. We are happy to support this objective, but we have concerns about the progress being made and the time it is taking.
To digress a bit here, it took me back 50 years to when I was a labourer working by the side of a bypass. One of the guys I was working with was operating our post hole borer; it penetrated the Anglian Water system and sent a geyser some 20 metres up into the sky, completely destroying my midday retreat to the local pub between the arduous exercise of digging holes. Had he had one of the services on offer, I suspect that we would not have been so detained. It was quite an entertaining incident, but it clearly showed the dangers of not having good mapping.
As I understand it, and as was outlined by the noble Lord, Lord Clement-Jones, since 2018 the Government have been moving towards the notion of recording somewhere what lies below the surface in our communities. We have had street works legislation going back several decades, to at least 1991. In general, progress towards better co-ordination of utilities excavations has not been helped by poor and low levels of mapping and knowledge of which utilities are located underground, despite the various legislative attempts to bring better co-ordination of services.
I start by thanking the noble Lords, Lord Clement-Jones and Lord Bassam, for their respective replies. As I have said, the Geospatial Commission has been engaging extensively with stakeholders, including the security services, on NUAR since 2018. This has included a call for evidence, a pilot project, a public consultation, focus groups, various workshops and other interactions. All major gas and water companies have signed up, as well as several large telecoms firms.
While the Minister is speaking, maybe the Box could tell him whether the figure of only 33% of asset owners having signed up is correct? Both I and the noble Lord, Lord Bassam, mentioned that; it would be very useful to know.
It did complete a pilot phase this year. As it operationalises, more and more will sign up. I do not know the actual number that have signed up today, but I will find out.
NUAR does not duplicate existing commercial services. It is a standardised, interactive digital map of buried infrastructure, which no existing service is able to provide. It will significantly enhance data sharing and access efficiency. Current services—
I am concerned. We get the principle behind NUAR, but is there an interface between NUAR and this other service—which, on the face of it, looks quite extensive—currently in place? Is there a dialogue between the two? That seems to be quite important, given that there is some doubt over NUAR’s current scope.
I am not sure that there is doubt over the current scope of NUAR; it is meant to address all buried infrastructure in the United Kingdom. LSBUD does make extensive representations, as indeed it has to parliamentarians of both Houses, and has spoken several times to the Geospatial Commission. I am very happy to commit to continuing to do so.
My Lords, the noble Lord, Lord Bassam, is absolutely right to be asking that question. We can go only on the briefs we get. Unlike the noble Lord, Lord Bassam, I have not been underground very recently, but we do rely on the briefings we get. LSBUD is described as a
“sustainably-funded UK success story”—
okay, give or take a bit of puff—that
“responds to most requests in 5 minutes or less”.
It has
“150+ asset-owners covering nearly 2 million km and 98% of high-risk assets—like gas, electric, and fuel pipelines”.
That sounds as though we are in the same kind of territory. How can the Minister just baldly state that NUAR is entirely different? Can he perhaps give us a paragraph on how they differ? I do not think that “completely different” can possibly characterise this relationship.
As I understand it, LSBUD services are provided as a pdf, on request. It is not interactive; it is not vector-based graphics presented on a map, so it cannot be interrogated in the same way. Furthermore, as I understand it—and I am happy to be corrected if I am misstating—LSBUD has a great many private sector asset owners, but no public sector data is provided, and all of it is provided on a much more manual basis. The two services simply do not bear comparison. I would be delighted to speak to LSBUD.
My Lords, we are beginning to tease out something quite useful here. Basically, NUAR will be pretty much an automatic service, because it will be available online, I assume, which has implications for data protection, for who owns the copyright and so on. I am sure there are all kinds of issues there. It is the way the service is delivered; and then you have the public sector, which has not taken part in LSBUD. Are those the two key distinctions?
Indeed, there are two key distinctions. One is the way that the information is provided online, in a live format, and the other is the quantity and nature of the data that is provided, which will eventually be all relevant data in the United Kingdom under NUAR, versus those who choose to sign up on LSBUD and equivalent services. I am very happy to write on the various figures. Maybe it would help if I were to arrange a demonstration of the technology. Would that be useful? I will do that.
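To make the first of those distinctions concrete, the short Python sketch below shows why vector data held online can be interrogated in a way a static pdf cannot: assets stored as coordinates can be filtered against a planned dig site. It is purely illustrative; the owners, asset types, coordinates and the crude vertex check are invented assumptions, not a description of how NUAR is actually implemented.

# Hypothetical sketch: assets held as vector data can be queried by location.
# Owners, types and coordinates are invented; a real register would use
# proper geometry (line-box intersection), not this crude vertex check.
assets = [
    {"owner": "GasCo",   "type": "gas main",   "line": [(0, 0), (10, 0)]},
    {"owner": "WaterCo", "type": "water pipe", "line": [(5, 0), (5, 5)]},
    {"owner": "TelCo",   "type": "fibre duct", "line": [(20, 20), (30, 20)]},
]

def touches_site(line, xmin, ymin, xmax, ymax):
    # Crude check: does any vertex of the asset fall inside the dig site?
    return any(xmin <= x <= xmax and ymin <= y <= ymax for x, y in line)

dig_site = (-1, -1, 6, 1)  # bounding box of the planned excavation
for asset in assets:
    if touches_site(asset["line"], *dig_site):
        print(asset["owner"], asset["type"])
# Prints the gas main and the water pipe; the fibre duct is out of range.

A pdf issued on request can only be read by eye; a live register of this kind can answer an arbitrary location query the moment it is asked.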
Unlike the noble Lord, Lord Bassam, I do not have that background in seeing what happens with the excavators, but I would very much welcome that. The Minister again is really making the case for greater co-operation. The public sector has access to the public sector information, and LSBUD has access to a lot of private sector information. Does that not speak to co-operation between the two systems? We seem to have warring camps, where the Government are determined to prove that they are forging ahead with their new service and are trampling on quite a lot of rights, interests and concerns in doing so—by the sound of it. The Minister looks rather sceptical.
I am not sure whose rights are being trampled on by having a shared database of these things. However, I will arrange a demonstration, and I confidently state that nobody who sees that demonstration will have any cynicism any more about the quality of the service provided.
All I can say is that, in that case, the Minister has been worked on extremely well.
In addition to the situation that the noble Lord, Lord Bassam, described, I was braced for a really horrible situation, because these things very often lead to danger and death, and there is a very serious safety argument for providing this information reliably and rapidly, as NUAR will.
My Lords, it took them half a day to discover where the hole had gone and what the damage was. The water flooded several main roads and there were traffic delays and the rest. So these things are very serious. I was trying to make a serious point while being slightly frivolous about it.
No, indeed, it is a deeply serious point. I do not know the number off the top of my head but there are a number of deaths every year as a result of these things.
As I was saying, a thorough impact assessment was undertaken for the NUAR measures, which received a green rating from the Regulatory Policy Committee. Impacts on organisations that help facilitate the exchange of data related to assets in the street were included in the modelling. Although NUAR could impact existing utility—
I cannot resist drawing the Minister’s attention to the story in today’s Financial Times, which reports that two major water companies do not know where their sewers are. So I think the impact is going to be a little bit greater than he is saying.
I saw that story. Obviously, regardless of how they report the data, if they do not know, they do not know. But my thought was that, if there are maps available for everything that is known, that tends to encourage people who do not know to take better control of the assets that they manage.
A discovery project is under way potentially to allow these organisations—these alternative providers—to access NUAR data; LSBUD has been referenced, among others. It attended the last three workshops we conducted on this, which I hope could enable it to adapt its services and business models to mitigate any negative impacts. Such opportunities will be taken forward in future years should they be technically feasible, of value, in the public interest and in light of the views of stakeholders, including asset owners.
A national underground asset register depends on bringing data together from asset owners on to a single standardised database. This will allow data to be shared more efficiently than was possible before. Asset owners have existing processes that have been developed to allow them to manage risks associated with excavations. These processes will be developed in compliance with existing guidance in the form of HSG47. To achieve this, those working on NUAR are already working closely with relevant stakeholders as part of a dedicated adoption group. This will allow for a safe and planned rollout of NUAR to those who will benefit from it.
Before the Minister’s peroration, I just want to check something. He talked about the discovery project and contact with the industry; by that, I assume he was talking about asset owners as part of the project. What contact is proposed with the existing company, LinesearchbeforeUdig, and some of its major supporters? Can the Government assure us that they will have greater contact or try to align? Can they give greater assurance than they have been able to give today? Clearly, there is suspicion here of the Government’s intentions and how things will work out. If we are to achieve this safety agenda—I absolutely support it; it is the fundamental issue here—more work needs to be done in building bridges, to use another construction metaphor.
As I said, LSBUD has met the Geospatial Commission many times, and I would be happy to meet it in order to help it adapt its business model for the NUAR future. It has attended the last three discovery workshops on access to this data.
I close by thanking noble Lords for their contributions. I hope they look forward to the demonstration.
My Lords, Amendment 251 is also in the names of the noble Lords, Lord Arbuthnot and Lord Clement-Jones, and the noble Baroness, Lady Jones. I commend the noble Lord, Lord Arbuthnot, for his staunch support of the sub-postmasters over many years. I am grateful to him for adding his name to this amendment.
This amendment overturns a previous intervention in the law that has had and will continue to have far-reaching consequences if left in place: the notion that computer evidence should in law be presumed to be reliable. This error, made by the Government and the Law Commission at the turn of the century and reinforced by the courts over decades, has, as we now know, cost innocent people their reputations, their livelihoods and, in some cases, their lives.
Previously, Section 69 of the Police and Criminal Evidence Act 1984 required prosecutors in criminal cases relying on information from computers to confirm that the computer was operating correctly and could not have been tampered with before the evidence could be submitted. As the volume of evidence from computers increased, this requirement came to be viewed as burdensome.
In 1997, the Law Commission published a paper, Evidence in Criminal Proceedings: Hearsay and Related Topics, in which it concluded that Section 69
“fails to serve any useful purpose”.
As a result, it was repealed. The effect of this repeal was to create a common law presumption, in both criminal and civil proceedings, of the proper functioning of machines—that is to say, the computer is always right. In principle, there is a low threshold for rebutting this presumption but, in practice, as the Post Office prosecutions all too tragically show, a person challenging evidence derived from a computer will typically have no visibility of the system in question or the ways in which it could or did fail. As a result, they will not know what records of failures should be disclosed to them or could be asked for.
This situation was illustrated in the Post Office prosecution of sub-postmaster Mrs Seema Misra. Paul Marshall, Mrs Misra’s defence lawyer, describes how she was
“taunted by the prosecution for being unable to point to any … identifiable … problem”,
while they hid behind the presumption that the Horizon system was “reliable” under the law. On four occasions during her prosecution, Mrs Misra requested court-ordered disclosure by the Post Office of Horizon error records. Three different judges dismissed her applications. Mrs Misra went to prison. She was eight weeks pregnant, and it was her son’s 10th birthday. On being sentenced, she collapsed.
The repeal of Section 69 of PACE 1984 reflects the Law Commission’s flawed belief that most computer errors were “down to the operator” or “apparent to the operator”, and that you could
“take as read that computer evidence is reliable unless a person can say otherwise”.
In the words of a colleague of mine from the University of Oxford, a professor of computing with a side consultancy specialising in finding bugs for global tech firms ahead of rollout, this assumption is “eye-wateringly mistaken”. He recently wrote to me and said:
“I have been asking fellow computer scientists for evidence that computers make mistakes, and have found that they are bewildered at the question since it is self-evident”.
There is an injustice in being told that a machine will always work as expected, and a further injustice in being told that the only way you can prove that it does not work is to ask by name for something that you do not know exists. That is to say, Mrs Misra did not have the magic word.
In discussions, the Government assert that the harm caused by Horizon was due to the egregious failures of corporate governance at the Post Office. That there has been a historic miscarriage of justice is beyond question, and the outcome is urgently awaited. But the actions of the Post Office were made possible in part because of a flaw in our legal and judicial processes. What happened at the Post Office is not an isolated incident but potentially the tip of an iceberg, where the safety of an unknown number of criminal convictions and civil judgments is called into question.
For example, the Educational Testing Service, a testing company whose English language test was commissioned by the Home Office, wrongly determined that 97% of English language students were cheating, a determination that cost the students their right to stay in the UK and/or their ability to graduate, forfeiting thousands of pounds in student fees. The Guardian conducted interviews with dozens of the students, who described the painful consequences. One man was held in UK immigration detention centres for 11 months. Others described being forced into destitution, becoming homeless and reliant on food banks as they attempted to challenge the accusation. Others became depressed and suicidal when confronted with the wasted tuition fees and the difficulty of shaking off an allegation of dishonesty.
The widespread coverage of the Horizon scandal has made many victims of the Home Office scandal renew their efforts to clear their names and seek redress. In another case, at the Princess of Wales Hospital in 2012, nurses were wrongly accused of falsifying patient records because of discrepancies found with computer records. Some of the nurses were subjected to criminal prosecution, suffering years of legal action before the trial collapsed, when it emerged that a visit by an engineer to fix a bug had eradicated all the data that the nurses were accused of failing to gather. That vital piece of information could easily have been discovered and disclosed, if computer evidence was not automatically deemed to be reliable.
My Lords, I congratulate the noble Baroness, Lady Kidron, on her amendment and thank her for allowing me to add my name to it. I agree with what she said. I, too, had the benefit of a meeting with the Lord Chancellor, which was most helpful. I am grateful to Mr Paul Marshall—whom the noble Baroness mentioned and who has represented several sub-postmasters in the Horizon scandal—for his help and advice in this matter.
My first short point is that evidence derived from a computer is hearsay. There is good reason for treating hearsay evidence with caution. Computer scientists know—although the general public do not—that only the smallest and least complex computer programs can be tested exhaustively. I am told that the limit for that testing is probably around 100 lines of a well-designed and carefully written program. Horizon, which Mr Justice Fraser said was not in the least robust, consisted of a suite of programs involving millions of lines of code. It will inevitably have contained thousands of errors because all computer programs do. Most computer errors do not routinely cause malfunctions. If they did, they would be spotted at an early stage and the program would be changed—but potentially with consequential changes to the program that might not be intended or spotted.
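To put rough numbers on that testing limit, here is a back-of-the-envelope sketch in Python; the throughput figure is an assumption chosen for the arithmetic, not evidence given in the debate. Even one function taking two 32-bit integers has 2**64 possible inputs.

# Illustrative arithmetic only: assumed figures, not evidence from the debate.
inputs = 2 ** 64                       # one function, two 32-bit arguments
tests_per_second = 10 ** 9             # an optimistic billion tests per second
seconds_per_year = 60 * 60 * 24 * 365
years = inputs / (tests_per_second * seconds_per_year)
print(f"{years:,.0f} years to test exhaustively")  # roughly 585 years

A suite of millions of lines, with interacting state, lies many orders of magnitude further beyond enumeration; hence the inevitability of undetected errors.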
We are all aware of how frequently we are invited to accept software updates from our mobile telephones’ software manufacturers. Those updates are not limited to closing security chinks but are also required because bugs—or, as we learned yesterday from Paula Vennells’s husband, anomalies and exceptions—are inevitable in computer programs. That is why Fujitsu had an office dedicated not just to altering the sub-postmasters’ balances, shocking as that is, but to altering and amending a program that was never going to be perfect, because no computer program is.
The only conclusion that one can draw from all this is that computer programs are, as the noble Baroness said, inherently unreliable, such that having a presumption in law that they are reliable is unsustainable. In the case of the DPP v McKeown and Jones—in 1997, I think—Lord Hoffmann said:
“It is notorious that one needs no expertise in electronics to be able to know whether a computer is working properly”.
One must always hesitate before questioning the wisdom of a man as clever as Lord Hoffmann, but he was wrong. The notoriety now attaches to his comment.
The consequence of the repeal of Section 69 of the Police and Criminal Evidence Act 1984 has been to reduce the burden of proof, so that Seema Misra was sent to prison in the circumstances set out by the noble Baroness. Further, this matter is urgent for two reasons; they slightly conflict with each other, but I will nevertheless set them out. The first is that, if the presumption remains in place for one minute longer, there is a genuine risk that miscarriages of justice will continue to occur in other, non-Post Office cases, from as early as tomorrow. The second is that any defence lawyer will, in any event, be treating the presumption as having been fatally undermined by the Horizon issues. The presumption will therefore be questioned in every court where it might otherwise apply. It needs consideration by Parliament.
My noble friend the Minister will say, and he will be right, that the Horizon case was a disgraceful failure of disclosure by the Post Office. But it was permitted by the presumption of the correctness of computer evidence, which I hope we have shown is unsustainable. Part of the solution to the problem may lie in changes to disclosure and discovery, but we cannot permit a presumption that we know to be unfounded to continue in law.
My noble friend may also go on to say that our amendment is flawed in that it will place impossible burdens on prosecutors, requiring them to get constant certificates of proper working from Microsoft, Google, WhatsApp, and whatever Twitter is called nowadays. Again, he may be right. We do not seek to bring prosecutions grinding to a halt, nor do we seek to question the underlying integrity of our email or communications systems, so we may need another way through this problem. Luckily, my noble friend is a very clever man, and I look forward to hearing what he proposes.
My Lords, we have heard two extremely powerful speeches; I will follow in their wake but be very brief. For many years now, I have campaigned on amending the Computer Misuse Act; the noble Lord, Lord Arbuthnot, did similarly. My motivation did not start with the Horizon scandal but was broader, arising from underlying concerns about the nature of computer evidence.
I came rather late to this understanding about the presumption of the accuracy of computer evidence. It is somewhat horrifying, the more you look into the history of this, which has been so well set out by the noble Baroness, Lady Kidron. I remember advising MPs at the time about the Police and Criminal Evidence Act. I was not really aware of what the Law Commission had recommended in terms of getting rid of Section 69, or indeed what the Youth Justice and Criminal Evidence Act did in 1999, a year after I came into this House.
The noble Baroness has set out the history of it, and how badly wrong the Law Commission got this. She set out extremely well the impact and illustration of Mrs Misra’s case, the injustice that has resulted through the Horizon cases—indeed, not just through those cases, but through other areas—and the whole aspect of the reliability of computer evidence. Likewise, we must all pay tribute to the tireless campaigning of the noble Lord, Lord Arbuthnot. I thought it was really interesting how he described computer evidence as hearsay, because that essentially is what it is, and there is the whole issue of updates and bug fixing.
The one area that I am slightly uncertain about after listening to the debate and having read some of the background to this is precisely what impact Mr Justice Fraser’s judgment had. Some people seem to have taken it as simply saying that the computer evidence was unreliable, but that it was a one-off. It seems to me that it was much more sweeping than that and was really a rebuttal of the original view the Law Commission took on the reliability of computer evidence.
My Lords, I support this probing amendment, Amendment 251. I thank all noble Lords who have spoken. From this side of the Committee, I say how grateful we are to the noble Lord, Lord Arbuthnot, for all that he has done and continues to do in his campaign to find justice for those sub-postmasters who have been wronged by the system.
This amendment seeks to reinstate the substantive provisions of Section 69 of PACE, the Police and Criminal Evidence Act 1984, revoking this dangerous presumption. I would like to imagine that legislators in 1984 were perhaps alert to the warning in George Orwell’s novel Nineteen Eighty-Four, written some 35 years earlier, about relying on an apparently infallible but ultimately corruptible technological system to define the truth. The Horizon scandal is, of course, the most glaring example of the dangers of assuming that computers are always right. Sadly, as hundreds of sub-postmasters have known for years, and as the wider public have more recently become aware, computer systems can be horribly inaccurate.
However, the Horizon system is very primitive compared to some of the programs which now process billions of pieces of our sensitive data every day. The AI revolution, which has already begun, will exponentially accelerate the risk of compounded errors being multiplied. To take just one example, some noble Lords may be aware of the concept of AI hallucinations. This is a term used to describe when computer models make inaccurate predictions based on seeing incorrect patterns in data, which may be caused by incomplete, biased or simply poor-quality inputs. In an earlier debate, the noble Viscount, Lord Younger of Leckie, said that decisions on account information notices will be taken. How will these decisions be made? Will they be made by individual human beings or by AI-configured algorithms? Can the Minister share with us how such decisions will be taken?
Humans can look at clouds in the sky or outlines on the hillside and see patterns that look like faces, animals or symbols, but ultimately we know that we are looking at water vapour or rock formations. Computer systems do not necessarily have this innate common sense—this reality check. Increasingly, we will depend on computer systems talking to each other without any human intervention. This will deliver some great efficiencies, but it could lead to greater injustices on a scale which would terrify even the most dystopian science fiction writers. The noble Baroness, Lady Kidron, has already shared with us some of the cases where a computer has made errors and people have been wronged.
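As a toy illustration of that missing reality check, the following Python sketch (every pattern, label and pixel invented for the example) shows that a simple pattern-matcher has no way to answer “none of the above”, so it must return some label even for pure noise.

# Illustrative toy only: a classifier with no "none of the above" option.
import random

patterns = {
    "face":   [1, 0, 1, 0, 1, 1, 0, 1],  # invented reference patterns
    "symbol": [0, 1, 0, 1, 0, 0, 1, 0],
}

def classify(pixels):
    # Return the label whose pattern differs from the input in the fewest
    # positions (Hamming distance). There is no option to answer "noise".
    def distance(a, b):
        return sum(x != y for x, y in zip(a, b))
    return min(patterns, key=lambda label: distance(patterns[label], pixels))

random.seed(0)
noise = [random.randint(0, 1) for _ in range(8)]  # pure random noise
print(classify(noise))  # always prints "face" or "symbol", never "noise"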
Amendment 251 would reintroduce the opportunity for some healthy human scepticism by enabling the investigation of whether there are reasonable grounds for questioning information in documents produced by a computer. The digital world of 2024 depends far more on computers than the world of 1984 did, whether in the actual legislation of that year or in Orwell’s fiction. Amendment 251 enables ordinary people to question whether our modern “Big Brother”, artificial intelligence, is telling the truth when he or it is watching us. I look forward to the Minister’s responses to all the various questions and on the current presumption in law that information provided by a computer is always accurate.
My Lords, I recognise the feeling of the Committee on this issue and, frankly, I recognise the feeling of the whole country with respect to Horizon. I thank all those who have spoken for a really enlightening debate. I thank the noble Baroness, Lady Kidron, for tabling the amendment and my noble friend Lord Arbuthnot for speaking to it and—if I may depart from the script—his heroic behaviour with respect to the sub-postmasters.
There can be no doubt that hundreds of innocent sub-postmasters and sub-postmistresses have suffered an intolerable miscarriage of justice at the hands of the Post Office. I hope noble Lords will indulge me if I speak very briefly on that. On 13 March, the Government introduced the Post Office (Horizon System) Offences Bill into Parliament, which is due to go before a Committee of the whole House in the House of Commons on 29 April. The Bill will quash relevant convictions of individuals who worked, including on a voluntary basis, in Post Office branches and who have suffered as a result of the Post Office Horizon IT scandal. It will quash, on a blanket basis, convictions for various theft, fraud and related offences during the period of the Horizon scandal in England, Wales and Northern Ireland. This is to be followed by swift financial redress delivered by the Department for Business and Trade.
On the amendment laid by the noble Baroness, Lady Kidron—I thank her and the noble Lords who have supported it—I fully understand the intent behind this amendment, which aims to address issues with computer evidence such as those arising from the Post Office cases. The common law presumption, as has been said, is that the computer which has produced evidence in a case was operating effectively at the material time unless there is evidence to the contrary, in which case the party relying on the computer evidence will need to satisfy the court that the evidence is reliable and therefore admissible.
This amendment would require a party relying on computer evidence to provide proof up front that the computer was operating effectively at the time and that there is no evidence of improper use. I and my fellow Ministers, including those at the MoJ, understand the intent behind this amendment, and we are considering very carefully the issues raised by the Post Office cases in relation to computer evidence, including these wider concerns. So I would welcome the opportunity for further meetings with the noble Baroness, alongside MoJ colleagues. I was pleased to hear that she had met with my right honourable friend the Lord Chancellor on this matter.
We are considering, for example, the way reliability of evidence from the Horizon system was presented, how failures of investigation and disclosure prevented that evidence from being effectively challenged, and the lack of corroborating evidence in many cases. These issues need to be considered carefully, with the full facts in front of us. Sir Wyn Williams is examining in detail the failings that led to the Post Office scandal. These issues are not straightforward. The prosecution of those cases relied on assertions that the Horizon system was accurate and reliable, which the Post Office knew to be wrong. This was supported by expert evidence, which it knew to be misleading. The issue was that the Post Office chose to withhold the fact that the computer evidence itself was wrong.
This amendment would also have a significant impact on the criminal justice system. Almost all criminal cases rely on computer evidence to some extent, so any change to the burden of proof would or could impede the work of the Crown Prosecution Service and other prosecutors.
Although I am not able to accept this amendment for these reasons, I share the desire to find an appropriate way forward along with my colleagues at the Ministry of Justice, who will bear the brunt of this work, as the noble Lord, Lord Clement-Jones, alluded to. I look forward to meeting the noble Baroness to discuss this ahead of Report. Meanwhile, I hope she will withdraw her amendment.
Can the Minister pass on the following suggestion? Paul Marshall, who has been mentioned by all of us, is absolutely au fait with the exact procedure. He has experience of how it has worked in practice, and he has made some constructive suggestions. If there is not a full return to Section 69, there could be other, more nuanced, ways of doing this, meeting the Minister’s objections. But can I suggest that the MoJ has contact with him and discusses what the best way forward would be? He has been writing about this for some years now, and it would be extremely useful, if the MoJ has not already engaged with him, to do so.
It may have already done so, but I will certainly pass that on.
I thank everyone who spoke and the Minister for the offer of a meeting alongside his colleagues from the MoJ. I believe he will have a very busy diary between Committee and Report, based on the number of meetings we have agreed to.
However, I want to be very clear here. We have all recognised that the story of the Post Office sub-postmasters makes this issue clear, but it is not about the sub-postmasters. I commend the Government for what they are doing. We await the inquiry with urgent interest, and I am sure I speak for everyone in wishing the sub-postmasters a fair settlement—that is not in question. What is in question is the fact that we do not have unlimited Lord Arbuthnots to be heroic about all the other things that are about to happen. I took it seriously when he said not one moment longer: it could be tomorrow.
My Lords, I am afraid that I will speak to every single one of the amendments in this group but one, which is in the name of the noble Baroness, Lady Jones, and which I have signed. We have already debated the Secretary of State’s powers in relation to what will be the commission, in setting strategic priorities for the commissioner under Clause 32 and in recommending the adoption of the ICO code of practice before it is submitted to Parliament for consideration under Clause 33:
“Codes of practice for processing personal data”.
We have also debated Clause 34:
“Codes of practice: panels and impact assessments”.
And we have debated Clause 35:
“Codes of practice: Secretary of State’s recommendations”.
The Secretary of State has considerable power in relation to the new commission, and then on top of that Clause 143 and Schedule 15 to the Bill provide significant other powers for the Secretary of State to interfere with the objective and impartial functioning of the information commission by the appointment of non-executive members of the newly formed commission. The guarantee of the independence of the ICO is intended to ensure the effectiveness and reliability of its regulatory function and that the monitoring and enforcement of data protection laws are carried out objectively and free from partisan or extra-legal considerations.
These amendments would limit the Secretary of State’s powers and leeway to interfere with the objective and impartial functioning of the new information commission, in particular by modifying Schedule 15 to the Bill to transfer budget responsibility and the appointment process for the non-executive members of the information commission to the relevant Select Committee. If so amended, the Bill would ensure that the new information commission has sufficient arm’s-length distance from the Government to oversee public and private bodies’ uses of personal data with impartiality and objectivity. DSIT’s delegated powers memorandum to the DPRRC barely mentions any of these powers, so I am not surprised that they have attracted little attention, but they are of considerable importance and deserve scrutiny.
We have discussed data adequacy before; of course, in his letter to us, the Minister tried to rebut some of the points we made about it. In fact, he quoted somebody who has briefed me extensively on it and has taken a very different view to the one he alleges she took in a rather partial quotation from evidence taken by the European Affairs Committee, which is now conducting an inquiry into data adequacy and its implications for the UK-EU relationship. We were told by Open Rights Group attendees at a recent meeting with the European Commission that it expressed concern to those present about the risk that the Bill poses to the EU adequacy agreement; this was not under Chatham House rules. It expressed this risk in a meeting at which a number of UK groups were present, which is highly significant in itself.
I mentioned the European Affairs Committee’s inquiry. I understand that the European Parliament’s Committee on Civil Liberties, Justice and Home Affairs has also given written evidence on its concerns about this Bill, its impact on adequacy and how it could affect the agreement. It put its arguments rather strongly. Has the Minister seen this? Is he aware of the written evidence that it has given to the European Affairs Select Committee? I suggest that he becomes aware of it and takes a view on whether we need to postpone Report until we have seen the European Affairs Select Committee’s report. If that committee concludes that the Bill puts data adequacy at risk, the Government will have to go back to the drawing board in a number of respects, and it would be rather foolish if we had already gone through Report by that time. Far be it from me not to want the Government to have egg on their face, but it would be peculiar if they did not carefully observe the evidence being put to the European Affairs Select Committee and the progress of its inquiry. I beg to move.
My Lords, I thank the noble Lord, Lord Clement-Jones, for introducing his amendments so ably. When I read them, I had a strong sense of déjà vu, as attempts by the Government to control the appointments and functioning of new regulators have been a common theme in other pieces of legislation that we have debated in the House and have always resisted. In my experience, this occurred most recently in the Government’s proposals for the Office for Environmental Protection, which was dealing with EU legislation being taken into UK law and is effectively the environment regulator. We were able to get those proposals modified to limit the Secretary of State’s involvement; we should do so again here.
I very much welcome the noble Lord’s amendments, which give us a chance to assess what level of independence would be appropriate in this case. Schedule 15 covers the transition from the Information Commissioner’s Office to the appointment of the chair and non-executive members of the new information commission. We support this development in principle but it is crucial that the new arrangements strengthen rather than weaken the independence of the new commission.
The noble Lord’s amendments would rightly remove the rights of the Secretary of State to decide the number of non-executive members and to appoint them. Instead, his amendments propose that the chair of the relevant parliamentary committee should oversee appointments. Similarly, the amendments would remove the right of the Secretary of State to recommend the appointment and removal of the chair; again, this should be passed to the relevant parliamentary committee. We agree with these proposals, which would build in an additional tier of parliamentary oversight and help remove any suspicion that the Secretary of State is exercising unwarranted political pressure on the new commission.
The noble Lord’s amendments raise the question of what the relevant parliamentary committee might be. Although we are supportive of the wording as it stands, it is regrettable that we have not been able to make more progress on establishing a strong bicameral parliamentary committee to oversee the work of the information commission. However, in the absence of such a committee, we welcome the suggestion made in the noble Lord’s Amendment 256 that the Commons Science, Innovation and Technology Committee could fulfil that role.
Finally, we have tabled Amendment 259, which addresses what is commonly known as the “revolving door”, whereby public sector staff switch to jobs in the private sector and end up working for industries that they were previously supposed to be investigating and regulating. This leads to accusations of cronyism and corruption; whether or not there is any evidence of this, it brings the whole sector into disrepute. Perhaps I should have declared an interest at the outset: I am a member of the Advisory Committee on Business Appointments and therefore have a ringside view of the scale of the revolving door, particularly at the moment. We believe that it is time to put standards in public life back at the heart of public service; setting new standards on switching sides should be part of that. Our amendment would put a two-year ban on members of the information commission accepting employment from a business that was subject to enforcement action or acting for persons being investigated by the commission.
I hope that noble Lords will see the sense and importance of these amendments. I look forward to the Minister’s response.
My Lords, I thank the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Jones of Whitchurch, for their amendments to Schedule 15 to the Bill, which sets out the governance structure of the new information commission.
The ICO governance reforms ensure its accountability to Parliament. Before I go any further, let me stress that the Government are committed to the ICO’s ongoing independence. We have worked closely with the Information Commissioner, who is supportive of the reforms, which they state allow the ICO
“to continue to operate as a trusted, fair and independent regulator”.
The Government’s view, therefore, is that this Bill is compatible with maintaining the free flow of personal data from Europe. These reforms have been designed carefully with appropriate safeguards in place to protect the information commission’s independence and ensure accountability before Parliament on important issues such as public appointments, money and accounts.
The Bill requires the Secretary of State to give a non-executive member a written statement of reasons for removal and to make public the decision to do so, ensuring accountability and transparency. This process is in line with standard practice for other UK regulators, such as Ofcom, which do not require parliamentary oversight for the removal of non-executives.
The chair can be removed only by His Majesty on an Address by both Houses, provided that the Secretary of State presents a report in Parliament stating that they are satisfied that there are serious grounds for removal, as set out in the Bill. This follows the process for the removal of the current Information Commissioner.
Greater performance measurement will help the ICO achieve its objectives and enable it to adjust its resources to prioritise key areas of work. This will also increase accountability to Parliament—a point raised by both noble Lords—organisations and the public, who have an interest in its effectiveness.
The Government are satisfied that these processes safeguard the integrity of the regulator, are in line with best practices for other regulators and, crucially, balance the importance of the information commission’s independence with appropriate oversight by the Government and Parliament as necessary. The regulator is, and remains, accountable to Parliament, not the Government, in its delivery of data protection regulation.
My Lords, I thank the Minister for his response, dusty though it may have been. The noble Baroness, Lady Jones, is absolutely right; this Government have form in all areas of regulation. In every area where we have had legislation related to a regulator coming down the track, the Government have taken more power on and diminished parliamentary oversight rather than enhancing it.
It is therefore a little rich to say that accountability to Parliament is the essence of all this. That is not the impression one gets reading the data protection Bill; the impression you get is that the Government are tightening the screw on the regulator. That was the case with Ofcom in the Online Safety Act; it is the case with the CMA; the noble Baroness, Lady Jones, mentioned her experience as regards the environment. Wherever you look, the Government are tightening their control over the regulators. It is something the Industry and Regulators Committee has been concerned about. We have tried to suggest various formulae. A Joint Committee of both Houses was proposed by the Communications and Digital Committee; it has been endorsed by a number of other committees, such as the Joint Committee on the Draft Online Safety Bill, and I think it has even been commended by the Industry and Regulators Committee as well in that respect.
We need to crack this one. On the issue of parliamentary accountability and oversight for the regulator, the balance is not currently right. That applies particularly to appointments, in this case of the commissioner and the non-executives. The Minister very conveniently talked about removal, but this could equally be about renewal of term, and it is certainly about appointment. So the Minister was perhaps a little selective in the example he chose to show where the control lies.
We are concerned about the independence of the regulator. The Minister did not give an answer, so I hope that he will write about whether he knows what the European Affairs Select Committee is up to. I made a bit of a case on that. Evidence is coming in, and the relevant committee in the European Parliament is giving evidence. The Minister, the noble Viscount, Lord Camrose, was guilty of this in a way, but the way that the data adequacy aspect is seen from this side of the North Sea seems rather selective. The Government need to try to put themselves in the position of the Commission and the Parliament on the other side of the North Sea and ask, “What do we think are the factors that will endanger our data adequacy as seen from that side?” The Government are being overly complacent in regarding it as “safe” once the Bill goes through.
It was very interesting to hear what the noble Baroness had to say about the revolving door issues. The notable thing about this amendment is how limited it is; it is not blanket. It would be entirely appropriate to have this in legislation, given the sensitivity of the roles that are carried out by senior people at the ICO.
However, I think we want to make more progress tonight, so I beg leave to withdraw my amendment.
My Lords, I rise somewhat reluctantly to speak to Amendment 291 in my name. It could hardly be more important or necessary, but I am reluctant because I really think that the Minister, alongside his colleagues in DSIT and the Home Office, should have taken this issue up. I am quite taken aback that, despite my repeated efforts with both of those departments, they have not done so.
The purpose of the amendment is simple. It is already illegal in the UK to possess or distribute child sexual abuse material, including AI-generated or computer-generated child sexual abuse material. However, while the content is clearly covered by existing law, the mechanism that enables its creation—the AI models trained on or trained to create child sexual abuse material—is not. This amendment closes that gap.
Some time ago, I hosted an event at which members of OCCIT—the online child sexual exploitation and abuse covert intelligence team—gave a presentation to parliamentarians. For context, OCCIT is a law enforcement unit of the National Police Chiefs’ Council that uses covert police tactics to track down offender behaviour, with a view to identifying emerging risks in the form of new technologies, behaviours and environments. The presentation its officers gave concerned AI-generated abuse scenarios in virtual reality, and it was absolutely shattering for almost everyone who was present.
A few weeks later, the team contacted me and said that what it had shown then was already out of date. What it was now seeing was being supercharged by the ease with which criminals can train models that, when combined with general-purpose image-creation software, enable those with a sexual interest in children to generate CSAM images and videos at volume and—importantly—to order. Those building and distributing this software were operating with impunity, because current laws are insufficient to enable the police to take action against them.
In the scenarios that the police are now facing, a picture of any child can be blended with existing child sexual abuse imagery, pornography or violent sexual scenarios. Images of several children can be honed into a fictitious child and used similarly or, as I will return to in a moment, a picture of an adult can be made to look younger and then used to create child sexual abuse material. Among this catalogue of horrors are the made-to-order models trained using images of a child known to the perpetrator—a neighbour’s child or a family member—to create bespoke CSAM content. In short, the police were finding that the scale, sophistication and horror of violent child sexual abuse had hit a new level.
The laws that the police use to enforce against CSAM are Section 1 of the Protection of Children Act 1978 and Section 160 of the Criminal Justice Act 1988, both of which create offences in respect of indecent photographs or pseudophotographs of a child. AI content depicting child sexual abuse in the scenarios that I have just described is also illegal under the law, but creating and distributing the software models needed to generate it is not.
There are many services that allow anyone to take any public image and put it in a false situation. Although I have argued elsewhere that AI images should carry a mark of provenance, these services are not the subject of this amendment. This amendment is laser focused on criminalising AI models that are trained on or trained to create child sexual abuse material. They are specific, specialist and being traded with impunity. These models blend images of children—known children, stock photos, images scraped from social media or synthetic, fabricated AI depictions of children—with existing CSAM or pornography, and they allow paedophiles to generate bespoke CSAM scenarios.
My Lords, as ever, the noble Baroness, Lady Kidron, has nailed this issue. She has campaigned tirelessly in the field of child sexual abuse and has identified a major loophole.
What has been so important is learning from experience and seeing how these new generative AI models, which we have all been having to come to terms with for the past 18 months, are so powerful in the hands of ordinary people who want to cause harm and commit sexual abuse. The important thing is that, under existing legislation, there are of course a number of provisions relating to creating deepfake child pornography, the circulation of pornographic deepfakes and so on. However, as the noble Baroness said, what the legislation does not do is go upstream to the AI system—the AI model itself—to make sure that those who develop those models are caught as well. That is what a lot of the discussion around deepfakes is about at the moment—it is, I would say, the most pressing issue—but it is also about trying to nail those AI system owners and users at the very outset, not waiting until something is circulated or, indeed, created in the first place. We need to get right up there at the outset.
I very much support what the noble Baroness said; I will reserve any other remarks for the next group of amendments.
My Lords, I am pleased that we were able to sign this amendment. Once again, the noble Baroness, Lady Kidron, has demonstrated her acute ability to dissect and to make a brilliant argument about why an amendment is so important.
As the noble Lord, Lord Clement-Jones, and others have said previously, what is the point of this Bill? Passing this amendment and putting these new offences on the statute book would give the Bill the purpose and clout that it has so far lacked. As the noble Baroness, Lady Kidron, has made clear, although it is currently an offence to possess or distribute child sex abuse material, it is not an offence to create these images artificially using AI techniques. So, quite innocent images of a child—or even an adult—can be manipulated to create child sex abuse imagery, pornography and degrading or violent scenarios. As the noble Baroness pointed out, this could be your child or a neighbour’s child being depicted for sexual gratification by the increasingly sophisticated AI creators of these digital models or files.
Yesterday’s report from the Internet Watch Foundation said that a manual found on the dark web encourages the use of “nudifying” tools to remove clothes from images of children, which can then be used to blackmail those children into sending more graphic content. The IWF reports that the scale of this abuse is increasing year on year, with 275,000 web pages containing child sex abuse being found last year; I suspect that this is the tip of the iceberg, as much of this activity is occurring on the dark web, which is very difficult to track. The noble Baroness, Lady Kidron, made a powerful point: there is a danger that access to such materials will also encourage offenders who then want to participate in real-world child sex abuse, so the scale of the horror could be multiplied. There are many reasons why these trends are shocking and abhorrent. It seems that, as ever, the offenders are one step ahead of the legislation needed for police enforcers to close down this trade.
As the noble Baroness, Lady Kidron, made clear, this amendment is “laser focused” on criminalising those who are developing and using AI to create these images. I am pleased to say that Labour is already working on a ban on creating so-called nudification tools. The prevalence of deepfakes and child abuse on the internet is increasing the public’s fears about the overall safety of AI, so we need to win back their trust if we are to harness the undoubted benefits that it can deliver to our public services and economy. Tackling this area is one step towards that.
Action to regulate AI by requiring transparency and safety reports from all those at the forefront of AI development should be a key part of that strategy, but we have a particular task to do here. In the meantime, this amendment is an opportunity for the Government to take a lead on these very specific proposals to help clean up the web and rid us of these vile crimes. I hope the Minister can confirm that this amendment, or a government amendment along the same lines, will be included in the Bill. I look forward to his response.
I thank the noble Baroness, Lady Kidron, for tabling Amendment 291, which would create several new criminal offences relating to the use of AI to collect, collate and distribute child abuse images or to possess such images after they have been created. Nobody can dispute the intention behind this amendment.
We recognise the importance of this area. We will continue to assess whether and what new offences are needed to further bolster the legislation relating to child sexual abuse and AI, as part of our wider ongoing review of how our laws need to adapt to AI risks and opportunities. We need to get the answers to these complex questions right, and we need to ensure that we are equipping law enforcement with the capabilities and the powers needed to combat child sexual abuse. Perhaps, when I meet the noble Baroness, Lady Kidron, on the previous group, we can also discuss this important matter.
However, for now, I reassure noble Lords that any child sex abuse material, whether AI generated or not, is already illegal in the UK, as has been said. The criminal law is comprehensive with regard to the production and distribution of this material. For example, it is already an offence to produce, store or share any material that contains or depicts child sexual abuse, regardless of whether the material depicts a real child or not. This prohibition includes AI-generated child sexual abuse material and other pseudo imagery that may have been AI or computer generated.
We are committed to bringing to justice offenders who deliberately misuse AI to generate child sexual abuse material. We demonstrated this as part of the road to the AI Safety Summit, where we secured agreement from NGO, industry and international partners to take action to tackle AI-enabled child sexual abuse. The strongest protections in the Online Safety Act are for children, and all companies in scope of the legislation will need to tackle child sexual abuse material as a priority. Applications that use artificial intelligence will not be exempt and must incorporate robust guard-rails and safety measures to ensure that AI models and technology cannot be manipulated for child sexual abuse purposes.
Furthermore, I reassure noble Lords that the offence of taking, making, distributing and possessing with a view to distribution any indecent photograph or pseudophotograph of a child under the age of 18 carries a maximum sentence of 10 years’ imprisonment. Possession alone of indecent photographs or pseudophotographs of children can carry a maximum sentence of up to five years’ imprisonment.
However, I am not able to accept the amendment, as the current drafting would capture legitimate AI models that have been deliberately misused by offenders, without the knowledge or intent of their creators, to produce child sexual abuse material. It would also inadvertently criminalise individual users who possess perfectly legal digital files with no criminal intent, because those files could, when combined, enable the creation of child sexual abuse material.
I therefore ask the noble Baroness to withdraw the amendment, while recognising the strength of feeling and the strong arguments made on this issue and reiterating my offer to meet with her to discuss this ahead of Report.
I do not know how to express in parliamentary terms the depth of my disappointment, so I will leave it there. Whoever helped the noble Viscount draft his response should be ashamed. We do not have a comprehensive system and the police do not have the capability; they came to me after months of trying to get the Home Office to act, so that is an untruth: the police do not have the capability.
I remind the noble Viscount that in previous debates his response on the bigger picture of AI has been to wait and see, but this is a here-and-now problem. As the noble Baroness, Lady Jones, set out, this would give the Bill purpose and reason—and here it is in front of us; we can act.
My Lords, I will speak to all the amendments in this group, other than Amendment 295 from the noble Baroness, Lady Jones. Without stealing her thunder, I very much support it, especially in an election year and in the light of the deepfakes we have already seen in the political arena—those of Sadiq Khan, those used in the Slovakian election and the audio deepfakes of the President of the US and Sir Keir Starmer. This is a real issue and I am delighted that she has put down this amendment, which I have signed.
In another part of the forest, the recent spread of deepfake photos purporting to show Taylor Swift engaged in explicit acts has brought new attention to the use, which has been growing in recent years, of deepfake images, video and audio to harass women and commit fraud. Women constitute 99% of the victims and the most visited deepfake site had 111 million users in October 2023. More recently, children have been found using “declothing” apps, which I think the noble Baroness mentioned, to create explicit deepfakes of other children.
Deepfakes also present a growing threat to elections and democracy, as I have mentioned, and the problems are increasingly rampant. Deepfake fraud rates rose by 3,000% globally in 2023, and it is hardly surprising that, in recent polling, 86% of the UK population supported a ban on deepfakes. I believe that the public are demanding an urgent solution to this problem. The only effective way to stop deepfakes, which is analogous to what the noble Baroness, Lady Kidron, has been so passionately advocating, is for the Government to ban them at every stage, from production to distribution. Legal liability must hold to account those who produce deepfake technology, create and enable deepfake content, and facilitate its spread.
Existing legislation seeks to limit the spread of images on social media, but this is not enough. The recent images of Taylor Swift were removed from X and Telegram, but not before one picture had been viewed more than 47 million times. Digital watermarks are not a solution, as shown by a paper by world-leading AI researchers released in 2023, which concluded that
“strong and robust watermarking is impossible to achieve”.
Without measures across the supply chain to prevent the creation of deepfakes, the law will forever be playing catch-up.
The Government now intend to ban the creation of sexual imagery deepfakes; I welcome this and have their announcement in my hand:
“Government cracks down on ‘deepfakes’ creation”.
This will send a clear message that the creation of these intimate images is not acceptable. However, this appears to cover only sexual image deepfakes. These are the most prevalent form of deepfakes, but other forms of deepfakes are also causing noticeable and rapidly growing harms, most obviously political deepfakes—as the noble Baroness, Lady Jones, will illustrate—and deepfakes used for fraud. This also appears to cover only the endpoint of the creation of deepfakes, not the supply chain leading up to that point. There are whole apps and companies dedicated to the creation of deepfakes, and they should not exist. There are industries which provide legitimate services—generative AI and cloud computing—which fail to take adequate measures and end up enabling the creation of deepfakes. They should take measures or face legal accountability.
The Government’s new measures are intended to be introduced through an amendment to the Criminal Justice Bill, which is, I believe, currently between Committee and Report in the House of Commons. As I understand it, however, there is no date scheduled yet for Report, as the Bill seems to be caught in a battle over amendments.
The law will, however, be extremely difficult to enforce. Perpetrators are able to hide behind anonymity and are often difficult to identify, even when victims or authorities are aware that deepfakes have been created. The only reliable and effective countermeasure is to hold the whole supply chain responsible for deepfake creation and proliferation. All parties involved in the AI supply chain, from AI model developers and providers to cloud compute providers, must demonstrate that they have taken steps to preclude the creation of deepfakes. This approach is similar to the way in which society combats—or rather, analogous to the way in which, I hope the Minister will concede to the noble Baroness, Lady Kidron, society will combat—child abuse material and malware.
My Lords, I speak to Amendments 293 and 294 from the noble Lord, Lord Clement-Jones, Amendment 295 proposed by my noble friend Lady Jones and Amendments 295A to 295F, also in the name of the noble Lord, Lord Clement-Jones.
Those noble Lords who are avid followers of my social media feeds will know that I am an advocate of technology. Advanced computing power and artificial intelligence offer enormous opportunities, which are not all that bad. However, the intentions of those who use them can be malign or criminal, and the speed of technological developments is outpacing legislators around the world. We are constantly in danger of creating laws that close the stable door long after the virtual horse has bolted.
The remarkable progress of visual and audio technology has its roots in the entertainment industry. It has been used to complete or reshoot scenes in films in the event of actors being unavailable or, in some cases, when actors died before filming was completed. It has also enabled filmmakers to introduce characters, or younger versions of iconic heroes, for sequels or prequels in movie franchises. This enabled us to see a resurrected Sir Alec Guinness and a younger version of Luke Skywalker, or a de-aged Indiana Jones, on our screens.
The technology that can do this is only around 15 years old and, until about five years ago, it required extremely powerful computers, expensive resources and advanced technical expertise. The first malicious use of deepfakes occurred when famous actors and celebrities, usually women, had their faces superimposed on to the bodies of participants in pornographic videos. These were then marketed online as Hollywood stars’ sex tapes or similar, making money for the producers while causing enormous distress to the women targeted. More powerful computer processors inevitably mean that what was once very expensive becomes much cheaper very quickly. An additional factor has turbo-boosted this issue: generative AI. Computers can now learn to create images, sound and video movement almost independently of software specialists. It is no longer just famous women who are the targets of sexually explicit deepfakes; it could be anyone.
Amendment 293 directly addresses this horrendous practice, and I hope that there will be widespread support for it. In an increasingly digital world, we spend more time in front of our screens, getting information and entertainment on our phones, laptops, iPads and smart TVs. What was once an expensive technology, used to titillate, entertain or amuse, has developed an altogether darker presence, well beyond the reach of most legislation.
In addition to explicit sexual images, deepfakes are known to have been used to embarrass individuals, misrepresent public figures, enable fraud, manipulate public opinion and influence democratic political elections and referendums. This damages people individually: those whose images or voices are faked, and those who are taken in by the deepfakes. Trusted public figures, celebrities or spokespeople face reputational and financial damage when their voices or images are used to endorse fake products or to harvest data. Those who are encouraged to click through are at risk of losing money to fraudsters, being targeted for scams, or having their personal and financial data leaked or sold on. There is growing evidence that information obtained under false pretences can be used for profiling in co-ordinated misinformation campaigns, for darker financial purposes or for political exploitation.
In passing, it is worth remembering that deepfakes are not always images of people. Last year, crudely generated fake images of an explosion, purportedly at the Pentagon, caused the Dow Jones industrial average to drop 85 points within four minutes of the image being published, and triggered emergency response procedures from local law enforcement before the image was debunked 20 minutes later. The power of a single image, carefully placed and virally spread, shows the enormous and rapid economic damage that deepfakes can create.
Amendment 294 would make it an offence for a person to generate a deepfake for the purpose of committing fraud, and Amendment 295 would make it an offence to create deepfakes of political figures, particularly when they risk undermining electoral integrity. We support all the additional provisions in this group of amendments; Amendments 295A to 295F outline the requirements, duties and definitions necessary to ensure that those creating deepfakes can be prosecuted.
I bring to your Lordships’ attention the wording of Amendment 295, which, as well as making it an offence to create a deepfake, goes a little further. It also makes it an offence to send a communication which has been created by artificial intelligence and which is intended to create the impression that a political figure has said or done something that is not based in fact. This touches on what I believe to be a much more alarming aspect of deepfakes: the manner in which false information is distributed.
I thank the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Jones of Whitchurch, for tabling the amendments in this important group. I very much share the concerns about all the uses of deepfake images that are highlighted by these amendments. I will speak more briefly than I otherwise would with a view to trying to—
My Lords, I would be very happy to get a letter from the Minister.
I would be happy to write one. I will go for the abbreviated version of my speech.
I turn first to the part of the amendment that seeks to criminalise the creation, alteration or other generation of deepfake images depicting a person engaged in an intimate act. The Government recognise that there is significant public concern about the simple creation of sexually explicit deepfake images, and this is why they have announced their intention to table an amendment to the Criminal Justice Bill, currently in the other place, to criminalise the creation of purported sexual images of adults without consent.
The noble Lord’s Amendment 294 would create an offence explicitly targeting the creation or alteration of deepfake content when a person knows or suspects that the deepfake will be or is likely to be used to commit fraud. It is already an offence under Section 7 of the Fraud Act 2006 to generate software or deepfakes known to be designed for or intended to be used in the commission of fraud, and the Online Safety Act lists fraud as a priority offence and as a relevant offence for the duties on major services to remove paid-for fraudulent advertising.
Amendment 295 in the name of the noble Baroness, Lady Jones of Whitchurch, seeks to create an offence of creating or sharing political deepfakes. The Government recognise the threats to democracy that harmful actors pose. At the same time, the UK also wants to ensure that we safeguard the ability for robust debate and protect freedom of expression. It is crucial that we get that balance right.
Let me first reassure noble Lords that the UK already has criminal offences that protect our democratic processes, such as the National Security Act 2023 and the false communications offence introduced in the Online Safety Act 2023. It is also already an election offence to make false statements of fact about the personal character or conduct of a candidate or about the withdrawal of a candidate before or during an election. These offences have appropriate tests to ensure that we protect the integrity of democratic processes while also ensuring that we do not impede the ability for robust political debate.
I assure noble Lords that we continue to work across government to ensure that we are ready to respond to the risks to democracy from deepfakes. The Defending Democracy Taskforce, which seeks to protect the democratic integrity of the UK, is engaging across government and with Parliament, the UK’s intelligence community, the devolved Administrations, local authorities and others on the full range of threats facing our democratic institutions. We also continue to meet regularly with social media companies to ensure that they continue to take action to protect users from election interference.
Turning to Amendments 295A to 295F, I thank the noble Lord, Lord Clement-Jones, for them. Taken together, they would in effect establish a new regulatory regime in relation to the creation and dissemination of deepfakes. The Government recognise the concerns raised around harmful deepfakes and have already taken action against illegal content online. We absolutely recognise the intention behind these amendments but they pose significant risks, including to freedom of expression; I will write to noble Lords about those in order to make my arguments in more detail.
For the reasons I have set out, I am not able to accept these amendments. I hope that the noble Lord will therefore withdraw his amendment.
My Lords, I thank the Minister for that rather breathless response and his consideration. I look forward to his letter. We have arguments about regulation in the AI field; this is, if you like, a subset of that—but a rather important subset. My underlying theme is “must try harder”. I thank the noble Lord, Lord Leong, for his support and pay tribute to Control AI, which is vigorously campaigning on this subject in terms of the supply chain for the creation of these deepfakes.
Pending the Minister’s letter, which I look forward to, I beg leave to withdraw my amendment.
My Lords, what a relief—we are at the final furlong.
The UK is a world leader in genomics, which is becoming an industry of strategic importance for future healthcare and prosperity, but, frankly, it must do more to protect the genomic sector from systemic competitors that wish to dominate this industry for either economic advantage or nefarious purposes. Genomic sequencing—the process of determining the entirety of an organism’s DNA—is playing an increasing role in our NHS, which has committed to being the first national healthcare system to offer whole-genome sequencing as part of routine care. However, like other advanced technologies, our sector is exposed to data privacy and national security risks. Its dual-use potential means that it can also be used to create targeted bioweapons or genetically enhanced military personnel. We must ensure that a suitable data protection environment exists to maintain the UK’s world-leading status.
So, how are we currently mitigating such threats, and why is our existing approach so flawed? Although I welcome initiatives such as the Trusted Research campaign and the Research Collaboration Advice Team, these bodies focus specifically on research and academia. We expect foreign companies that hold sensitive genomics and DNA data to follow the GDPR. I am not a hawk about relations with other countries, but we need to provide the new Information Commissioner with much greater expertise and powers to tackle complex data security threats in sensitive industries. There must be no trade-off between scientific collaboration and data privacy; that is what this amendment is designed to prevent. I beg to move.
The Committee will be relieved to know that I will be brief. I do not have much to say because, in general terms, this seems an eminently sensible amendment.
We should congratulate the noble Lord, Lord Clement-Jones, on his drafting ingenuity. He has managed to compose an amendment that brings together the need for scrutiny of emerging national security and data privacy risks relating to advanced technology, aims to inform regulatory developments and guidance that might be required to mitigate risks, and would protect the privacy of people’s genomics data. It also picks up along the way the issue of the security services scrutinising malign entities and guiding researchers, businesses, consumers and public bodies. Bringing all those things together at the end of a long and rather messy Bill is quite a feat—congratulations to the noble Lord.
I am rather hoping that the Minister will tell the Committee either that the Government will accept this wisely crafted amendment or that everything it contains is already covered. If the latter is the case, can he point noble Lords to where those things are covered in the Bill? Can he also reassure the Committee that the safety and security issues raised by the noble Lord, Lord Clement-Jones, are covered? Having said all that, we support the general direction of travel that the amendment takes.
I would be extremely happy for the Minister to write.
Nothing makes me happier than the noble Lord’s happiness. I thank him for his amendment and the noble Lord, Lord Bassam, for his points; I will write to them on those, given the Committee’s desire for brevity and the desire to complete this stage tonight.
I wish to say some final words overall. I sincerely thank the Committee for its vigorous—I think that is the right word—scrutiny of this Bill. We have not necessarily agreed on a great deal, but I am in awe of the level of scrutiny and the commitment to making the Bill as good as possible. Let us be absolutely honest—this is not the most entertaining subject, but it is something that we all take extremely seriously and I pay tribute to the Committee for its work. I also extend sincere thanks to the clerks and our Hansard colleagues for agreeing to stay a little later than planned, although that may not even be necessary. I very much look forward to engaging with noble Lords again before and during Report.
My Lords, I thank the Minister, the noble Baroness, Lady Jones, and all the team. I also thank the noble Lord, Lord Harlech, whose first name we now know; these things are always useful to know. This has been quite a marathon. I hope that we will have many conversations between now and Report. I also hope that Report is not too early as there is a lot to sort out. The noble Baroness, Lady Jones, and I will be putting together our priority list imminently but, in the meantime, I beg leave to withdraw my amendment.