Lords Chamber
That this House do not insist on its Amendments 9 and 19, to which the Commons have disagreed for their Reason 19A.
My Lords, I will also speak to Motions A1, B, B1, C, C1, C2 and D.
I start by thanking noble Lords for their constructive input and careful scrutiny during the passage of the Bill. We have created legislation that will drive innovation and deliver better outcomes for consumers across the UK by addressing barriers to competition in digital markets and tackling consumer rip-offs.
The Bill has been strengthened in many places in this House. However, today, I will speak to Motions A to D, which address amendments that remain to be agreed across the Bill. The Government ask that this House does not insist on the amendments rejected in the other place and that it agrees to the amendment proposed in lieu of changes proposed by noble Lords.
Does the Minister not agree that, since with a merits appeal a fine could be reduced to nugatory amounts, that would be considered equivalent to a full merits review of the substantive decision?
That would be in respect only of the fine itself. Any other element of the decision, such as the imposition of new conduct requirements or other actions taken to correct anti-competitive effects in the market, would stand and would have been standing throughout the appeal in any event.
I turn to Motion B, which addresses Amendments 12 and 13, on the countervailing benefits exemption, moved by the noble Baroness, Lady Jones of Whitchurch. The amendment looks to revert the clause to its original wording of
“the conduct is indispensable ... to … those benefits”.
The Government’s revised wording, which replaces “indispensable”, does not change the effect of the clause. It still requires the same high threshold to be met and has the same safeguards. To qualify for the exemption, SMS firms must establish that all the criteria are met. There must be no other reasonable, practicable way to achieve the same benefits to consumers with a less anti-competitive effect. I hope that noble Lords feel reassured that the Government’s drafting maintains the same robust threshold and keeps consumers at the heart of the pro-competition regime.
Your Lordships will remember Amendment 38, tabled by my noble friend Lord Lansley, which sought to place in the Bill a 40-day timeframe for the Secretary of State’s approval of CMA guidance. The Government listened carefully to concerns led by my noble friend relating to a risk of delay in the digital markets regime. We are absolutely committed to getting this regime up and running to start fixing competition problems and deliver greater consumer benefit.
To reinforce this commitment, the Government have tabled Amendment 38A in lieu. This takes the spirit of my noble friend’s amendment and merely adjusts the time limit to working days to align with other timelines in the Bill. It also requires reasons to be given if guidance is not approved within the time limit. I hope that this provides reassurances to noble Lords about our commitment to the digital markets regime. I thank my noble friend for championing this matter in earlier debates and for his support for the amendment in lieu.
Once again, I thank noble Lords for their contributions during the Bill’s passage and I look forward to others during this debate. Across this House, we are all committed to making the DMCC Bill the best and most effective legislation it can be. I therefore invite noble Lords to agree the government Motions before them. I beg to move.
Motion A1 (as an amendment to Motion A)
My Lords, I thank all noble Lords who have contributed to the debate today and, of course, throughout the development of this legislation. It has been a characteristically brilliant debate; I want to thank all noble Lords for their various and valuable views.
I turn first to the Motions tabled by the noble Lord, Lord Faulks, in relation to appeals and proportionality. I thank him for his continued engagement and constructive debate on these issues. We of course expect the CMA to behave in a proportionate manner at all times as it operates the digital markets regime. However, today we are considering specifically the statutory requirement for proportionality in the Bill. We are making it clear that the DMU must design conduct requirements and PCIs to place as little burden as possible on firms, while still effectively addressing competition issues. The proposed amendments would not remove the reference to proportionality in Clause 21 and so, we feel, do not achieve their intended aim, but I shall set out the Government’s position on why proportionality is required.
On the question of the wording of “appropriate” versus “proportionate”, proportionality is a well-understood and precedented concept with a long history of case law. “Appropriate” would be a more subjective threshold, giving the CMA broader discretion. The Government’s position is that proportionality is the right threshold to be set in legislation because, in the vast majority of cases, it would apply in any event owing to ECHR considerations. It is the Government’s view that the same requirement for proportionality should apply whether or not ECHR rights are engaged.
As Article 1 of Protocol 1—A1P1—of the European Convention on Human Rights will apply to the vast majority of conduct requirements and PCIs imposed by the CMA, with the result that the courts will apply a proportionality requirement, we consider it important that it should be explicit that there is a statutory proportionality requirement for all conduct requirements and PCIs. We believe that proportionality should be considered beyond just those cases where A1P1 may apply, in particular when a conduct requirement or PCI would impact future contracts of an SMS firm.
The courts’ approach to proportionality in relation to consideration of ECHR rights has been set out by the Supreme Court, and we do not expect them to take a different approach here. Furthermore, the CAT will accord respect to the expert judgments of the regulator and will not seek to overturn its judgments lightly. I hope this answers the question put by the noble Lord, Lord Faulks.
On appeals, I thank noble Lords for their engagement on this matter, and in particular the noble Baroness, Lady Jones of Whitchurch, for setting out the rationale for her Amendments 32B and 32C, which seek to provide further clarity about where on-the-merits appeals apply. I want to be clear that the Government’s intention is that only penalty decisions will be appealable on the merits and that this should not extend to earlier decisions about whether an infringement occurred. I do not consider these amendments necessary, for the following reasons.
The Bill draws a clear distinction between penalty decisions and those about infringements, with these being covered by separate Clauses 89 and 103. There is a Court of Appeal precedent in BCL v BASF 2009 that, in considering a similar competition framework, draws a clear distinction between infringement decisions and penalty decisions. The Government consider that the CAT and the higher courts will have no difficulty in making this distinction for digital markets appeals to give effect to the legislation as drafted.
I now turn to the Motion tabled by the noble Lord, Lord Clement-Jones, in respect of the countervailing benefits exemption. I thank the noble Lord for his engagement with me and the Bill team on this important topic. The noble Lord has asked for clarification that the “indispensability” standard in Section 9 of the Competition Act 1998, and the wording,
“those benefits could not be realised without the conduct”,
are equivalent to each other. I want to be clear that the exemption within this regime and the exemption in Section 9 of the Competition Act 1998 are different. This is because they operate in wholly different contexts, with different criteria and processes. This would be the case however the exemption is worded in this Bill. That is why the Explanatory Notes refer to a “similar” exemption, because saying it is “equivalent” would be technically incorrect.
Having said that, the “indispensability” standard and the threshold of the Government’s wording,
“those benefits could not be realised without the conduct”,
are equally high. While the exemptions themselves are different, I hope I can reassure noble Lords that the Government’s view is that the standard—the height of the threshold—is, indeed, equivalent. The Government still believe that the clarity provided by simplifying the language provides greater certainty to all businesses, while ensuring that consumers get the best outcomes.
I thank the noble Lord, Lord Clement-Jones, for his question in relation to the Google privacy sandbox case. The CMA considers a range of consumer benefits under its existing consumer objective. This can include the privacy of consumers. It worked closely with the ICO to assess data privacy concerns in its Google privacy sandbox investigation and we expect it would take a similar approach under this regime.
I urge all noble Lords to consider carefully the Motions put forward by the Government and hope all Members will feel able—
Indeed. In principle I am very happy to update the Explanatory Notes, but I need to engage with ministerial colleagues. However, I see no reason why that would not be possible.
Meanwhile, I hope all noble Lords will feel able to support the Government’s position.
My Lords, before the Minister sits down, may I just press him on proportionality? I understand the argument to be that a proportionality test should be applied in this context even though it is not required in all cases by the European Convention on Human Rights. I see the Minister nodding. Will that now be the general position of the Government, because it is not the law in relation to judicial review generally that there is a proportionality test? If that is to be the position of the Government, it would be a very significant development which some of us would welcome and some of us would not. I declare an interest, of course, as one of those lawyers referred to by the noble Baroness, Lady Jones, as looking to take advantage on behalf of their clients. It is a very real issue; how far does this go?
It goes only so far as its application to the Bill now. I am not aware of any further measures to take it into other Bills and would not expect to see any.
My Lords, I am grateful for the Minister’s response on that issue. I asked him the same question that I have asked throughout these proceedings—it is the same question posed by the noble Lord, Lord Pannick—and there does not seem, with great respect, to be an answer to it. The Minister has mostly allowed, to use a cricketing metaphor, the matter to go past the off stump without playing a shot. What really seems to be the position is that he says that proportionality will apply, even if the Human Rights Act or a convention right is not involved. But I think that, in answer to the noble Lord, Lord Pannick, the Minister is saying, “But only in the case of this Bill”. What that means is that big tech is getting a special privilege not afforded to any other litigant in any other context. I ask noble Lords, “Is that a good look?” I do not think that it is.
The Commons reason for preferring “proportionate” to “appropriate” reads as follows:
“Because it is appropriate for the CMA to be required to act proportionately in relation to conduct requirements and pro-competition interventions”.
I do not know whether that was supposed to be a joke, but it is profoundly unsatisfactory. The Government have missed a trick—or rather, they have succumbed to considerable pressure. I welcome the Bill because there is a great deal about it which is good. Having thought very carefully, and with considerable reluctance, I propose to withdraw my amendment.
That this House do not insist on its Amendments 12 and 13, to which the Commons have disagreed for their Reason 13A.
My Lords, I have already spoken to Motion B. I beg to move.
Motion B1 (as an amendment to Motion B)
Tabled by
Leave out from “House” to end and insert “do not insist on its Amendment 12, to which the Commons have disagreed for their Reason 13A, and do insist on its Amendment 13.”
That this House do not insist on its Amendments 26, 27, 28, 31 and 32, to which the Commons have disagreed for their Reason 32A.
That this House do not insist on its Amendment 38, and do agree with the Commons in their Amendment 38A in lieu.
Lords Chamber
My Lords, I regret that I was unable to speak at Second Reading of the Bill. I am grateful to the government Benches for allowing my noble friend Lady Twycross to speak on my behalf on that occasion. However, I am pleased to be able to return to your Lordships’ House with a clean bill of health, to speak at Third Reading of this important Bill. I congratulate the noble Lord, Lord Holmes of Richmond, on the progress of his Private Member’s Bill.
Having read the whole debate in Hansard, I think it is clear that there is consensus about the need for some kind of AI regulation. The purpose, form and extent of this regulation will, of course, require further debate. AI has the potential to transform the world and deliver life-changing benefits for working people: whether delivering relief through earlier cancer diagnosis or relieving traffic congestion for more efficient deliveries, AI can be a force for good. However, the most powerful AI models could, if left unchecked, spread misinformation, undermine elections and help terrorists to build weapons.
A Labour Government would urgently introduce binding regulation and establish a new regulatory innovation office for AI. This would make Britain the best place in the world to innovate, by speeding up decisions and providing clear direction based on our modern industrial strategy. We believe this will enable us to harness the enormous power of AI, while limiting potential damage and malicious use, so that it can contribute to our plans to get the economy growing and give Britain its future back.
The Bill sends an important message about the Government’s responsibility to acknowledge and address how AI affects people’s jobs, lives, data and privacy, in the rapidly changing technological environment in which we live. Once again, I thank the noble Lord, Lord Holmes of Richmond, for bringing it forward, and I urge His Majesty’s Government to give proper consideration to the issues raised. As ever, I am grateful to noble Lords across the House for their contributions. We support and welcome the principles behind the Bill, and we wish it well as it goes to the other place.
My Lords, I too sincerely thank my noble friend Lord Holmes for bringing forward the Bill. Indeed, I thank all noble Lords who have participated in what has been, in my opinion, a brilliant debate.
I want to reassure noble Lords that, since Second Reading of the Bill in March, the Government have continued to make progress in their regulatory approach to artificial intelligence. I will take this opportunity to provide an update on just a few developments in this space, some of which speak to the measures proposed by the Bill.
First, the Government want to build public visibility of what regulators are doing to implement our pro-innovation approach to AI. Noble Lords may recall that we wrote to key regulators in February asking them for an update on this. Regulators have now published their updates, which include an analysis of AI-related opportunities and risks in the areas that they regulate, and the actions that they are taking to address these. On 1 May, we published a GOV.UK page where people can access each regulator’s update.
We have taken steps to establish a multidisciplinary risk-monitoring function within the Department for Science, Innovation and Technology, bringing together expertise in risk, regulation and AI. This expertise will provide continuous examination of cross-cutting AI risks, including evaluating the effectiveness of interventions by government and regulators.
Before the noble Viscount sits down, he listed a whole series of activities that are very welcome, but I said at Second Reading that I felt the Government were losing momentum, because the Prime Minister had set an international lead: the United Kingdom was going to lead the world and would be an example to everybody. It seems, with the Minister’s statement, that we have slipped back now. The European Union has set out its stall. If we are not going to have a legislative framework, we need to know that. I just hope the Government will reflect that the position the Prime Minister adopted at the beginning of this process was innovative, positive and good for the United Kingdom as a whole, but I fear that the loss of momentum means we will be slipping back down at a very rapid rate.
I thank the noble Lord for his comments. I am not sure I accept the characterisation of a loss of momentum. We are, after all, co-hosting the AI safety summit along with our Korean friends in a couple of weeks. On moving very quickly to legislation, it has always been the Government’s position that it is better to have a deeper understanding of the specific risks of AI across each sector and all sectors before legislating too narrowly, and that there is a real advantage to waiting for the right moment to have judicious legislation that addresses specific risks, rather than blanket legislation that goes to all of them.
Lords Chamber
My Lords, I beg leave to ask the Question standing in my name on the Order Paper and declare my technology interests, as set out in the register, as an adviser to Boston Ltd.
My Lords, this is a complex and challenging area. We strongly support AI innovation in the UK, but this cannot be at the expense of our world-leading creative industries. Our goal is that both sectors should be able to grow together in partnership. We are currently working with DCMS to develop a way forward on copyright and AI. We will engage closely with interested stakeholders as we develop our approach.
My Lords, our great UK creatives—musicians who make such sweet sounds where otherwise there may be silence; writers who fill the blank page with words of meaning that move us; our photographers; our visual artists—are a creative community contributing billions to the UK economy, growing at twice the rate of the UK economy. In the light of this, why are the Government content for their work, their IP and their copyright to be so disrespected and unprotected in the face of artificial intelligence?
I thank my noble friend for the important point he raises, particularly stressing the importance to the United Kingdom of the creative industry, which contributes 6% of our GVA every year. The Government are far from content with this position and share the frustrations and concerns of many across the sector in trying to find a way forward on the AI and copyright issue. As I say, it is challenging and deeply complex. No jurisdiction anywhere has identified a truly satisfactory solution to this issue, but we continue to work internationally and nationally to seek one.
My Lords, I am grateful to my noble friend for giving way. As I said in my letter last week to the Secretary of State on behalf of the Communications and Digital Select Committee, the Government’s reluctance to take a clear position on copyright in the context of AI and large language models is leading to
“problematic business models … becoming entrenched and normalised”.
The Government urgently need to take a clear position, and soon. On a practical basis, what support are they giving to market-led initiatives to improve licensing deals for news publishers and to get collective licensing regimes off the ground, to ensure that smaller rights-holders are also not left behind?
I thank my noble friend and her committee for that important letter. First, we must not underestimate the difficulty and complexity of the issues involved in resolving this question; there are very problematic jurisdictional and technical issues. That said, the Government greatly welcome any arrangement between private sector organisations finding a way forward on this; we can all learn a great deal from the success of those arrangements. We believe that a collaborative way forward on both sides, in partnership, will be a very important part of the eventual solution.
My Lords, the Minister was right to say that we should recognise that AI can bring opportunities to the creative sector. For example, nearly a decade after a near-fatal stroke, the musician Randy Travis has released a new song featuring AI-generated vocals. This has been done with his consent and the involvement of his record label, but elsewhere, as we have heard, AI tools are being widely used to create music in the style of established artists, despite no permission having been given and a total lack of creative control on the part of those artists and their representatives. Can the Minister outline how the Government are actively involving musicians, artists and writers in determining how best to protect that very precious intellectual property, while allowing creativity to flourish? I echo the noble Baroness’s theme: this is an urgent matter and we would like to hear how the Government will address it.
The issue raised by the noble Baroness is of deep concern to everybody. As I say, there are some very serious problems, not least regarding the jurisdiction where any alleged infringement may or may not have taken place. Of course, any jurisdiction that implements rules one way or the other will find that the AI work she sets out so compellingly is simply offshored elsewhere. The Government engage very closely with creative groups, including fair remuneration groups for musicians and many others, and will continue to do so, looking for a solution to this difficult problem.
My Lords, the noble Viscount told the Lords Communications and Digital Select Committee that he did not
“believe that infringing the rights of copyright holders is a necessary precondition for developing successful AI”.
Does he still hold to that view, and does he accept that it should be the clearly stated view of the Government?
I do not believe that the AI industry, in the long term, will require long-standing copyright infringement for its success. That is and continues to be the Government’s view; any unauthorised use or copying of intellectual property or copyrighted material is an infringement. Of course, there is a range of exceptions to that, and there is the possibility of giving permission for that to happen. It becomes a very complex area, both legally and technologically, but, as I say, we continue to look for a solution.
I am most grateful. AI is already creating IP of its own, but it is unable to register it because, by law, a human being needs to register IP. In trying to create a legislative bottle into which this genie could be reinserted, have the Government taken that into account?
The noble Lord raises a very interesting question. The laws surrounding the copyrighting of machine-generated content are getting fairly elderly now and certainly need to be looked at as part of the overall position going forward.
My Lords, it is clear from the questions being raised that this is a very complex area, not least because we are bringing together the creative and legal industries and the technologists who are programming, developing and trying to stay ahead of the curve on this. What plans do the two departments involved in developing the code of practice have urgently to engage with industry—especially over the summer, when there will be a number of events and activities, including London Tech Week—so that we can more quickly develop the code and other requirements?
Perhaps my noble friend will forgive me if I gaze into a crystal ball for a moment and predict that the eventual solution to this will involve three elements: first, some modifications to our copyright legislation; secondly, some use of technology to enable a solution; and thirdly, internationally accepted standards of interoperability in any eventual solution. We engage widely with techUK and other technology partners, but above all we engage extensively internationally. I point to our specific engagements with the World Intellectual Property Organization, the UN agency the ITU, and of course the follow-up to the AI Safety Summit, which we are co-hosting in Seoul in a couple of weeks’ time.
My Lords, what action are the Government taking to compel AI companies to implement measures to monitor and report IP infringements?
One of the principles we set out in our AI White Paper is transparency. That principle—repeated across the OECD and in the EU’s AI Act—will go a long way towards doing what the noble Baroness asks. There are, though, a number of technical difficulties in implementing transparency—not legally, from our side, but rather, the computer science problems associated with processing the very large quantities of data required to generate true transparency.
My Lords, a lot of people are excited by the prospects for AI. Indeed, this country is in the lead in developing such policies and the associated opportunities. As one of those involved in preparing the GDPR in Brussels, I am concerned that the opportunities and excitement associated with the use of AI must be balanced against the protection of individual privacy and the rights of corporate structures and individuals who are worried about the abuses that might occur unless legislators are up to date and moving fast enough to deal with these matters.
My noble friend makes some important points: AI must advance on the back of well-executed data protection. Let me take the opportunity to thank him for his outstanding contributions during the recently completed Committee stage of the Data Protection and Digital Information Bill. We continue to share the goals that he set out.
Lords Chamber
To ask His Majesty’s Government what steps they are taking to ensure political deepfakes on social media are not used to undermine the outcome of the general election.
My Lords, we are working to ensure we are ready to respond to the full range of threats to our democratic processes, including through the Defending Democracy Taskforce. It is already an election offence to make false statements of fact about the personal character or conduct of a candidate before or during an election. Additionally, under the Online Safety Act, where illegal political deepfakes are shared on social media, they must be removed.
My Lords, Google’s Kent Walker has talked of the “very serious” threat posed by AI-generated deepfakes and disinformation. The Prime Minister, the Leader of the Opposition and the Mayor of London have all been the subject of deepfakes, so it is not surprising that the Home Secretary has identified a critical window for collective action to preserve the integrity of the forthcoming election. Obviously, monitoring online content is important, but that will not prevent malign individuals or hostile foreign states trying to interfere in the forthcoming elections at home and abroad. Will the Minister finally take up our proposals to use the Data Protection Bill to fill the deepfake gap left by the Online Safety Act so that we can all have confidence in the outcome of the general election?
I start by saying that I very much share the view of the importance of protecting the forthcoming general election—and indeed every election—from online deepfakes, whether generated by AI or any other means. I think it is worth reminding the House that a range of existing criminal offences, such as the foreign interference offence, the false communications offence and offences under the Representation of the People Act, already address the use of deepfakes to malignly influence elections. While these Acts will go some way towards deterring such conduct, I also think it is important to remind the House of the crucial non-legislative measures that we can take, continue to take and will take up to the completion of the election.
My Lords, would my noble friend not agree that there is an issue regarding the distortion of what politicians say, both through video and through the written word? Would he give me some indication of what the position is regarding Hansard and the coverage of what is said in this House and in the other place? Are we sufficiently protected if that written record is distorted or abused by others in the media?
Indeed—and let me first thank my noble friend for bringing up this important matter. That sounds to me like something that would be likely to be caught by the false communications offence in the Online Safety Act—Section 179—although I would not be able to say for sure. The tests that it would need to meet are that the information would have to be knowingly false and cause non-trivial physical or psychological harm to a likely audience, but that would seem to be the relevant offence.
My Lords, does not the Question from the noble Baroness, Lady Jones, highlight that we must hold to account with legal liability not only those who create this kind of deepfake content and facilitate its spread, but those who enable the production of deepfakes with software, such as by having standards and risk-based regulation for generative AI systems, which the Government in their White Paper have resolutely refused to do?
The Government set out in their White Paper response that off-the-shelf AI software that can in part be used to create these kinds of deepfakes is not, in and of itself, something that we are considering placing any ban on. However, there is a range of software, a sort of middle layer in AI production, that can greatly facilitate the production of deepfakes of all kinds, not just political but other kinds of criminal deepfakes—and there the Government would actively consider moving against those purpose-built criminal tools.
My Lords, given the use of deepfakes and malign disinformation facilitated by data theft, has the noble Viscount taken note of what the Biden Administration decided to do last week? The President signed into law the ability to ban TikTok, and the Chinese-owned company that owns it, because of America’s experience in the mid-term elections in 2022 and the elections in Taiwan earlier this year. Does the Minister not worry that, unless we take similar powers in the United Kingdom, the same thing will happen here?
Well, some of the enforcement measures under the Online Safety Act do allow for very significant moves against social media platforms that misuse their scale and presence to malign ends in this way, but of course the noble Lord is absolutely right and we will continue to look closely at the moves by the Biden Administration to see what we can learn from them for our approach.
My Lords, I pay tribute to Andy Street for the way he responded to the circumstances in what was an incredibly close race. He must have been hugely disappointed. Sadly, another candidate in that race has since made false accusations of racism against a Labour volunteer, posting the volunteer’s name, picture and social media account, with the result that the volunteer subsequently received death threats in both calls and emails. Will the Minister join all noble Lords in condemning this kind of behaviour and confirm that, in his view, attacking party volunteers falls fully within the range of threats to the democratic process?
First, let me absolutely endorse the noble Lord’s sentiment: this is a deplorable way to behave that should not be tolerated. From hearing the noble Lord speak of the actions, my assumption is that they would fall foul of the false communications offence under Section 179 of the Online Safety Act. As I say, these actions are absolutely unacceptable.
My Lords, noble Lords will be aware of the threat of AI-generated deepfake election messages flooding the internet during an election campaign. At the moment, only registered users have to put a digital imprint giving the provenance of the content on unpaid election material. Does the Minister think that a requirement to put a digital imprint on all unpaid election material should be introduced to counter fake election messages?
The noble Viscount is right to point to the digital imprint regime as one of the tools at our disposal for limiting the use of deepfakes. I think we would hesitate to have a blanket law that all materials of any kind would be required to have a digital imprint on them—but, needless to say, we will take away the idea and consider it further.
My Lords, if, at the very height of the forthcoming general election, deepfakes were to emerge, what would be the role of Ofcom, in particular regarding the taking down of material that is manifestly false? Does Ofcom have the resources necessary to do this?
In the regrettable scenario mentioned by the noble Lord, such actions would generally fall to the Joint Election Security and Preparedness Unit and the election cell that will have been set up for the duration of the election to conduct rapid operational rebuttal and other responses to such things. We would not necessarily look to Ofcom until after the event because of the speed at which things would have to move.
My Lords, it is not just technology that can undermine the outcome of general elections; the Government are facilitating it, too. Jacob Rees-Mogg, former Business Secretary, famously said that voter ID rules were an attempt to “gerrymander” the electoral system. Does the Minister have any empirical evidence to show that the introduction of the voter ID system has reduced alleged fraud or encouraged more people to vote?
It is a very interesting question, but I am afraid I have no information on that as it is not DSIT’s area at all. I will be very happy to find out and write to the noble Lord if that would help.
(7 months, 3 weeks ago)
Lords ChamberTo ask His Majesty’s Government what steps they are taking to protect sensitive research at universities from national security threats.
The Government are implementing a range of legislative and non-legislative measures, including the Research Collaboration Advice Team, which provides advice to academia on national security risks in international collaboration. The integrated review refresh committed to review the effectiveness of existing protections. The Department for Science, Innovation and Technology is leading this review, and the Deputy Prime Minister announced last week that the Government will consult on the response in the summer.
I am grateful to my noble friend, but are our universities not compromising their independence by becoming overreliant on China? Some 25% of the students, or 10,000, at UCL are Chinese, which risks the infiltration of academic research and, in the words of the Deputy Prime Minister, coercion, exploitation and vulnerability. While I welcome the recent Statement, what steps will the Government take to replace lost Chinese funding for our universities, so that the UK remains at the forefront of technological research?
I thank my noble friend for the question. The first thing to say is that the independence of universities is absolutely critical to the quality of their research. While the integrated review refresh has of course indicated a great many concerns about working closely with China, and necessitated a reduction in academic collaboration with China, I hope that our recent reassociation with the Horizon programme, with a number of other third countries also considering or very close to associating with Horizon, will go some way towards providing a new pool of collaboration partners in academic research.
My Lords, I am sure that all of us agree with the noble Lord, Lord Young, that we need to protect scientific development from malign actors. But is there not a real problem here—that new technology and advances in scientific knowledge not only require international collaboration, on a scale hitherto unknown, but that most of it, ever since the bow and arrow, is dual-purpose? In other words, it can be used for benevolent or malign reasons. How do the departments charged with this responsibility distinguish between these two, so that in protecting us from the misuse of scientific advances, they are not smothering scientific research as a whole?
The noble Lord is absolutely right in his analysis of the problem, which I agree with wholeheartedly. The most powerful tool we have at our disposal in this is RCAT—the Research Collaboration Advice Team—which provides hundreds of individual items of advice in these areas, where it can actually be quite subtle whether something is dual or single-use or has a military or defence application. It is not something that can be very easily defined up front, and does require a certain wisdom and delicacy of advice to provide that.
My Lords, last week the Statement did not seem to say very much about which actors might be under consideration. The noble Lord, Lord Young, has already mentioned China, but do His Majesty’s Government also think that Iran and other countries might be a problem—not by giving funding, but by researchers and students coming? If that is the case, can His Majesty’s Government really expect universities to vet individuals? Is that not the role for government? I declare my interest as a professor at Cambridge.
The noble Baroness raises a very important point; it is not about naming one or more countries and targeting them. The non-legislative and legislative elements of the entire approach to this are about being actor agnostic, and simply looking at the cases as they arise.
My Lords, further to the points made by my noble friend, the Government said they are taking a range of measures, but if you take an area like biosecurity, which I am sure the Minister will agree is a very significant potential future threat, with people perhaps developing pathogens, aided possibly by using AI technology to do them more easily and quickly, is there not a case for mandatory surveillance over, for example, access to materials, which would indicate where somebody might be trying to do something that has that dual purpose—in other words, something bad rather than something good? Does the Minister agree that a voluntary scheme, such as I understand exists at the moment, may not be enough?
Indeed, and we must recognise that there are limits to a voluntary scheme, particularly where actors are genuinely malign. I reassure the noble Viscount that any research contracted for purposes of defence, or indeed for purposes that might be used for defence, would be subject to vetting in the usual way: the more sensitive the nature of the research, the greater the vetting.
My Lords, I declare an interest as an honorary fellow of the University of Strathclyde. This challenge to our universities is both fast-moving and intensifying in complexity. Now, the Russell Group comprises some universities across the United Kingdom, but not all. Universities UK represents many universities across the United Kingdom, but not all. Is there, or are there plans for, a United Kingdom Government security portal, accessible to all universities across the United Kingdom, for immediate advice and information, if they have concerns?
I thank my noble friend for that. Yes, the university sector absolutely does go far beyond just the Russell Group. We must take account of all its needs. The review of protections for higher education and academia is now entering its second phase. There will be consultation on that over the summer. An area it will look at is precisely the mechanics that my noble friend puts forward as to how this kind of transparency can best be delivered with the minimum possible administrative overhead.
My Lords, does the noble Viscount recall that, as long ago as September 2023, his noble friend Lord Johnson of Marylebone, in conjunction with King’s College, produced a report warning about the dangers which the noble Lord, Lord Young of Cookham, mentioned to the House? It called for diversification of the population base of our universities, which had become too reliant on money flowing in from China. Will he also comment on the case that was raised in the media last month of Professor Michelle Shipworth, who was banned from teaching what was called a “provocative” course at a prestigious university, UCL, simply because it might compromise commercial interests—that is, the flow of money from China?
I certainly recognise the concern that overseas undergraduates tend to come very largely from a small number of countries, and the value of diversifying from that. I am afraid I am not familiar with the case the noble Lord mentions. I am very happy to write to him about it. It sounds extremely concerning.
My Lords, upholding national security is the first duty of any Government. To that end, we welcome the Government’s recent briefing for vice-chancellors and the intention to consult on how better to protect UK research from academic espionage. Given the importance of and the likely increase in these threats, does the Minister think it would be reasonable for the Deputy Prime Minister and the Secretary of State to offer similar briefings to their shadow counterparts?
I would be very happy to raise that with them and ask them to do so. I take the noble Baroness’s point. There is nothing more important for us to do than look after our security, and research security is a very serious component of that.
Would the Minister recognise that it is extremely important that his department works closely with the Home Office on this? I noticed last week the warning, from my successor but three at MI5, to vice-chancellors of the threat from Chinese espionage in universities, much of which will be by students under coercion. If I may answer the noble Baroness’s question about who you can go to, there is an organisation but such is my senility that I cannot remember its name. I will look it up. It is connected very closely to MI5, but it is the public-facing organisation to which you go with concerns. It starts “National Protective Security”, I think, but a quick look on my telephone has not revealed the answer, so I will talk to her later. The Minister probably knows the answer, but I am afraid I do not.
I am consulting the lengthy list of acronyms that I wrote down in preparing for this, but I am not sure I have the right one. I take the noble Baroness’s point very seriously. We work extremely closely on this with the Home Office. A number of the legislative provisions keeping our research secure belong to the Home Office and we continue to work closely with it. As to the exact agency she mentioned, I will find out from my officials and write to her.
(7 months, 4 weeks ago)
Grand CommitteeMy Lords, having listened carefully to representations from across the House at Second Reading, I am introducing this amendment to address concerns about the data preservation powers established in the Bill. The amendment provides for coroners, and procurators fiscal in Scotland, to initiate the data preservation process when they decide it is necessary and appropriate to support their investigations into a child’s death, irrespective of the suspected cause of death.
This amendment demonstrates our commitment to ensuring that coroners and procurators fiscal can access the online data they may need to support their investigation into a child’s death. It is important to emphasise that coroners and procurators fiscal, as independent judges, have discretion about whether to trigger the data preservation process. We are grateful to the families, Peers and coroners whom we spoke to in developing these measures. In particular, I thank the noble Baroness, Lady Kidron, who is in her place. I beg to move.
My Lords, it is an unusual pleasure to support the Minister and to say that this is a very welcome amendment to address a terrible error of judgment made when the Government first added the measure to the Bill in the other place and excluded data access for coroners in respect of children who died by means other than suicide. I shall not replay here the reasons why it was wrong, but I am extremely glad that the Government have put it right. I wish to take this opportunity to pay tribute to those past and present at 5Rights and the NSPCC for their support and to those journalists who understood why data access for coroners is a central plank of online safety.
I too recognise the role of the Bereaved Families for Online Safety. They bear the pain of losing a child and, as their testimony has repeatedly attested, not knowing the circumstances surrounding that death is a particularly cruel revictimisation for families, who never lose their grief but simply learn to live with it. We owe them a debt of gratitude for putting their grief to work for the benefit of other families and other children.
My Lords, I thank the Minister for setting out the amendment and all noble Lords who spoke. I am sure the Minister will be pleased to hear that we support his Amendment 236 and his Amendment 237, to which the noble Baroness, Lady Kidron, has added her name.
Amendment 236 is a technical amendment. It seeks the straightforward deletion of words from a clause, accounting for the fact that investigations by a coroner, or procurator fiscal in Scotland, must start upon them being notified of the death of a child. The words
“or are due to conduct an investigation”
are indeed superfluous.
We also support Amendment 237. The deletion of this part of the clause would bring into effect a material change. It would empower Ofcom to issue a notice to an internet service provider to retain information in all cases of a child’s death, not just cases of suspected suicide. Sadly, as many of us have discovered in the course of our work on this Bill, there is an increasing number of ways in which communication online can be directly or indirectly linked to a child’s death. These include areas of material that is appropriate for adults only; the inability to filter harmful information, which may adversely affect mental health and decision-making; and, of course, the deliberate targeting of children by adults and, in some cases, by other children.
There are adults who use the internet with the intention of doing harm to children through coercion, grooming or abuse. What initially starts online can lead to contact in person. Often, this will lead to a criminal investigation, but, even if it does not, the changes proposed by this amendment could help prevent additional tragic deaths of children, not just those caused by suspected child suicides. If the investigating authorities have access to online communications that may have been a contributing factor in a child’s death, additional areas of concern can be identified by organisations and individuals with responsibility for children’s welfare and action taken to save many other young lives.
Before I sit down, I want to take this opportunity to say a big thank you to the noble Baroness, Lady Kidron, the noble Lord, Lord Kennedy, and all those who have campaigned on this issue relentlessly and brought it to our attention.
Let me begin by reiterating my thanks to the noble Baroness, Peers, families and coroners for their help in developing these measures. My momentary pleasure in being supported on these amendments is, of course, tempered by the desperate sadness of the situations that they are designed to address.
I acknowledge the powerful advocacy that has taken place on this issue. I am glad that we have been able to address the concerns with the amendment to the Online Safety Act, which takes a zero-tolerance approach to protecting children by making sure that the buck stops with social media platforms for the content they host. I sincerely hope that this demonstrates our commitment to ensuring that coroners can fully access the online data needed to provide answers for grieving families.
On the point raised by the noble Baroness, Lady Kidron, guidance from the Chief Coroner is likely to be necessary to ensure both that this provision works effectively and that coroners feel supported in their decisions on whether to trigger the data preservation process. Decisions on how and when to issue guidance are a matter for the Chief Coroner, of course, but we understand that he is very likely to issue guidance to coroners on this matter. His office is working with my department and Ofcom to ensure that our processes are aligned. The Government will also work with the regulators and interested parties to see whether any guidance is required to support parents in understanding the data preservation process. Needless to say, I would be more than happy to arrange a meeting with the noble Baroness to discuss the development of the guidance; other Members may wish to join that as well.
Once again, I thank noble Lords for their support on this matter.
My Lords, I now turn to the national underground asset register, which I will refer to as NUAR. It is a new digital map of buried pipes and cables that is revolutionising the way that we install, maintain, operate and repair our buried infrastructure. The provisions contained in the Bill will ensure workers have complete and up-to-date access to the data that they need, when they need it, through the new register. NUAR is estimated to deliver more than £400 million per year of economic growth through increased efficiency, reduced accidental damage and fewer disruptions for citizens and businesses. I am therefore introducing several government amendments, which are minor in nature and aim to improve the clarity of the Bill. I hope that the Committee will be content if I address these together.
Amendment 244 clarifies responsibilities in relation to the licensing of NUAR data. As NUAR includes data from across public and private sector organisations, it involves both Crown and third-party intellectual property rights, including database rights. This amendment clarifies that the role of the Keeper of the National Archives in determining the licence terms for Crown IP remains unchanged. This will require the Secretary of State to work through the National Archives to determine licence terms for Crown data, as was always intended. Amendments 243 and 245 are consequential to this change.
Similarly, Amendment 241 moves the provision relating to the first initial upload of data to the register under new Part 3A to make the Bill clearer, with Amendments 248 and 249 consequential to this change.
Amendment 242 is a minor and technical amendment that clarifies that regulations made under new Section 106B(1) can be made “for or in connection with”—rather than solely “in connection with”—the making of information kept in NUAR available, with or without a licence.
Amendment 247 is another minor and technical amendment to ensure that consistent language is used throughout Schedule 13 and so further improve the clarity of these provisions. These amendments provide clarity to the Bill; they do not change the underlying policy.
Although Amendment 298 is not solely focused on NUAR, this might perhaps be a convenient point for me to briefly explain it to your Lordships. Amendment 298 makes a minor and technical amendment to Clause 154, the clause which sets out the extent of the Bill. Subsection (4) of that clause currently provides that an amendment, repeal or revocation made by the Bill
“has the same extent as the enactment amended, repealed or revoked”.
Subsection (4) also makes clear that this approach is subject to subsection (3), which provides for certain provisions to extend only to England and Wales and Northern Ireland. Upon further reviewing the Bill, we have identified that subsection (4) should, of course, also be subject to subsection (2), which provides for certain provisions to extend only to England and Wales. Amendment 298 therefore makes provision to ensure that the various subsections of Clause 154 operate effectively together as a coherent package.
I now turn to a series of amendments raised by the noble Lord, Lord Clement-Jones. Amendments 240A and 240B relate to new Section 106A, which places a duty on the Secretary of State to keep a register of information relating to apparatus in streets in England and Wales. Section 106A allows for the Secretary of State to make regulations that establish the form and manner in which the register is kept. The Bill as currently drafted provides for these regulations to be subject to the negative procedure. Amendment 240A calls for this to be changed to the affirmative procedure, while Amendment 240B would require the publication of draft regulations, a call for evidence and the subsequent laying before Parliament of a statement by the Secretary of State before such regulations can be made.
I start by thanking the noble Lords, Lord Clement-Jones and Lord Bassam, for their respective replies. As I have said, the Geospatial Commission has been engaging extensively with stakeholders, including the security services, on NUAR since 2018. This has included a call for evidence, a pilot project, a public consultation, focus groups, various workshops and other interactions. All major gas and water companies have signed up, as well as several large telecoms firms.
While the Minister is speaking, maybe the Box could tell him whether the figure of only 33% of asset owners having signed up is correct? Both I and the noble Lord, Lord Bassam, mentioned that; it would be very useful to know.
It did complete a pilot phase this year. As it operationalises, more and more will sign up. I do not know the actual number that have signed up today, but I will find out.
NUAR does not duplicate existing commercial services. It is a standardised, interactive digital map of buried infrastructure, which no existing service is able to provide. It will significantly enhance data sharing and access efficiency. Current services—
I am concerned. We get the principle behind NUAR, but is there an interface between NUAR and this other service—which, on the face of it, looks quite extensive—currently in place? Is there a dialogue between the two? That seems to be quite important, given that there is some doubt over NUAR’s current scope.
I am not sure that there is doubt over the current scope of NUAR; it is meant to address all buried infrastructure in the United Kingdom. LSBUD does make extensive representations, as indeed it has to parliamentarians of both Houses, and has spoken several times to the Geospatial Commission. I am very happy to commit to continuing to do so.
My Lords, the noble Lord, Lord Bassam, is absolutely right to be asking that question. We can go only on the briefs we get. Unlike the noble Lord, Lord Bassam, I have not been underground very recently, but we do rely on the briefings we get. LSBUD is described as a
“sustainably-funded UK success story”—
okay, give or take a bit of puff—that
“responds to most requests in 5 minutes or less”.
It has
“150+ asset-owners covering nearly 2 million km and 98% of high-risk assets—like gas, electric, and fuel pipelines”.
That sounds as though we are in the same kind of territory. How can the Minister just baldly state that NUAR is entirely different? Can he perhaps give us a paragraph on how they differ? I do not think that “completely different” can possibly characterise this relationship.
As I understand it, LSBUD services are provided as a PDF, on request. The service is not interactive; it is not vector-based graphics presented on a map, so it cannot be interrogated in the same way. Furthermore, as I understand it—and I am happy to be corrected if I am misstating—LSBUD has a great many private sector asset owners, but no public sector data is provided. All of it is provided on a much more manual basis. The two services simply do not brook comparison. I would be delighted to speak to LSBUD.
My Lords, we are beginning to tease out something quite useful here. Basically, NUAR will be pretty much an automatic service, because it will be available online, I assume, which has implications for data protection, for who owns the copyright and so on. I am sure there are all kinds of issues there. It is the way the service is delivered, and then you have the public sector, which has not taken part in LSBUD. Are those the two key distinctions?
Indeed, there are two key distinctions. One is the way that the information is provided online, in a live format, and the other is the quantity and nature of the data that is provided, which will eventually be all relevant data in the United Kingdom under NUAR, versus those who choose to sign up on LSBUD and equivalent services. I am very happy to write on the various figures. Maybe it would help if I were to arrange a demonstration of the technology. Would that be useful? I will do that.
Unlike the noble Lord, Lord Bassam, I do not have that background in seeing what happens with the excavators, but I would very much welcome that. The Minister again is really making the case for greater co-operation. The public sector has access to the public sector information, and LSBUD has access to a lot of private sector information. Does that not speak to co-operation between the two systems? We seem to have warring camps, where the Government are determined to prove that they are forging ahead with their new service and are trampling on quite a lot of rights, interests and concerns in doing so—by the sound of it. The Minister looks rather sceptical.
I am not sure whose rights are being trampled on by having a shared database of these things. However, I will arrange a demonstration, and I confidently state that nobody who sees that demonstration will have any cynicism any more about the quality of the service provided.
All I can say is that, in that case, the Minister has been worked on extremely well.
In addition to the situation that the noble Lord, Lord Bassam, described, I was braced for a really horrible situation, because these things very often lead to danger and death, and there is a very serious safety argument to providing this information reliably and rapidly, as NUAR will.
My Lords, it took them half a day to discover where the hole had gone and what the damage was. The water flooded several main roads and there were traffic delays and the rest. So these things are very serious. I was trying to make a serious point while being slightly frivolous about it.
No, indeed, it is a deeply serious point. I do not know the number off the top of my head but there are a number of deaths every year as a result of these things.
As I was saying, a thorough impact assessment was undertaken for the NUAR measures, which received a green rating from the Regulatory Policy Committee. Impacts on organisations that help facilitate the exchange of data related to assets in the street were included in the modelling. Although NUAR could impact existing utility—
I cannot resist drawing the Minister’s attention to the story in today’s Financial Times, which reports that two major water companies do not know where their sewers are. So I think the impact is going to be a little bit greater than he is saying.
I saw that story. Obviously, regardless of how they report the data, if they do not know, they do not know. But my thought was that, if there are maps available for everything that is known, that tends to encourage people who do not know to take better control of the assets that they manage.
A discovery project is under way to potentially allow these organisations—these alternative providers—to access NUAR data; LSBUD has been referenced, among others. It attended the last three workshops we conducted on this, which I hope could enable it to adapt its services and business models to mitigate any negative impacts. Such opportunities will be taken forward in future years should they be technically feasible, of value, in the public interest and in light of the views of stakeholders, including asset owners.
A national underground asset register depends on bringing data together from asset owners on to a single standardised database. This will allow data to be shared more efficiently than was possible before. Asset owners have existing processes that have been developed to allow them to manage risks associated with excavations. These processes will be developed in compliance with existing guidance in the form of HSG47. To achieve this, those working on NUAR are already working closely with relevant stakeholders as part of a dedicated adoption group. This will allow for a safe and planned rollout of NUAR to those who will benefit from it.
Before the Minister’s peroration, I just want to check something. He talked about the discovery project and contact with the industry; by that, I assume he was talking about asset owners as part of the project. What contact is proposed with the existing company, LinesearchbeforeUdig, and some of its major supporters? Can the Government assure us that they will have greater contact or try to align? Can they give greater assurance than they have been able to give today? Clearly, there is suspicion here of the Government’s intentions and how things will work out. If we are to achieve this safety agenda—I absolutely support it; it is the fundamental issue here—more work needs to be done in building bridges, to use another construction metaphor.
As I said, the Geospatial Commission has met LSBUD many times, and I would be happy to meet it in order to help it adapt its business model for the NUAR future; as I said, it has attended the last three discovery workshops on access to this data.
I close by thanking noble Lords for their contributions. I hope they look forward to the demonstration.
My Lords, I support this probing amendment, Amendment 251. I thank all noble Lords who have spoken. From this side of the Committee, I say how grateful we are to the noble Lord, Lord Arbuthnot, for all that he has done and continues to do in his campaign to find justice for those sub-postmasters who have been wronged by the system.
This amendment seeks to reinstate the substantive provisions of Section 69 of PACE, the Police and Criminal Evidence Act 1984, revoking this dangerous assumption. I would like to imagine that legislators in 1984 were perhaps alert to the warning in George Orwell’s novel Nineteen Eighty-Four, written some 40 years earlier, about relying on an apparently infallible but ultimately corruptible technological system to define the truth. The Horizon scandal is, of course, the most glaring example of the dangers of assuming that computers are always right. Sadly, as hundreds of sub-postmasters have known for years, and as the wider public have more recently become aware, computer systems can be horribly inaccurate.
However, the Horizon system is very primitive compared to some of the programs which now process billions of pieces of our sensitive data every day. The AI revolution, which has already begun, will exponentially accelerate the risk of compounded errors being multiplied. To take just one example, some noble Lords may be aware of the concept of AI hallucinations. This is a term used to describe when computer models make inaccurate predictions based on seeing incorrect patterns in data, which may be caused by incomplete, biased or simply poor-quality inputs. In an earlier debate, the noble Viscount, Lord Younger of Leckie, said that account information notices will be decided. How will these decisions be made? Will they be made by individual human beings or by some AI-configured algorithms? Can the Minister share with us how such decisions will be taken?
Humans can look at clouds in the sky or outlines on the hillside and see patterns that look like faces, animals or symbols, but ultimately we know that we are looking at water vapour or rock formations. Computer systems do not necessarily have this innate common sense—this reality check. Increasingly, we will depend on computer systems talking to each other without any human intervention. This will deliver some great efficiencies, but it could lead to greater injustices on a scale which would terrify even the most dystopian science fiction writers. The noble Baroness, Lady Kidron, has already shared with us some of the cases where a computer has made errors and people have been wronged.
Amendment 251 would reintroduce the opportunity for some healthy human scepticism by enabling the investigation of whether there are reasonable grounds for questioning information in documents produced by a computer. The digital world of 2024 depends far more on computers than did the world of 1984, whether in actual legislation or in Orwellian fiction. Amendment 251 enables ordinary people to question whether our modern “Big Brother” artificial intelligence is telling the truth when he or it is watching us. I look forward to the Minister’s responses to all the various questions and on the current assumption in law that information provided by the computer is always accurate.
My Lords, I recognise the feeling of the Committee on this issue and, frankly, I recognise the feeling of the whole country with respect to Horizon. I thank all those who have spoken for a really enlightening debate. I thank the noble Baroness, Lady Kidron, for tabling the amendment and my noble friend Lord Arbuthnot for speaking to it and—if I may depart from the script—his heroic behaviour with respect to the sub-postmasters.
There can be no doubt that hundreds of innocent sub-postmasters and sub-postmistresses have suffered an intolerable miscarriage of justice at the hands of the Post Office. I hope noble Lords will indulge me if I speak very briefly on that. On 13 March, the Government introduced the Post Office (Horizon System) Offences Bill into Parliament, which is due to go before a Committee of the whole House in the House of Commons on 29 April. The Bill will quash relevant convictions of individuals who worked, including on a voluntary basis, in Post Office branches and who have suffered as a result of the Post Office Horizon IT scandal. It will quash, on a blanket basis, convictions for various theft, fraud and related offences during the period of the Horizon scandal in England, Wales and Northern Ireland. This is to be followed by swift financial redress delivered by the Department for Business and Trade.
On the amendment laid by the noble Baroness, Lady Kidron—I thank her and the noble Lords who have supported it—I fully understand the intent behind this amendment, which aims to address issues with computer evidence such as those arising from the Post Office cases. The common law presumption, as has been said, is that the computer which has produced evidence in a case was operating effectively at the material time unless there is evidence to the contrary, in which case the party relying on the computer evidence will need to satisfy the court that the evidence is reliable and therefore admissible.
This amendment would require a party relying on computer evidence to provide proof up front that the computer was operating effectively at the time and that there is no evidence of improper use. I and my fellow Ministers, including those at the MoJ, understand the intent behind this amendment, and we are considering very carefully the issues raised by the Post Office cases in relation to computer evidence, including these wider concerns. So I would welcome the opportunity for further meetings with the noble Baroness, alongside MoJ colleagues. I was pleased to hear that she had met with my right honourable friend the Lord Chancellor on this matter.
We are considering, for example, the way reliability of evidence from the Horizon system was presented, how failures of investigation and disclosure prevented that evidence from being effectively challenged, and the lack of corroborating evidence in many cases. These issues need to be considered carefully, with the full facts in front of us. Sir Wyn Williams is examining in detail the failings that led to the Post Office scandal. These issues are not straightforward. The prosecution of those cases relied on assertions that the Horizon system was accurate and reliable, which the Post Office knew to be wrong. This was supported by expert evidence, which it knew to be misleading. The issue was that the Post Office chose to withhold the fact that the computer evidence itself was wrong.
This amendment would also have a significant impact on the criminal justice system. Almost all criminal cases rely on computer evidence to some extent, so any change to the burden of proof would or could impede the work of the Crown Prosecution Service and other prosecutors.
Although I am not able to accept this amendment for these reasons, I share the desire to find an appropriate way forward along with my colleagues at the Ministry of Justice, who will bear the brunt of this work, as the noble Lord, Lord Clement-Jones, alluded to. I look forward to meeting the noble Baroness to discuss this ahead of Report. Meanwhile, I hope she will withdraw her amendment.
Can the Minister pass on the following suggestion? Paul Marshall, who has been mentioned by all of us, is absolutely au fait with the exact procedure. He has experience of how it has worked in practice, and he has made some constructive suggestions. If there is not a full return to Section 69, there could be other, more nuanced, ways of doing this, meeting the Minister’s objections. But can I suggest that the MoJ has contact with him and discusses what the best way forward would be? He has been writing about this for some years now, and it would be extremely useful, if the MoJ has not already engaged with him, to do so.
It may have already done so, but I will certainly pass that on.
I thank everyone who spoke and the Minister for the offer of a meeting alongside his colleagues from the MoJ. I believe he will have a very busy diary between Committee and Report, based on the number of meetings we have agreed to.
However, I want to be very clear here. We have all recognised that the story of the Post Office sub-postmasters makes this issue clear, but it is not about the sub-postmasters. I commend the Government for what they are doing. We await the inquiry with urgent interest, and I am sure I speak for everyone in wishing the sub-postmasters a fair settlement—that is not in question. What is in question is the fact that we do not have unlimited Lord Arbuthnots to be heroic about all the other things that are about to happen. I took it seriously when he said not one moment longer: it could be tomorrow.
My Lords, I am pleased that we were able to sign this amendment. Once again, the noble Baroness, Lady Kidron, has demonstrated her acute ability to dissect and to make a brilliant argument about why an amendment is so important.
As the noble Lord, Lord Clement-Jones, and others have said previously, what is the point of this Bill? Passing this amendment and putting these new offences on the statute book would give the Bill the purpose and clout that it has so far lacked. As the noble Baroness, Lady Kidron, has made clear, although it is currently an offence to possess or distribute child sex abuse material, it is not an offence to create these images artificially using AI techniques. So, quite innocent images of a child—or even an adult—can be manipulated to create child sex abuse imagery, pornography and degrading or violent scenarios. As the noble Baroness pointed out, this could be your child or a neighbour’s child being depicted for sexual gratification by the increasingly sophisticated AI creators of these digital models or files.
Yesterday’s report from the Internet Watch Foundation said that a manual found on the dark web encourages the use of “nudifying” tools to remove clothes from child images, which can then be used to blackmail children into sending more graphic content. The IWF reports that the scale of this abuse is increasing year on year, with 275,000 web pages containing child sex abuse being found last year; I suspect that this is the tip of the iceberg as much of this activity is occurring on the dark web, which is very difficult to track. The noble Baroness, Lady Kidron, made a powerful point: there is a danger that access to such materials will also encourage offenders who then want to participate in real-world child sex abuse, so the scale of the horror could be multiplied. There are many reasons why these trends are shocking and abhorrent. It seems that, as ever, the offenders are one step ahead of the legislation needed for police enforcers to close down this trade.
As the noble Baroness, Lady Kidron, made clear, this amendment is “laser focused” on criminalising those who are developing and using AI to create these images. I am pleased to say that Labour is already working on a ban on creating so-called nudification tools. The prevalence of deepfakes and child abuse on the internet is increasing the public’s fear of the overall safety of AI, so we need to win their trust back if we are to harness the undoubted benefits that it can deliver to our public services and economy. Tackling this area is one step towards that.
Action to regulate AI by requiring transparency and safety reports from all those at the forefront of AI development should be a key part of that strategy, but we have a particular task to do here. In the meantime, this amendment is an opportunity for the Government to take a lead on these very specific proposals to help clean up the web and rid us of these vile crimes. I hope the Minister can confirm that this amendment, or a government amendment along the same lines, will be included in the Bill. I look forward to his response.
I thank the noble Baroness, Lady Kidron, for tabling Amendment 291, which would create several new criminal offences relating to the use of AI to collect, collate and distribute child abuse images or to possess such images after they have been created. Nobody can dispute the intention behind this amendment.
We recognise the importance of this area. We will continue to assess whether and what new offences are needed to further bolster the legislation relating to child sexual abuse and AI, as part of our wider ongoing review of how our laws need to adapt to AI risks and opportunities. We need to get the answers to these complex questions right, and we need to ensure that we are equipping law enforcement with the capabilities and the powers needed to combat child sexual abuse. Perhaps, when I meet the noble Baroness, Lady Kidron, on the previous group, we can also discuss this important matter.
However, for now, I reassure noble Lords that any child sex abuse material, whether AI generated or not, is already illegal in the UK, as has been said. The criminal law is comprehensive with regard to the production and distribution of this material. For example, it is already an offence to produce, store or share any material that contains or depicts child sexual abuse, regardless of whether the material depicts a real child or not. This prohibition includes AI-generated child sexual abuse material and other pseudo imagery that may have been AI or computer generated.
We are committed to bringing to justice offenders who deliberately misuse AI to generate child sexual abuse material. We demonstrated this as part of the road to the AI Safety Summit, where we secured agreement from NGO, industry and international partners to take action to tackle AI-enabled child sexual abuse. The strongest protections in the Online Safety Act are for children, and all companies in scope of the legislation will need to tackle child sexual abuse material as a priority. Applications that use artificial intelligence will not be exempt and must incorporate robust guard-rails and safety measures to ensure that AI models and technology cannot be manipulated for child sexual abuse purposes.
Furthermore, I reassure noble Lords that the offence of taking, making, distributing and possessing with a view to distribution any indecent photograph or pseudophotograph of a child under the age of 18 carries a maximum sentence of 10 years’ imprisonment. Possession alone of indecent photographs or pseudophotographs of children can carry a maximum sentence of up to five years’ imprisonment.
However, I am not able to accept the amendment, as the current drafting would capture legitimate AI models that have been deliberately misused by offenders without the knowledge or intent of their creators to produce child sexual abuse material. It would also inadvertently criminalise individual users who possess perfectly legal digital files with no criminal intent, due to the fact that they could, when combined, enable the creation of child sexual abuse material.
I therefore ask the noble Baroness to withdraw the amendment, while recognising the strength of feeling and the strong arguments made on this issue and reiterating my offer to meet with her to discuss this ahead of Report.
I do not know how to express in parliamentary terms the depth of my disappointment, so I will leave that. Whoever helped the noble Viscount draft his response should be ashamed. We do not have a comprehensive system and the police do not have the capability; they came to me after months of trying to get the Home Office to act, so that is an untruth: the police do not have the capability.
I remind the noble Viscount that in previous debates his response on the bigger picture of AI has been to wait and see, but this is a here and now problem. As the noble Baroness, Lady Jones, set out, this would give purpose and reason—and here it is in front of us; we can act.
I thank the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Jones of Whitchurch, for tabling the amendments in this important group. I very much share the concerns about all the uses of deepfake images that are highlighted by these amendments. I will speak more briefly than I otherwise would with a view to trying to—
My Lords, I would be very happy to get a letter from the Minister.
I would be happy to write one. I will go for the abbreviated version of my speech.
I turn first to the part of the amendment that would seek to criminalise the creation, alteration or otherwise generation of deepfake images depicting a person engaged in an intimate act. The Government recognise that there is significant public concern about the simple creation of sexually explicit deepfake images, and this is why they have announced their intention to table an amendment to the Criminal Justice Bill, currently in the other place, to criminalise the creation of purported sexual images of adults without consent.
The noble Lord’s Amendment 294 would create an offence explicitly targeting the creation or alteration of deepfake content when a person knows or suspects that the deepfake will be or is likely to be used to commit fraud. It is already an offence under Section 7 of the Fraud Act 2006 to generate software or deepfakes known to be designed for or intended to be used in the commission of fraud, and the Online Safety Act lists fraud as a priority offence and as a relevant offence for the duties on major services to remove paid-for fraudulent advertising.
Amendment 295 in the name of the noble Baroness, Lady Jones of Whitchurch, seeks to create an offence of creating or sharing political deepfakes. The Government recognise the threats to democracy that harmful actors pose. At the same time, the UK also wants to ensure that we safeguard the ability for robust debate and protect freedom of expression. It is crucial that we get that balance right.
Let me first reassure noble Lords that the UK already has criminal offences that protect our democratic processes, such as the National Security Act 2023 and the false communications offence introduced in the Online Safety Act 2023. It is also already an election offence to make false statements of fact about the personal character or conduct of a candidate or about the withdrawal of a candidate before or during an election. These offences have appropriate tests to ensure that we protect the integrity of democratic processes while also ensuring that we do not impede the ability for robust political debate.
I assure noble Lords that we continue to work across government to ensure that we are ready to respond to the risks to democracy from deepfakes. The Defending Democracy Taskforce, which seeks to protect the democratic integrity of the UK, is engaging across government and with Parliament, the UK’s intelligence community, the devolved Administrations, local authorities and others on the full range of threats facing our democratic institutions. We also continue to meet regularly with social media companies to ensure that they continue to take action to protect users from election interference.
Turning to Amendments 295A to 295F, I thank the noble Lord, Lord Clement-Jones, for them. Taken together, they would in effect establish a new regulatory regime in relation to the creation and dissemination of deepfakes. The Government recognise the concerns raised around harmful deepfakes and have already taken action against illegal content online. We absolutely recognise the intention behind these amendments but they pose significant risks, including to freedom of expression; I will write to noble Lords about those in order to make my arguments in more detail.
For the reasons I have set out, I am not able to accept these amendments. I hope that the noble Lord will therefore withdraw his amendment.
My Lords, I thank the Minister for that rather breathless response and his consideration. I look forward to his letter. We have arguments about regulation in the AI field; this is, if you like, a subset of that—but a rather important subset. My underlying theme is “must try harder”. I thank the noble Lord, Lord Leong, for his support and pay tribute to Control AI, which is vigorously campaigning on this subject in terms of the supply chain for the creation of these deepfakes.
Pending the Minister’s letter, which I look forward to, I beg leave to withdraw my amendment.
The Committee will be relieved to know that I will be brief. I do not have much to say because, in general terms, this seems an eminently sensible amendment.
We should congratulate the noble Lord, Lord Clement-Jones, on his drafting ingenuity. He has managed to compose an amendment that brings together the need for scrutiny of emerging national security and data privacy risks relating to advanced technology, aims to inform regulatory developments and guidance that might be required to mitigate risks, and would protect the privacy of people’s genomics data. It also picks up along the way the issue of the security services scrutinising malign entities and guiding researchers, businesses, consumers and public bodies. Bringing all those things together at the end of a long and rather messy Bill is quite a feat—congratulations to the noble Lord.
I am rather hoping that the Minister will tell the Committee either that the Government will accept this wisely crafted amendment or that everything it contains is already covered. If the latter is the case, can he point noble Lords to where those things are covered in the Bill? Can he also reassure the Committee that the safety and security issues raised by the noble Lord, Lord Clement-Jones, are covered? Having said all that, we support the general direction of travel that the amendment takes.
I would be extremely happy for the Minister to write.
Nothing makes me happier than the noble Lord’s happiness. I thank him for his amendment and the noble Lord, Lord Bassam, for his points; I will write to them on those, given the Committee’s desire for brevity and the desire to complete this stage tonight.
I wish to say some final words overall. I sincerely thank the Committee for its vigorous—I think that is the right word—scrutiny of this Bill. We have not necessarily agreed on a great deal, but I am in awe of the level of scrutiny and the commitment to making the Bill as good as possible. Let us be absolutely honest—this is not the most entertaining subject, but it is something that we all take extremely seriously and I pay tribute to the Committee for its work. I also extend sincere thanks to the clerks and our Hansard colleagues for agreeing to stay a little later than agreed, although that may not even be necessary. I very much look forward to engaging with noble Lords again before and during Report.
My Lords, I thank the Minister, the noble Baroness, Lady Jones, and all the team. I also thank the noble Lord, Lord Harlech, whose first name we now know; these things are always useful to know. This has been quite a marathon. I hope that we will have many conversations between now and Report. I also hope that Report is not too early as there is a lot to sort out. The noble Baroness, Lady Jones, and I will be putting together our priority list imminently but, in the meantime, I beg leave to withdraw my amendment.
(8 months ago)
Grand Committee
My Lords, I listened carefully to the explanation given by the noble Lord, Lord Clement-Jones, for his stand part notice on Clause 44. I will have to read Hansard, as I may have missed something, but I am not sure I am convinced by his arguments against Clause 44 standing part. He described his stand part notice as “innocuous”, but I am concerned that if the clause were removed it would have a slightly wider implication than that.
We feel that there are some advantages to how Clause 44 is currently worded. As it stands, it simply makes it clear that data subjects have to use the internal processes to make complaints to controllers first, and then the controller has the obligation to respond without undue delay. Although this could place an extra burden on businesses to manage and reply to complaints in a timely manner, I would have thought that this was a positive step to be welcomed. It would require controllers to have clear processes in place for handling complaints; I hope that that in itself would be an incentive against their conducting the kind of unlawful processing that prompts complaints in the first place. This seems the best practice, which would apply anyway in most organisations and complaint and arbitration systems, including, perhaps, ombudsmen, which I know the noble Lord knows more about than I do these days. There should be a requirement to use the internal processes first.
The clause makes it clear that the data subject has a right to complain directly to the controller and it makes clear that the controller has an obligation to respond. Clause 45 then goes on to make a different point, which is that the commissioner has a right to refuse to act on certain complaints. We touched on this in an earlier debate. Clearly, to be in line with Clause 44, the controller would have to have finished handling the case within the allotted time. We agree with that process. However, an alternative reason for the commissioner to refuse is when the complaint is “vexatious or excessive”. We have rehearsed our arguments about the interpretation of those words in previous debates on the application of subject access requests. I do not intend to repeat them here, but our concern about that wording rightly remains. What is important here is that the ICO should not be able to reject complaints simply because the complainant is distressed or angry. It is helpful that the clause states that in these circumstances,
“the Commissioner must inform the complainant”
of the reasons it is considered vexatious or excessive. It is also helpful that the clause states that this
“does not prevent the complainant from making it a complaint again”,
presumably in a way more compliant with the rules. Unlike the noble Lord, Lord Clement-Jones—as I said, I will look at what he said in more detail—on balance, we are content with the wording as it stands.
On a slightly different tack, we have added our name to Amendment 154, in the name of the noble Lord, Lord Clement-Jones, and we support Amendment 287 on a similar subject. This touches on a similar principle to our previous debate on the right of data communities to raise data-breach complaints on behalf of individuals. In these amendments, we are proposing that there should be a collective right for organisations to raise data-breach complaints for individuals or groups of individuals who do not necessarily feel sufficiently empowered or confident to raise the complaints on their own behalf. There are many reasons why this reticence might occur, not least that the individuals may feel that making a complaint would put their employment on the line or that they would suffer discrimination at work in the future. We therefore believe that these amendments are important to widen people’s access to work with others to raise these complaints.
Since these amendments were tabled, we have received the letter from the Minister that addresses our earlier debate on data communities. I am pleased to see the general support for data intermediaries that he set out in his letter. We argue that a data community is a separate distinct collective body, which is different from the wider concept of data intermediaries. This seems to be an area in which the ICO could take a lead in clarifying rights and set standards. Our Amendment 154 would therefore set a deadline for the ICO to do that work and for those rights to be enacted.
The noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, made a good case for broadening these rights in the Bill and, on that basis, I hope the Minister will agree to follow this up, and follow up his letter so that we can make further progress on this issue.
The noble Lord, Lord Clement-Jones, has tabled a number of amendments that modify the courts and tribunals functions. I was hoping that when I stood here and listened to him, I would understand a bit more about the issues. I hope he will forgive me for not responding in detail to these arguments. I do not feel that I know enough about the legal background to the concerns, but he seems to have made a clear case for clarifying whether the courts or tribunals should have jurisdiction in data protection issues.
On that basis, I hope that the Minister will also provide some clarification on these issues and I look forward to his response.
My Lords, I thank the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Jones, for tabling these amendments to Clauses 44 and 45, which would reform the framework for data protection complaints to the Information Commissioner.
The noble Lord, Lord Clement-Jones, has given notice of his intention to oppose Clause 44 standing part of the Bill. That would remove new provisions from the Bill that have been carefully designed to provide a more direct route to resolution for data subjects’ complaints. I should stress that these measures do not limit rights for data subjects to bring complaints forward, but instead provide a more direct route to resolution with the relevant data controller. The measures formalise current best practice, requiring the complainant to approach the relevant data controller, where appropriate, to attempt to resolve the issue prior to regulatory involvement.
The Bill creates a requirement for data controllers to facilitate the making of complaints and look into what may have gone wrong. This should, in most cases, result in a much quicker resolution of data protection-related complaints. The provisions will also have the impact of enabling the Information Commissioner to redeploy resources away from handling premature complaints where such complaints may be dealt with more effectively, in the first instance, by controllers and towards value-added regulatory activity, supporting businesses to use data lawfully and in innovative ways.
The noble Lord’s Amendment 153 seeks, in effect, to expand the scope of the Information Commissioner’s duty to investigate complaints under Section 165 of the Data Protection Act. However, that Section of the Act already provides robust redress routes, requiring the commissioner to take appropriate steps to respond to complaints and offer an outcome or conclude an investigation within a specified period.
The noble Lord raised the enforcement of the UK’s data protection framework. I can provide more context on the ICO’s approach, although noble Lords will be aware that it is enforced independently of government by the ICO; it would of course be inappropriate for me to comment on how the ICO exercises its enforcement powers. The ICO aims to be fair, proportionate and effective, focusing on areas with the highest risk and most harm, but this does not mean that it will enforce every case that crosses its books.
The Government have introduced a new requirement on the ICO—Clause 43—to publish an annual report on how it has exercised its enforcement powers, the number and nature of investigations, the enforcement powers used, how long investigations took and the outcome of the investigations that ended in that period. This will provide greater transparency and accountability in the ICO’s exercise of its enforcement powers. For these reasons, I am not able to accept these amendments.
I also thank the noble Baroness and the noble Lord for their Amendments 154 and 287 concerning Section 190 of the Data Protection Act. These amendments would require the Secretary of State to legislate to give effect to Article 80(2) of the UK GDPR to enable relevant non-profit organisations to make claims against data controllers for alleged data breaches on behalf of data subjects, without those data subjects having requested or agreeing to the claim being brought. Currently, such non-profit organisations can already pursue such actions on behalf of individuals who have granted them specific authorisation, as outlined in Article 80(1).
In 2021, following consultation, the Government concluded that there was insufficient evidence to justify implementing Article 80(2) to allow non-profit organisations to bring data protection claims without the authorisation of the people affected. The Government’s response to the consultation noted that the regulator can and does investigate complaints raised by civil society groups, even when they are not made on behalf of named individuals. The ICO’s investigations into the use of live facial recognition technology at King’s Cross station and in some supermarkets in southern England are examples of this.
I also thank the noble Baroness, Lady Kidron, for raising her concerns about the protection of children throughout the debate—indeed, throughout all the days in Committee. The existing regime already allows civil society groups to make complaints to the ICO about data-processing activities that affect children and vulnerable people. The ICO has a range of powers to investigate systemic data breaches under the current framework and is already capable of forcing data controllers to take decisive action to address non-compliance. We are strengthening its powers in this Bill. I note that only a few member states of the EU have allowed non-governmental organisations to launch actions without a mandate, in line with the possibility provided by the GDPR.
I turn now to Amendments 154A, 154B—
Before the noble Lord gets there and we move too far from Amendment 154, where does the Government’s thinking leave us regarding a group of class actions? Trade unions take up causes on behalf of their membership at large. I guess, in the issue of the Post Office and Mr Bates, not every sub-postmaster or sub-postmistress would have signed up to that class action, even though they may have ended up being beneficiaries of its effects. So where does it leave people with regard to data protection and the way that the data protection scheme operates where there might be a class action?
If the action is raised on behalf of named individuals, those named individuals have to have given consent for that. If the action is for a general class of people, those people would not have to give their explicit consent, because they are not named in the action. Article 80(2) of the GDPR said that going that further step was optional for all member states. I do not know which member states have taken it up, but a great many have not, just because of the complexities to which it gives rise.
My Lords, just so that the Minister might get a little note, I will ask a question. He has explained what is possible—what can be done—but not why the Government still resist putting Article 80(2) into effect. What is the reason for not adopting that article?
The reason was that an extensive consultation was undertaken in 2021 by the Government, and the Government concluded at that time that there was insufficient evidence to take what would necessarily be a complex step. That was largely on the grounds that class actions of this type can go forward either with the consent of any named individuals in the class action or on behalf of a group of individuals who are unnamed and not specifically identified by name within the investigation itself.
Perhaps the Minister could in due course say what evidence would help to persuade the Government to adopt the article.
I want to help the Minister. Perhaps he could give us some more detail on the nature of that consultation and the number of responses and what people said in it. It strikes me as rather important.
Fair enough. Maybe for the time being, it will satisfy the Committee if I share a copy of that consultation and what evidence was considered, if that would work.
I will turn now to Amendments 154A to 155 and Amendment 175, which propose sweeping modifications to the jurisdiction of the court and tribunal for proceedings under the Data Protection Act 2018. These amendments would have the effect of making the First-tier Tribunal and Upper Tribunal responsible for all data protection cases, transferring both ongoing and future cases out of the court system and to the relevant tribunals.
The Government of course want to ensure that proceedings for enforcement of data protection rules, including redress routes available to data subjects, are appropriate for the nature of the complaint. As the Committee will be well aware, at present there is a mixture of jurisdiction for tribunals and courts under data protection legislation, depending on the precise nature of the proceedings in question. Tribunals are indeed the appropriate venue for some data protection proceedings, and the legislation already recognises that—for example, for applications by data subjects for an order requiring the ICO to progress their complaint. However, courts are generally the more appropriate venue for cases involving claims for compensation, and successful parties can usually recover their costs. Courts also apply stricter rules of procedure and evidence than tribunals. That is why some cases are appropriate to fall under the jurisdiction of the tribunal, while others are more appropriate for court jurisdiction. For example, claims by individuals against organisations for breaches of legal requirements can result in awards of compensatory damages for the individuals and financial and reputational damage for the organisations. It is appropriate that such cases are handled by a court in accordance with its strict procedural and evidential rules, where the data subject may recover their costs if successful.
As such, the Government are confident that the current system is balanced and proportionate and provides clear and effective administrative and judicial redress routes for data subjects seeking to exercise their rights.
My Lords, is the Minister saying that there is absolutely no confusion between the jurisdiction of the tribunals and the courts? That is, no court has come to a different conclusion about jurisdiction—for example, as to whether procedural matters are for tribunals and merits are for courts or vice versa. Is he saying that everything is hunky-dory and clear and that we do not need to concern ourselves with this crossover of jurisdiction?
No, as I was about to say, we need to take these issues seriously. The noble Lord raised a number of specific cases. I was unfamiliar with them at the start of the debate—
I will go away and look at those; I look forward to learning more about them. There are obvious implications in what the noble Lord said as to the most effective ways of distributing cases between courts and other channels.
For these reasons, I hope that the noble Lord will withdraw his amendment.
I am intrigued by the balance between what goes to a tribunal and what goes to the courts. I took the spirit behind the stand-part notice in the name of the noble Lord, Lord Clement-Jones, as being about finding the right place for the right case and ensuring that the wheels of justice are much more accessible. I am not entirely persuaded by what the Minister has said. It would probably help the Committee if we had a better understanding of where the cases go, how they are distributed and on what basis.
I thank the noble Lord; that is an important point. The question is: how does the Sorting Hat operate to distribute cases between the various tribunals and the court system? We believe that the courts have an important role to play in this but it is about how, in the early stages of a complaint, the case is allocated to a tribunal or a court. I can see that more detail is needed there; I would be happy to write to noble Lords.
Before we come to the end of this debate, I just want to raise something. I am grateful to the Minister for offering to bring forward the 2021 consultation on Article 80(2)—that will be interesting—but I wonder whether, as we look at the consultation and seek to understand the objections, the Government would be willing to listen to our experiences over the past two or three years. I know I said this on our previous day in Committee but there is, I hope, some point in ironing out some of the problems of the data regime that we are experiencing in action. I could bring forward a number of colleagues on that issue and on why it is a blind spot for both the ICO and the specialist organisations that are trying to bring systemic issues to its attention. It is very resource-heavy. I want a bit of goose and gander here: if we are trying to sort out some of the resourcing and administrative nightmares in dealing with the data regime, from a user perspective, perhaps a bit of kindness could be shown to that problem as well as to the problem of business.
I would be very happy to participate in that discussion, absolutely.
My Lords, I thank the Minister for his response. I have surprised myself: I have taken something positive away from the Bill.
The noble Baroness, Lady Jones, was quite right to be more positive about Clause 44 than I was. The Minister unpacked its relationship with Clause 45 well and satisfactorily. Obviously, we will read Hansard before we jump to too positive a conclusion.
On Article 80(2), I am grateful to the Minister for agreeing both to go back to the consultation and to look at the kinds of evidence that were brought forward, because this is a really important aspect for many civil society organisations. He underestimates the difficulties faced when bringing complaints of this nature. I would very much like this conversation to go forward because this issue has been quite a bone of contention; the noble Baroness, Lady Kidron, remembers that only too well. We may even have had ping-pong on the matter back in 2017. There is an appetite to keep on the case so, the more we can discuss this matter—between Committee and Report in particular—the better, because there is quite a head of steam behind it.
As far as the jurisdiction point is concerned, I think this may be the first time I have heard a Minister talk about the Sorting Hat. I was impressed: I have often compared this place to Hogwarts but the concept of using the Sorting Hat to decide whether a case goes to a tribunal or a court is a wonderful one. You would probably need artificial intelligence to do that kind of thing nowadays; that in itself is a bit of an issue because, after all, these may be elaborate amendments but, as the noble Lord, Lord Bassam, said, the case being made here is about the possibility of there being confusion and things not being clear in terms of where jurisdiction lies. It is really important that we determine whether the courts and tribunals themselves understand this and, perhaps more appropriately, whether they have differing views about it.
We need to get to grips with this; the more the Minister can dig into it, and into Delo, Killock and so on, the better. We are all in the foothills here but I am certainly not going to try to unpack those two judgments and the differences between Mrs Justice Farbey and Mr Justice Mostyn, which are well beyond my competency. I thank the Minister.
My Lords, the UK has rightly moved away from the EU concept of supremacy, under which retained EU law would always take precedence over domestic law when they were in conflict. That is clearly unacceptable now that we have left the EU. However, we understand that the effective functioning of our data protection legislation is of critical importance and it is appropriate for us to specify the appropriate relationship between UK and EU-derived pieces of legislation following implementation of the Retained EU Law (Revocation and Reform) Act, or REUL. That is why I am introducing a number of specific government amendments to ensure that the hierarchy of legislation works in the data protection context. These are Amendments 156 to 164 and 297.
Noble Lords may be aware that Clause 49 originally sought to clarify the relationship between the UK’s data protection legislation, specifically the UK GDPR and EU-derived aspects of the Data Protection Act 2018, and future data processing provisions in other legislation, such as powers to share or duties to disclose personal data, as a result of some legal uncertainty created by the European Union (Withdrawal) Act 2018. To resolve this uncertainty, Clause 49 makes it clear that all new data processing provisions in legislation should be read consistently with the key requirements of the UK data protection legislation unless it is expressly indicated otherwise. Since its introduction, the interpretation of pre-EU exit legislation has been altered and there is a risk that this would produce the wrong effect in respect of the interpretation of existing data processing provisions that are silent about their relationship with the data protection legislation.
Amendment 159 will make it clear that the full removal of the principle of EU law supremacy and the creation of a reverse hierarchy in relation to assimilated direct legislation, as provided for in the REUL Act, do not change the relationship between the UK data protection legislation and existing legislation that is in force prior to commencement of Clause 49(2). Amendment 163 makes a technical amendment to the EU withdrawal Act, as amended, to support this amendment.
Amendment 162 is similar to the previous amendment but it concerns the relationship between provisions relating to certain obligations and rights under data protection legislation and on restrictions and prohibitions on the disclosure of information under other existing legislation. Existing Section 186 of the Data Protection Act 2018 governs this relationship. Amendment 162 makes it clear that the relationship between these two types of provision is not affected by the changes to the interpretation of legislation that I have already referred to made by the REUL Act. Additionally, it clarifies that, in relation to pre-commencement legislation, Section 186(1) may be disapplied expressly or impliedly.
Amendment 164 relates to the changes brought about by the REUL Act and sets out that the provisions detailed in earlier Amendments 159, 162 and 163 are to be treated as having come into force on 1 January 2024—in other words, at the same time as commencement of the relevant provisions of the REUL Act.
Amendment 297 provides a limited power to remove provisions that achieve the same effect as new Section 183A from legislation made or passed after this Bill receives Royal Assent, as their presence could cause confusion.
Finally, Amendments 156 and 157 are consequential. Amendments 158, 160 and 161 are minor drafting changes made for consistency, updating and consequential purposes.
Turning to the amendments introduced by the noble Lord, Lord Clement-Jones, I hope that he can see from the government amendments to Clause 49 that we have given a good deal of thought to the impact of the REUL Act 2023 on the UK’s data protection framework and have been prepared to take action on this where necessary. We have also considered whether some of the changes made by the REUL Act could cause confusion about how the UK GDPR and the Data Protection Act 2018 interrelate. Following careful analysis, we have concluded that they would largely continue to be read alongside each other in the intended way, with the rules of the REUL Act unlikely to interfere with this. Any new general rule such as that suggested by the noble Lord could create confusion and uncertainty.
Amendments 168 to 170, 174, 174A and 174B seek to reverse changes introduced by the REUL Act at the end of 2023, specifically the removal of EU general principles from the statute book. EU general principles and certain EU-derived rights had originally been retained by the European Union (Withdrawal) Act to ensure legal continuity at the end of the transition period, but this was constitutionally novel and inappropriate for the long term.
The Government’s position is that EU law concepts should not be used to interpret domestic legislation in perpetuity. The REUL Act provided a solution to this by repealing EU general principles from UK law and clarifying the approach to be taken domestically. The amendments tabled by the noble Lord, Lord Clement-Jones, would undo this important work by reintroducing to the statute book references to rights and principles which have not been clearly defined and are inappropriate now that we have left the EU.
The protection of personal data already forms part of the protection offered by the European Convention on Human Rights, under the Article 8 right to respect for private and family life, and is further protected by our data protection legislation. The UK GDPR and the Data Protection Act 2018 provide a comprehensive set of rules for organisations to follow and rights for people in relation to the use of their data. Seeking to apply an additional EU right to data protection in UK law would not significantly affect the way the data protection framework functions or enhance the protections it affords to individuals. Indeed, doing so may well add unnecessary uncertainty and complexity.
Amendments 171 to 173 pertain to exemptions to specified data subject rights and obligations on data controllers set out in Schedules 2 to 4 to the DPA 2018. The 36 exemptions apply only in specified circumstances and are subject to various safeguards. Before addressing the amendments the noble Lord has tabled, it is perhaps helpful to set out how these exemptions are used. Personal data must be processed according to the requirements set out in the UK GDPR and the DPA 2018. This includes the key principles of lawfulness, fairness and transparency, data minimisation and purpose limitation, among others. The decision to restrict data subjects’ rights, such as the right to be notified that their personal data is being processed, or limit obligations on the data controller, comes into effect only if and when the decision to apply an exemption is taken. In all cases, the use of the exemption must be both necessary and proportionate.
One of these exemptions, the immigration exemption, was recently amended in line with a court ruling that found it was incompatible with the requirements set out in Article 23. This exemption is used by the Home Office. The purpose of Amendments 171 to 173 is to extend the protections applied to the immigration exemption across the other exemptions subject to Article 23, apart from in Schedule 4, where the requirement to consider whether its application prejudices the relevant purposes is not considered relevant.
The other exemptions are each used in very different circumstances, by different data controllers—from government departments to SMEs—and work by applying different tests that function in a wholly different manner from the immigration exemption. This is important to bear in mind when considering these broad-brush amendments. A one-size-fits-all approach would not work across the exemption regime.
It is the Government’s position that any changes to these important exemptions should be made only after due consideration of the circumstances of that particular exemption. In many cases, these amendments seek to make changes that run counter to how the exemption functions. Making changes across the exemptions via this Bill, as the noble Lord’s amendments propose, has the potential to have significant negative impacts on the functioning of the exemptions regime. Any potential amendments to the other exemptions would require careful consideration. The Government note that there is a power to make changes to the exemptions in the DPA 2018, if deemed necessary.
For the reasons I have given, I look forward to hearing more from the noble Lord on his amendments, but I hope that he will not press them. I beg to move.
My Lords, I thank the Minister for that very careful exposition. I feel that we are heavily into wet towel, if not painkiller, territory here, because this is a tricky area. As the Minister might imagine, I will not respond to his exposition in detail, at this point; I need to run away and get some external advice on the impact of what he said. He is really suggesting that the Government prefer a pick ‘n’ mix approach to what he regards as a one size fits all. I can boil it down to that. He is saying that you cannot just apply the rules, in the sense that we are trying to reverse some of the impacts of the previous legislation. I will set out my stall; no doubt the Minister and I, the Box and others, will read Hansard and draw our own conclusions at the end, because this is a complicated area.
Until the end of 2023, the Data Protection Act 2018 had to be read compatibly with the UK GDPR. In a conflict between the two instruments, the provisions of the UK GDPR would prevail. The reversing of the relationship between the 2018 Act and the UK GDPR, through the operation of the Retained EU Law (Revocation and Reform) Act—REUL, as the Minister described it—has had the effect of lowering data protection rights in the UK. The case of the Open Rights Group and the3million v the Secretary of State for the Home Office and the Secretary of State for Digital, Culture, Media and Sport was decided after the UK had left the EU, but before the end of 2023. The Court of Appeal held that exemptions from data subject rights in an immigration context, as set out in the Data Protection Act, were overly broad, contained insufficient safeguards and were incompatible with the UK GDPR. The court disapplied the exemptions and ordered the Home Office to redraft them to include the required safeguards. We debated the regulations the other day, and many noble Lords welcomed them on the basis that they had been revised for the second time.
This sort of challenge is now not possible, because the relationship between the DPA and the UK GDPR has been turned on its head. If the case were brought now, the overly broad exemptions in the DPA would take precedence over the requirement for safeguards set out in the UK GDPR. These points were raised by me in the debate of 12 December, when the Data Protection (Fundamental Rights and Freedoms) (Amendment) Regulations 2023 were under consideration. In that debate, the noble Baroness, Lady Swinburne, stated that
“we acknowledge the importance of making sure that data processing provisions in wider legislation continue to be read consistently with the data protection principles in the UK GDPR … Replication of the effect of UK GDPR supremacy is a significant decision, and we consider that the use of primary legislation is the more appropriate way to achieve these effects, such as under Clause 49 where the Government consider it appropriate”.—[Official Report, 12/12/23; col. GC 203.]
This debate on Clause 49 therefore offers an opportunity to reinstate the previous relationship between the UK GDPR and the Data Protection Act. The amendment restores the hierarchy, so that it guarantees the same rights to individuals as existed before the end of 2023, and avoids unforeseen consequences by resetting the relationship between the UK GDPR and the DPA 2018 to what the parliamentary draftsmen intended when the Act was written. The provisions in Clause 49, as currently drafted, address the relationship between domestic law and data protection legislation as a whole, but the relationship between the UK GDPR and the DPA is left in its “reversed” state. This is confirmed in the Explanatory Notes to the Bill at paragraph 503.
The purpose of these amendments is to restore data protection rights in the UK to what they were before the end of 2023, prior to the coming into force of REUL. The amendments would restore the fundamental right to the protection of personal data in UK law; ensure that the UK GDPR and the DPA continue to be interpreted in accordance with the fundamental right to the protection of personal data; ensure that there is certainty that assimilated case law that references the fundamental right to the protection of personal data still applies; and apply the protections required in Article 23 of the UK GDPR to all the relevant exemptions in Schedule 2 to the Data Protection Act. This is crucial in avoiding diminishing trust in our data protection frameworks. If people do not trust that their data is protected, they will refuse to share it. Without this data, new technologies cannot be developed, because these technologies rely on personal data. By creating uncertainty and diminishing standards, the Government are undermining the very growth in new technologies that they want.
My Lords, I have looked at the government amendments in this group and have listened very carefully to what the Minister has said—that it is largely about interpretation. There are no amendments that I wish to comment on, save to say that they seem to be about consistency of language and about bringing, in part, EU positions into UK law. They seem also to be about consistency of meaning, and for the most part the intention seems to be to ensure that nothing in EU retained law undoes the pre-existing legal framework.
However, I would appreciate the Minister giving us a bit more detail on the operation of Amendment 164. Amendment 297 seems to deal with a duplication issue, so perhaps he can confirm for the Committee that this is the case. We have had swathes of government amendments of a minor and technical nature, largely about chasing out gremlins from the drafting process. Can he confirm that this is the case and assure the Committee that we will not be left with any nasty surprises in the drafting that need correction at a later date?
The amendments tabled in the name of the noble Lord, Lord Clement-Jones, are of course of a different order altogether. The first two—Amendments 165 and 166—would restore the relationship between the UK GDPR and the 2018 Act and the relevant provisions of the Retained EU Law (Revocation and Reform) Act 2023. Amendment 168 would ensure that assimilated case law referring to the European Charter of Fundamental Rights would still be relevant in interpreting the UK GDPR. It would give greater certainty in how the UK’s data protection framework is interpreted. Amendment 169 would ensure that the interpretation is carried over from the UK GDPR and 2018 legislation in accordance with the general principle of the protection of personal data.
The noble Lord’s Amendments 170 to 174B would bring back into law protections that existed previously when UK law was more closely aligned with EU law and regulation. There is also an extension of the EU right to the protection of personal data to the assimilated standard that existed by virtue of Section 4 of the European Union (Withdrawal) Act 2018. I can well understand the noble Lord’s desire to take the UK back to a position where we are broadly in the same place in terms of protections as our former EU partners. First, having—broadly speaking—protections that are common across multiple jurisdictions makes it easier and simpler for companies operating in those markets. Secondly, from the perspective of data subjects, it is much easier to comprehend common standards of data protection and to seek redress when required. The Government, for their part, will no doubt argue that there is some sort of big Brexit benefit in this, although I think that advisers and experts are divided on the degree of that benefit, and indeed who benefits.
Later, we will get to discuss data adequacy standards. Concern exists in some quarters as to whether we have this right and what this legislative opportunity might be missing to ensure that the UK meets those international standards that the EU requires. That is a debate for later, but we are broadly sympathetic to the desire of the noble Lord, Lord Clement-Jones, to find the highest level of protection for UK citizens. That is the primary motivation for many of the amendments and debates that we have had today. We do not want to weaken what were previously carefully crafted and aligned protections. I do not entirely buy the argument that the Minister made earlier about this group of amendments causing legal uncertainty. I believe it is the reverse of that: the noble Lord, Lord Clement-Jones, is trying to provide greater certainty and a degree of jurisdictional uniformity.
I hope that I have understood what the noble Lord is trying to achieve here. For those reasons, we will listen to the Minister’s concluding comments—and read Hansard—very carefully.
I thank the noble Lords, Lord Clement-Jones and Lord Bassam, for their comments. As the noble Lord, Lord Clement-Jones, points out, it is a pretty complex and demanding area, but that in no way diminishes the importance of getting it right. I hope that in my remarks I can continue that work, but of course I am happy to discuss this: it is a very technical area and, as all speakers have pointed out, it is crucial for our purposes that it be executed correctly.
While the UK remains committed to strong protections for personal data through the UK GDPR and Data Protection Act, it is important that it is able to diverge from the EU legislation where this is appropriate for the UK. We have carefully assessed the effects of EU withdrawal legislation and the REUL Act and are making adjustments to ensure that the right effect is achieved. The government amendments are designed to ensure legal certainty and protect the coherence of the data protection framework following commencement of the REUL Act—for example, by maintaining the pre-REUL Act relationship in certain ways between key elements of the UK data protection legislation and other existing legislation.
The purpose of the REUL Act is to ensure that the UK has control over its laws. Resurrecting the principle of EU law supremacy in its entirety or continuing to apply case law principles is not consistent with the UK’s departure from the EU and taking back control over our own laws. These amendments make it clear that changes made to the application of the principle of EU law supremacy and new rules relating to the interpretation of direct assimilated legislation under the REUL Act do not have any impact on existing provisions that involve the processing of personal data.
The noble Lord, Lord Bassam, asked for more detail about Amendment 164. It relates to changes brought about by the REUL Act and sets out that the provisions detailed in Amendments 159, 162 and 163 are to be treated as having come into force on 1 January 2024—in other words, at the same time as commencement of the relevant provisions of the REUL Act. The retrospective effect of this provision addresses the gap between the commencement of the REUL Act 2023 and the Data Protection and Digital Information Bill.
On the immigration exemption case, I note that it was confined to the immigration exemption and did not rule on the other exemptions. The Government will continue to keep the exemptions under review and, should it be required, the Government have the power to amend the other exemptions using an existing power in the DPA 2018. Before doing so, of course the Government would want to ensure that due consideration is given to how the particular exemptions are used. Meanwhile, I thank noble Lords for what has been a fascinating, if demanding, debate.
I thank the noble Lord, Lord Clement-Jones, the noble Baroness, Lady Jones, and my noble friend Lord Kamall for their amendments. To address the elephant in the room first, I can reassure noble Lords that the use of digital identity will not be mandatory, and privacy will remain one of the guiding principles of the Government’s approach to digital identity. There are no plans to introduce a centralised, compulsory digital ID system for public services, and the Government’s position on physical ID cards remains unchanged. The Government are committed to realising the benefits of digital identity technologies without creating ID cards.
I shall speak now to Amendment 177, which would require the rules of the DVS trust framework to be set out in regulations subject to the affirmative resolution procedure. I recognise that this amendment, and others in this group, reflect recommendations from the DPRRC. Obviously, we take that committee very seriously, and we will respond to that report in due course, but ahead of Report.
Part 2 of the Bill will underpin the DVS trust framework, a document of auditable rules, which include technical standards. The trust framework refers to data protection legislation and ICO guidance. It has undergone four years of development, consultation and testing within the digital identity market. Organisations can choose to have their services certified against the trust framework to prove that they provide secure and trustworthy digital verification services. Certification is provided by independent conformity assessment bodies that have been accredited by the UK Accreditation Service. Annual reviews of the trust framework are subject to consultation with the ICO and other appropriate persons.
Requiring the trust framework to be set out in regulations would make it hard to introduce reactive changes. For example, if a new cybersecurity threat emerged which required the rapid deployment of a fix across the industry, the trust framework would need to be updated very quickly. Developments in this fast-growing industry require an agile approach to standards and rule-making. We cannot risk the document becoming outdated and losing credibility with industry. For these reasons, the Government feel that it is more appropriate for the Secretary of State to have the power to set the rules of the trust framework with appropriate consultation, rather than for the power to be exercised by regulations.
I turn to Amendments 178 to 195, which would require the fees that may be charged under this part of the Bill to be set out in regulations subject to the negative resolution procedure. The Government have committed to growing a market of secure and inclusive digital identities as an alternative to physical proofs of identity, for those that choose to use them. Fees will be introduced only once we are confident that doing so will not restrict the growth of this market, but the fee structure, when introduced, is likely to be complex and will need to flex to support growth in an evolving market.
There are built-in safeguards to this fee-charging power. First, there is a strong incentive for the Secretary of State to set fees that are competitive, fair and reasonable, because failing to do so would prevent the Government realising their commitment to grow this market. Secondly, these fee-raising powers have a well-defined purpose and limited scope. Thirdly, the Secretary of State will explain in advance what fees she intends to charge and when she intends to charge them, which will ensure the appropriate level of transparency.
The noble Baroness, Lady Jones, asked about the arrangements for the office for digital identities and attributes. It will not initially be independent, as it will be located within the Department for Science, Innovation and Technology. As we announced in the government response to our 2021 consultation, we intend for this to be an interim arrangement until a suitable long-term home for the governing body can be identified. Delegating the role of Ofdia—as I suppose we will call it—to a third party in the future is subject to parliamentary scrutiny, as provided for by the clauses in the Bill. Initially placing Ofdia inside government will ensure that its oversight role can mature in the most effective way and that it supports the digital identity market in meeting the needs of individual users, relying parties and industry.
Digital verification services are independently certified against the trust framework rules by conformity assessment bodies. Conformity assessment bodies are themselves independently accredited by the UK Accreditation Service to ensure that they have the competence and impartiality to perform certification. The trust framework certification scheme will be accredited by the UK Accreditation Service to give confidence that the scheme can be efficiently and competently used to certify products, processes and services. All schemes will need to meet internationally agreed standards set out by the UK Accreditation Service. Ofdia, as the owner of the main code, will work with UKAS to ensure that schemes are robust, capable of certification and operated in line with the trust framework.
Amendment 184A proposes to exclude certified public bodies from registering to provide digital verification services. The term “public bodies” could include a wide range of public sector entities, including institutions such as universities, that receive any public funding. The Government take the view that this exclusion would be unnecessarily restrictive in the UK’s nascent digital identity market.
Amendment 195ZA seeks to mandate organisations to implement a non-digital form of verification in every instance where a digital method is required. The Bill enables the use of secure and inclusive digital identities across the economy. It does not force businesses or individuals to use them, nor does it insist that businesses which currently accept non-digital methods of verification must transition to digital methods. As Clause 52 makes clear, digital verification services are services that are provided at the request of the individual. The purpose of the Bill is to ensure that, when people want to use a digital verification service, they know which of the available products and services they can trust.
Some organisations operate only in the digital sphere, such as online-only banks and energy companies. To oblige such organisations to offer manual document checking would place obligations on them that would go beyond the Government’s commitment to do only what is necessary to enable the digital identity market to grow. In so far as this amendment would apply to public authorities, the Equality Act requires those organisations to consider how their services will affect people with protected characteristics, including those who, for various reasons, might not be able or might choose not to use a digital identity product.
Is the Minister saying that, as a result of the Equality Act, there is an absolute right to that analogue—if you like—form of identification if, for instance, someone does not have access to digital services?
On this point, the argument that the Government are making is that, where consumers want to use a digital verification service, all the Bill does is to provide a mechanism for those DVSs to be certified and assured to be safe. It does not seek to require anything beyond that, other than creating a list of safe DVSs.
The Equality Act applies to the public sector space, where it needs to be followed to ensure that there is an absolute right to inclusive access to digital technologies.
My Lords, in essence, the Minister is admitting that there is a gap when somebody who does not have access to digital services needs an identity to deal with the private sector. Is that right?
In the example I gave, I was not willing to use a digital system to provide a guarantee for my son’s accommodation in the private sector. I understand that that would not be protected and that, therefore, someone might not be able to rent a flat, for example, because they cannot provide physical ID.
The Bill does not change the requirements in this sense. If any organisation chooses to provide its services on a digital basis only, that is up to that organisation, and it is up to consumers whether they choose to use it. It makes no changes to the requirements in that space.
I will now speak to the amendment that seeks to remove Clause 80. Clause 80 enables the Secretary of State to ask accredited conformity assessment bodies and registered DVS providers to provide information which is reasonably required to carry out her functions under Part 2 of the Bill. The Bill sets out a clear process that the Secretary of State must follow when requesting this information, as well as explicit safeguards for her use of the power. These safeguards will ensure that DVS providers and conformity assessment bodies have to provide only information necessary for the functioning of this part of the Bill.
My Lords, the clause stand part amendment was clearly probing. Does the Minister have anything to say about the relationship with OneLogin? Is he saying that it is only information about systems, not individuals, which does not feed into the OneLogin identity system that the Government are setting up?
It is very important that the OneLogin system is entirely separate and not considered a DVS. We considered whether it should be, but the view was that that comes close to mandating a digital identity system, which we absolutely want to avoid. Hence the two are treated entirely differently.
That is a good reassurance, but if the Minister wants to unpack that further by correspondence, I would be very happy to have that.
I am very happy to do so.
I turn finally to Amendments 289 and 300, which aim to introduce a criminal offence of digital identity theft. The Government are committed to tackling fraud and are confident that criminal offences already exist to cover the behaviour targeted by these amendments. Under the Fraud Act 2006, it is a criminal offence to make a gain from the use of another person’s identity or to cause or risk a loss by such use. Where accounts or databases are hacked into, the Computer Misuse Act 1990 criminalises the unauthorised access to a computer program or data held on a computer.
Furthermore, the trust framework contains rules, standards and good practice requirements for fraud monitoring and responding to fraud. These rules will further defend systems and reduce opportunities for digital identity theft.
My Lords, I am sorry, but this is a broad-ranging set of amendments, so I need to intervene on this one as well. When the Minister writes his promised letter in response to today’s proceedings, could he tell us what guidance there is to the police on this? Because when the individual, Mr Arron, approached the police, they said, “Oh, sorry, there’s nothing we can do; identity theft is not a criminal offence”. The Minister seems to be saying, “No, it is fine; it is all encompassed within these provisions”. While he may be saying that, and I am sure he will be shouting it from the rooftops in the future, the question is whether the police have guidance; does the College of Policing have guidance and does the Home Office have guidance? The ordinary individual needs to know that it is exactly as the Minister says, and identity theft is covered by these other criminal offences. There is no point in having those offences if nobody knows about them.
That is absolutely fair enough: I will of course write. Sadly, we are not joined today by ministerial colleagues from the Home Office, who have some other Bill going on.
I have no doubt that its contribution to the letter will be equally enjoyable. However, for all the reasons I set out above, I am not able to accept these amendments and respectfully encourage the noble Baroness and noble Lords not to press them.
My Lords, I suppose I am meant to say that I thank the Minister for his response, but I cannot say that it was particularly optimistic or satisfying. On my amendments, the Minister said he would be responding to the DPRRC in due course, and obviously I am interested to see that response, but as the noble Lord, Lord Clement-Jones, said, the committee could not have been clearer and I thought made a very compelling case for why there should be some parliamentary oversight of this main code and, indeed, the fees arrangements.
I understand that it is a fast-moving sector, but the sort of thing that the Delegated Powers Committee was talking about was that the main code should have some fundamental principles, some user rights and so on. We are not trying to spell out every sort of service that is going to be provided—as the Minister said, it is a fast-moving sector—but people need to have some trust in it and they need to know what this verification service is going to be about. Just saying that there is going to be a code, on such an important area, and that the Secretary of State will write it, is simply not acceptable in terms of basic parliamentary democracy. If it cannot be done through an affirmative procedure, the Government need to come up with another way to make sure that there is appropriate parliamentary input into what is being proposed here.
On the subject of the fees, the Delegated Powers Committee and our amendment were saying only that there should be a negative SI. I thought that was perfectly reasonable on its part and I am sorry that the Minister is not even prepared to accept that perfectly reasonable suggestion. All in all, I thought that the response on that element was very disappointing.
The response was equally disappointing on the whole issue that the noble Lords, Lord Kamall and Lord Vaux, raised about the right not to have to use the digital verification schemes but to do things on a non-digital basis. The arguments are well made about the numbers of people who are digitally excluded. I was in the debate that the noble Lord referred to, and I cannot remember the statistics now, but something like 17% of the population do not have proper digital access, so we are excluding a large number of people from a whole range of services. It could be applying for jobs, accessing bank accounts or applying to pay the rent for your son’s flat or whatever. We are creating a two-tier system here, for those who are involved and those who are on the margins who cannot use a lot of the services. I would have hoped that the Government would have been much more engaged in trying to find ways through that and providing some guarantees to people.
We know that we are taking a big leap, with so many different services going online. There is a lot of suspicion about how these services are going to work and people do not trust that computers are always as accurate as we would like them to be, so they would like to feel that there is another way of doing it if it all goes wrong. It worries me that the Minister is not able to give that commitment.
I have to say that I am rather concerned by what the Minister said about the private sector—in effect, that it can already have a requirement to have digital only. Surely, in this brave new world we are going towards, we do not want a digital-only service; this goes back to the point about a whole range of people being excluded. What is wrong with saying, even to people who collect people’s bank account details to pay their son’s rent, “There is an alternative way of doing this as well as you providing all the information digitally”? I am very worried about where all this is going, including who will be part of it and who will not. If the noble Lords, Lord Kamall and Lord Vaux, wish to pursue this at a later point, I would be sympathetic to their arguments.
On identity theft, the noble Lord, Lord Clement-Jones, made a compelling case. The briefing that he read out from the Metropolitan Police said that your data is one of your most valuable assets, which is absolutely right. He also rightly made the point that this is linked to organised crime. It does not happen by accident; some major people are farming our details and using them for all sorts of nefarious activities. There is a need to tighten up the regulation and laws on this. The Minister read out where he thinks this is already dealt with under existing legislation but we will all want to scrutinise that and see whether that really is the case. There are lots of examples of where the police have not been able to help people and do not know what their rights are, so we just need to know exactly what advice has been given to the police.
I feel that the Minister could have done more on this whole group to assure us that we are not moving towards a two-tier world. I will withdraw my amendment, obviously, but I have a feeling that we will come back to this issue; it may be something that we can talk to the Minister about before we get to Report.
My Lords, I thank the noble Baronesses, Lady Bennett, Lady Young of Old Scone and Lady Jones, for their proposed amendments on extending the definition of business data in smart data schemes, the disclosure of climate and nature information to improve public service delivery and the publication of an EU adequacy risk assessment.
On Amendment 195A, we consider that information about the carbon and energy intensity of goods, services or digital content already falls within the scope of “business data” as information about goods, services and digital content supplied or provided by a trader. Development of smart data schemes will, where relevant, be informed by—among other things—the Government’s Environmental Principles Policy Statement, under the Environment Act 2021.
With regard to Amendment 218, I thank the noble Baroness, Lady Young of Old Scone, for her sympathies; they are gratefully received. I will do my best in what she correctly pointed out is quite a new area for me. The powers to share information under Part 5 of the Digital Economy Act 2017—the DEA—are supplemented by statutory codes of practice. These require impact assessments to be carried out, particularly for significant changes or proposals that could have wide-ranging effects on various sectors or stakeholders. These impact assessments are crucial for understanding the implications of the Digital Economy Act and ensuring that it achieves its intended objectives, while minimising any negative consequences for individuals, businesses and society as a whole. As these assessments already cover economic, social and environmental impact, significant changes in approach are already likely to be accounted for. This is in addition to the duty placed on Ministers by the Environment Act 2021 to have due regard to the Environmental Principles Policy Statement.
Lastly, turning to Amendment 296, the Government are committed to maintaining their data adequacy decisions from the EU, which we absolutely recognise play a pivotal role in enabling trade and fighting crime. As noble Lords alluded to, we maintain regular engagement with the European Commission on the Bill to ensure that our reforms are understood.
The EU adequacy assessment of the UK is, of course, a unilateral, autonomous process for the EU to undertake. However, we remain confident that our reforms deliver against UK interests and are compatible with maintaining EU adequacy. As the European Commission itself has made clear, a third country—the noble Lord, Lord Clement-Jones, alluded to this point—is not required to have the same rules as the EU to be considered adequate. Indeed, 15 countries have EU adequacy, including Japan, Israel and the Republic of Korea. All these nations pursue independent and, often, more divergent approaches to data protection.
The Government will provide both written and oral evidence to the House of Lords European Affairs Committee inquiry on UK-EU data adequacy and respond to its final report, which is expected to be published in the summer. Many expert witnesses already provided evidence to the committee and have stated that they believe that the Bill is compatible with maintaining adequacy.
As noble Lords have noted, the Government have published a full impact assessment alongside the Bill, which sets out in more detail what both the costs and financial benefits of the Bill would be—including in the unlikely scenario of the EU revoking the UK’s adequacy decision. I also note that UK adequacy is good for the EU too: every EU company, from multinationals to start-ups, with customers, suppliers or operations in the UK relies on EU-UK data transfers. Leading European businesses and organisations have consistently emphasised the importance of maintaining these free flows of data to the UK.
For these reasons, I hope that the noble Baronesses will agree to withdraw or not move these amendments.
The Minister made the point at the end there that it is in the EU’s interest to agree to our data adequacy. That is an important point but is that what the Government are relying on—the fact that it is in the EU’s interest as much as ours to continue to agree to our data adequacy provisions? If so, what the Minister has said does not make me feel more reassured. If the Government are relying on just that, it is not a particularly strong argument.
My Lords, can I point out, on the interests of the EU, that it does not go just one way? There is a question around investment as well. For example, any large bank that is currently running a data-processing facility in this country that covers the whole of Europe may decide, if we lose data adequacy, to move it to Europe. Anyone considering setting up such a thing would probably go for Europe rather than here. There is therefore an investment draw for the EU here.
I do not know what I could possibly have said to create the impression that the Government are flying blind on this matter. We continue to engage extensively with the EU at junior official, senior official and ministerial level in order to ensure that our proposed reforms are fully understood and that there are no surprises. We engage with multiple expert stakeholders from both the EU side and the UK side. Indeed, as I mentioned earlier, a number of experts have submitted evidence to the House’s inquiry on EU-UK data adequacy and have made clear their views that the DPDI reforms set out in this Bill are compatible with EU adequacy. We continue to engage with the EU throughout. I do not want to be glib or blithe about the risks; we recognise the risks but it is vital—
Could we have a list of the people the noble Lord is talking about?
Yes. I would be happy to provide a list of the people we have spoken to about adequacy; it may be a long one. That concludes the remarks I wanted to make, I think.
Perhaps the Minister could just tweak that a bit by listing not just the people who have made positive noises but those who have their doubts.
I thank my noble friend Lord Holmes, the noble Baroness, Lady Jones, and the noble Lord, Lord Clement-Jones, as well as other co-signatories for detailed examination of the Bill through these amendments.
I begin by addressing Amendments 197A, 197B and 197C tabled by my noble friend Lord Holmes, which seek to establish a biometrics office responsible for overseeing biometric data use, and place new obligations on organisations processing such data. The Information Commissioner already has responsibility for monitoring and enforcing the processing of biometric data, and these functions will continue to sit with the new information commission, once established. For example, in March 2023 it investigated the use of live facial recognition in a retail security setting by Facewatch. In February 2024, it took action against Serco Leisure in relation to its use of biometric data to monitor attendance of leisure centre employees.
Schedule 15 to this Bill will also enable the information commission to establish committees of external experts, including specialists from outside the organisation with key skills and expertise in any number of specialist areas, including biometrics, and to require them to provide specialist advice to the commission. Given that the Information Commissioner already has responsibility for monitoring and enforcing the processing of biometric data, the Government are therefore of the firm view that the information commission is best placed to continue to oversee the processing of biometric data.
The processing of biometric data for the purpose of uniquely identifying an individual is also subject to heightened safeguards, and organisations can process such data only if they meet one of the conditions of Article 9 of UK GDPR—for example, where processing is necessary to comply with employment law provisions, or for reasons of substantial public interest. Without a lawful basis and compliance with relevant conditions, such processing of biometric data is prohibited.
Amendments 197B and 197C in the name of my noble friend Lord Holmes would also impose new, prescriptive requirements on organisations processing, or intending to process, biometric data, and would set unlimited fines for non-compliance. We consider that such amendments would have significant unintended consequences. There are many everyday uses of biometric data, such as using your thumbprint to access your phone. If every organisation that launched a new product had to comply with the proposed requirements, it would introduce significant and unnecessary new burdens and would discourage innovation, undermining the aims of this Bill. For these reasons, I respectfully ask my noble friend not to move these amendments.
The Government deem Amendment 238 unnecessary, as using biometric data—
I am sorry, but I am wondering whether the Minister is going to say any more on the amendment in the name of the noble Lord, Lord Holmes. Can I be clear? The Minister said that the ICO is the best place to oversee these issues, but the noble Lord’s amendment recognises that; it just says that there should be a dedicated biometrics unit with specialists, et cetera, underneath it. I am looking towards the noble Lord—yes, he is nodding in agreement. I do not know that the Minister dismissed that idea, but I think that this would be a good compromise in terms of assuaging our concerns on this issue.
I apologise if I have misunderstood. It sounds like it would be a unit within the ICO responsible for that matter. Let me take that away if I have misunderstood—I understood it to be a separate organisation altogether.
The Government deem Amendment 238 unnecessary, as using biometric data to categorise or make inferences about people, whether using algorithms or otherwise, is already subject to the general data protection principles and the high data protection standards of the UK’s data protection framework as personal data. In line with ICO guidance, where the processing of biometric data is intended to make an inference linked to one of the special categories of data—for example, race or ethnic origin—or the biometric data is processed for the intention of treating someone differently on the basis of inferred information linked to one of the special categories of data, organisations should treat this as special category data. These protections ensure that this data, which is not used for identification purposes, is sufficiently protected.
Similarly, Amendment 286 intends to widen the scope of the Forensic Information Databases Service—FINDS—strategy board beyond oversight of biometrics databases for the purpose of identification to include “classification” purposes as well. The FINDS strategy board currently provides oversight of the national DNA database and the national fingerprint database. The Bill puts oversight of the fingerprint database on the same statutory footing as that of the DNA database and provides the flexibility to add oversight of new biometric databases, where appropriate, to provide more consistent oversight in future. The delegated power could be used in the medium term to expand the scope of the board to include a national custody image database, but no decisions have yet been taken. Of course, this will be kept under review, and other biometric databases could be added to the board’s remit in future should these be created and should this be appropriate. For the reasons I have set out, I hope that the noble Baroness, Lady Jones of Whitchurch, will therefore agree not to move Amendments 238 and 286.
Responses to the data reform public consultation in 2021 supported the simplification of the complex oversight framework for police use of biometrics and surveillance cameras. Clauses 147 and 148 of the Bill reflect that by abolishing the Biometrics and Surveillance Camera Commissioner’s roles while transferring the commissioner’s casework functions to the Investigatory Powers Commissioner’s Office.
Noble Lords referred to the CRISP report, which was commissioned by Fraser Sampson—the previous commissioner—and directly contradicts the outcome of the public consultation on data reform in 2021, including on the simplification of the oversight of biometrics and surveillance cameras. The Government took account of all the responses, including from the former commissioner, in developing the policies set out in the DPDI Bill.
There will not be a gap in the oversight of surveillance as it will remain within the statutory regulatory remit of other organisations, such as the Information Commissioner’s Office, the Equality and Human Rights Commission, the Forensic Science Regulator and the Forensic Information Databases Service strategy board.
One of the crucial aspects has been the reporting of the Biometrics and Surveillance Camera Commissioner. Where will there be, and who will produce, a comprehensive report relating to the use of surveillance cameras and the biometric data contained within them? Why have the Government decided that they are going to separate out the oversight of biometrics from, in essence, the surveillance aspects? Are not the two irretrievably brought together by things such as live facial recognition?
Yes. There are indeed a number of different elements of surveillance camera oversight; those are reflected in the range of different bodies doing that work. As to the mechanics of the production of the report, I am afraid that I do not know the answer.
Does the Minister accept that the police are one of the key agencies that will be using surveillance cameras? He now seems to be saying, “No, it’s fine. We don’t have one single oversight body; we had four at the last count”. He probably has more to say on this subject but is that not highly confusing for the police when they have so many different bodies that they need to look at in terms of oversight? Is it any wonder that people think the Bill is watering down the oversight of surveillance camera use?
No. I was saying that there was extensive consultation, including with the police, and that that has resulted in these new arrangements. As to the actual mechanics of the production of an overall report, I am afraid that I do not know but I will find out and advise noble Lords.
His Majesty’s Inspectorate of Constabulary and Fire & Rescue Services also inspects, monitors and reports on the efficiency and effectiveness of the police, including their use of surveillance cameras. All of these bodies have statutory powers to take the necessary action when required. The ICO will continue to regulate all organisations’ use of these technologies, including being able to take action against those not complying with data protection law, and a wide range of other bodies will continue to operate in this space.
On the first point made by the noble Lord, Lord Vaux, where any of the privacy concerns he raises concern information that relates to an identified or identifiable living individual, I can assure him that this information is covered by the UK’s data protection regime. This also includes another issue raised by the noble Lord—where the ANPR captures a number-plate that can be linked to an identifiable living individual—as this would be the processing of personal data and thus governed by the UK’s data protection regime and regulated by the ICO.
For the reasons I have set out, I maintain that these clauses should stand part of the Bill. I therefore hope that the noble Lord, Lord Clement-Jones, will withdraw his stand part notices on Clauses 147 and 148.
Clause 149 does not affect the office of the Biometrics and Surveillance Camera Commissioner, which the noble Lord seeks to maintain through his amendment. The clause’s purpose is to update the name of the national DNA database board and update its scope to include the national fingerprint database within its remit. It will allow the board to produce codes of practice and introduce a new delegated power to add or remove biometric databases from its remit in future via the affirmative procedure. I therefore maintain that this clause should stand part of the Bill and hope that the noble Lord will withdraw his stand part notice.
Clauses 147 and 148 will improve consistency in the guidance and oversight of biometrics and surveillance cameras by simplifying the framework. This follows public consultation, makes the most of the available expertise, improves organisational resilience, and ends confusing and inefficient duplication. The Government feel that a review, as proposed, so quickly after the Bill is enacted is unnecessary. It is for these reasons that I cannot accept Amendment 292 in the name of the noble Lord, Lord Clement-Jones.
I turn now to the amendments tabled by the noble Lord, Lord Clement-Jones, which seek to remove Clauses 130 to 132. These clauses make changes to the Counter-Terrorism Act 2008, which provides the retention regime for biometric data held on national security grounds. The changes have been made only following a formal request from Counter Terrorism Policing to the Home Office. The exploitation of biometric material, including from international partners, is a valuable tool in maintaining the UK’s national security, particularly for ensuring that there is effective tripwire coverage at the UK border. For example, where a foreign national applies for a visa to enter the UK, or enters the UK via a small boat, their biometrics can be checked against Counter Terrorism Policing’s holdings and appropriate action to mitigate risk can be taken, if needed.
My Lords, to go back to some of the surveillance points, one of the issues is the speed at which technology is changing, with artificial intelligence and all the other things we are seeing. One of the roles of the commissioner has been to keep an eye on how technology is changing and to make recommendations as to what we do about the impacts of that. I cannot hear, in anything the noble Viscount is saying, how that role is replicated in what is being proposed. Can he enlighten me?
Yes, indeed. In many ways, this is advantageous. The Information Commissioner obviously has a focus on data privacy, whereas the various other organisations, particularly BSCC, EHRC and the FINDS Board, have subject-specific areas of expertise on which they will be better placed to horizon-scan and identify new emerging risks from technologies most relevant to their area.
Is the noble Viscount saying that splitting it all up into multiple different places is more effective than having a single dedicated office to consider these things? I must say, I find that very hard to understand.
I do not think we are moving from a simple position. We are moving from a very complex position to a less complex position.
Can the Minister reassure the Committee that, under the Government’s proposals, there will be sufficient reporting to Parliament, every year, from all the various bodies to which he has already referred, so that Parliament can have ample opportunity to review the operation of this legislation as the Bill stands at the moment?
Yes, indeed. The Information Commission will be accountable to Parliament. It is required to produce transparency and other reports annually. For the other groups, I am afraid that many of them are quite new to me, as this is normally a Home Office area, but I will establish what their accountability is specifically to Parliament, for the BSCC and the—
Will the Minister write to the Committee, having taken advice from his Home Office colleagues?
My Lords, I thank all noble Lords who participated in the excellent debate on this set of amendments. I also thank my noble friend the Minister for part of his response; he furiously agreed with at least a substantial part of my amendments, even though he may not have appreciated it at the time. I look forward to some fruitful and positive discussions on some of those elements between Committee and Report.
When a Bill passes into statute, a Minister and the Government may wish for a number of things in terms of how it is seen and described. One thing that I do not imagine is on the list is for it to be said that this statute generates significant gaps—those words were put perfectly by the noble Viscount, Lord Stansgate. That it generates significant gaps is certainly the current position. I hope that we have conversations between Committee and Report to address at least some of those gaps and restate some of the positions that exist, before the Bill passes. That would be positive for individuals, citizens and the whole of the country. For the moment, I beg leave to withdraw my amendment and look forward to those subsequent conversations.
(8 months, 1 week ago)
Grand Committee
I welcome the Committee back after what I hope was a good Easter break for everybody. I thank all those noble Lords who, as ever, have spoken so powerfully in this debate.
I turn to Amendments 111 to 116 and 130. I thank noble Lords for their proposed amendments relating both to Schedule 5, which reforms the UK’s general processing regime for transferring personal data internationally and consolidates the relevant provisions in Chapter 5 of the UK GDPR, and to Schedule 7, which introduces consequential and transitional provisions associated with the reforms.
Amendment 111 seeks to revert to the current list of factors under the UK GDPR that the Secretary of State must consider when making data bridges. With respect, this more detailed list is not necessary as the Secretary of State must be satisfied that the standard of protection in the other country, viewed as a whole, is not materially lower than the standard of protection in the UK. Our new list of key factors is non-exhaustive. The UK courts will continue to be entitled to have regard to CJEU judgments if they choose to do so; ultimately, it will be for them to decide how much regard to have to any CJEU judgment on a similar matter.
I completely understand the strength of noble Lords’ concerns about ensuring that our EU adequacy decisions are maintained. This is also a priority for the UK Government, as I and my fellow Ministers have repeatedly made clear in public and on the Floor of the House. The UK is firmly committed to maintaining high data protection standards, now and in future. Protecting the privacy of individuals will continue to be a national priority. We will continue to operate a high-quality regime that promotes growth and innovation and underpins the trustworthy use of data.
Our reforms are underpinned by this commitment. We believe they are compatible with maintaining our data adequacy decisions from the EU. We have maintained a positive, ongoing dialogue with the EU to make sure that our reforms are understood. We will continue to engage with the European Commission at official and ministerial levels with a view to ensuring that our respective arrangements for the free flow of personal data can remain in place, which is in the best interests of both the UK and the EU.
We understand that Amendments 112 to 114 relate to representations made by the National AIDS Trust concerning the level of protection for special category data such as health data. We agree that the protection of people’s HIV status is vital. It is right that this is subject to extra protection, as is the case for all health data and special category data. As I have said to this Committee previously, we have met the National AIDS Trust to discuss the best solutions to the problems it has raised. As such, I hope that the noble Lord, Lord Clement-Jones, will agree not to press these amendments.
Can the Minister just recap? He said that he met the trust then swiftly moved on without saying what solution he is proposing. Would he like to repeat that, or at least lift the veil slightly?
The point I was making was only that we have met with it and will continue to do so in order to identify the best possible way to keep that critical data safe.
The Minister is not suggesting a solution at the moment. Is it in the “too difficult” box?
I doubt that it will be too difficult, but identifying and implementing the correct solution is the goal that we are pursuing, alongside our colleagues at the National AIDS Trust.
I am sorry to keep interrogating the Minister, but that is quite an admission. The Minister says that there is a real problem, which is under discussion with the National AIDS Trust. At the moment the Government are proposing a significant amendment to both the GDPR and the DPA, and in this Committee they are not able to say that they have any kind of solution to the problem that has been identified. That is quite something.
I am not sure I accept that it is “quite something”, in the noble Lord’s words. As and when the appropriate solution emerges, we will bring it forward—no doubt between Committee and Report.
On Amendment 115, we share the noble Lords’ feelings on the importance of redress for data subjects. That is why the Secretary of State must already consider the arrangements for redress for data subjects when making a data bridge. There is already an obligation for the Secretary of State to consult the ICO on these regulations. Similarly, when considering whether the data protection test is met before making a transfer subject to appropriate safeguards using Article 46, the Government expect that data exporters will also give consideration to relevant enforceable data subject rights and effective legal remedies for data subjects.
Our rules mean that companies that transfer UK personal data must uphold the high data protection standards we expect in this country. Otherwise, they face action from the ICO, which has powers to conduct investigations, issue fines and compel companies to take corrective action if they fail to comply. We will continue to monitor and mitigate a wide range of data security risks, regardless of provenance. If there is evidence of threats to our data, we will not hesitate to take the necessary action to protect our national security.
My Lords, we heard from the two noble Lords some concrete examples of where those data breaches are already occurring, and it does not appear to me that appropriate action has been taken. There seems to be a mismatch between what the Minister is saying about the processes and the day-to-day reality of what is happening now. That is our concern, and it is not clear how the Government are going to address it.
The Minister mentioned prosecutions and legal redress in the UK from international data transfer breaches. Can he share some examples of that, maybe by letter? I am not aware of that being something with a long precedent.
A number of important points were raised there. Yes, of course I will share—
I am sorry to interrupt my noble friend, but the point I made—this now follows on from other remarks—was that these requirements have been in place for a long time, and we are seeing abuses. Therefore, I was hoping that my noble friend would be able to offer changes in the Bill that would put more emphasis on dealing with these breaches. Otherwise, as has been said, we look as though we are going backwards, not forwards.
As I said, a number of important points were raised there. First, I would not categorise the changes to Article 45 as watering down—they are intended to better focus the work of the ICO. Secondly, the important points raised with respect to Amendment 115 are points primarily relating to enforcement, and I will write to noble Lords setting out examples of where that enforcement has happened. I stress that the ICO is, as noble Lords have mentioned, an independent regulator that conducts the enforcement of this itself. What was described—I cannot judge for sure—certainly sounded like completely illegal infringements on the data privacy of those subjects. I am happy to look further into that and to write to noble Lords.
Amendment 116 seeks to remove a power allowing the Secretary of State to make regulations recognising additional transfer mechanisms. This power is necessary for the Government to react quickly to global trends and to ensure that UK businesses trading internationally are not held back. Furthermore, before using this power, the Secretary of State must be satisfied that the transfer mechanism is capable of meeting the new Article 46 data protection test. They are also required to consult the Information Commissioner and such other persons as they consider appropriate. The affirmative resolution procedure will also ensure appropriate parliamentary scrutiny.
I reiterate that the UK Government’s assessment of the reforms in the Bill is that they are compatible with maintaining adequacy. We have been proactively engaging with the European Commission since the start of the Bill’s consultation process to ensure that it understands our reforms and that we have a positive, constructive relationship. Noble Lords will appreciate that it is important that officials have the ability to conduct candid discussions during the policy-making process. However, I would like to reassure noble Lords once again that the UK Government take the matter of retaining our adequacy decisions very seriously.
Finally, Amendment 130 pertains to EU exit transitional provisions in Schedule 21 to the Data Protection Act 2018, which provide that certain countries are currently deemed as adequate. These countries include the EU and EEA member states and those countries that the EU had found adequate at the time of the UK’s exit from the EU. Such countries are, and will continue to be, subject to ongoing monitoring. As is the case now, if the Secretary of State becomes aware of developments such as changes to legislation or specific practices that negatively impact data protection standards, the UK Government will engage with the relevant authorities and, where necessary, amend or revoke data bridge arrangements.
For these reasons, I hope noble Lords will not press their amendments.
My Lords, I thank the Minister for his response, but I am still absolutely baffled as to why the Government are doing what they are doing on Article 45. The Minister has not given any particular rationale. He has given a bit of a rationale for resisting the amendments, many of which try to make sure that Article 45 is fully effective, that these international transfers are properly scrutinised and that we remain data adequate.
By the way, I thought the noble Lord, Lord Kirkhope, made a splendid entry into our debate, so I hope that he stays on for a number of further amendments—what a début.
The only point on which I disagreed with the noble Lord, Lord Bethell—as the noble Baroness, Lady Jones, said—was when he said that this is a terrific Bill. It is a terrifying Bill, not a terrific one, as we have debated. There are so many worrying aspects—for example, that there is no solution yet for sensitive special category data and the whole issue of these contractual clauses. The Government seem almost to be saying that it is up to the companies to assess all this and whether a country in which they are doing business is data adequate. That cannot be right. They seem to be abrogating their responsibility for no good reason. What is the motive? Is it because they are so enthusiastic about transfer of data to other countries for business purposes that they are ignoring the rights of data subjects?
The Minister resisted describing this as watering down. Why get rid of the list of considerations that the Secretary of State needs to have so that they are just in the mix as something that may or may not be taken into consideration? In the existing article they are specified. It is quite a long list and the Government have chopped it back. What is the motive for that? It looks like data subjects’ rights are being curtailed. We were baffled by previous elements that the Government have introduced into the Bill, but this is probably the most baffling of all because of the real importance of this—its national security implications and the existing examples, such as Yandex, that we heard about from the noble Lord, Lord Kirkhope.
Of course we understand that there are nuances and that there is a difference between adequacy and equivalence. We have to be pragmatic sometimes, but the question of whether these countries having data transferred to them are adequate must be based on principle. This seems to me a prime candidate for Report. I am sure we will come back to it, but in the meantime I beg leave to withdraw.
My Lords, I am grateful to the noble Lord, Lord Bethell, and his cosignatories for bringing this comprehensive amendment before us this afternoon. As we have heard, this is an issue that was debated at length in the Online Safety Act. It is, in effect, unfinished business. I pay tribute to the noble Lords who shepherded that Bill through the House so effectively. It is important that we tie up the loose ends of all the issues. The noble Lord made significant progress, but those issues that remain unresolved come, quite rightly, before us now, and this Bill is an appropriate vehicle for resolving those outstanding issues.
As has been said, the heart of the problem is that tech companies are hugely protective of the data they hold. They are reluctant to share it or to give any insight on how their data is farmed and stored. They get to decide what access is given, even when there are potentially illegal consequences, and they get to judge the risk levels of their actions without any independent oversight.
During the course of the Online Safety Bill, the issue was raised not only by noble Lords but by a range of respected academics and organisations representing civil society. They supported the cross-party initiative from Peers calling for more independent research, democratic oversight and accountability into online safety issues. In particular, as we have heard, colleagues identified a real need for approved researchers to check the risks of non-compliance in the regulated sectors of UK law by large tech companies—particularly those with large numbers of children accessing the services. This arose because of the increasing anecdotal evidence that children’s rights were being ignored or exploited. The noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, have given an excellent exposition of the potential and real harms that continue to be identified by the lack of regulatory action on these issues.
Like other noble Lords, I welcome this amendment. It is well-crafted, takes a holistic approach to the problem, makes the responsibilities of the large tech companies clear and establishes a systematic research base of vetted researchers to check compliance. It also creates important criteria for the authorisation of those vetted researchers: the research must be in the public interest, must be transparent, must be carried out by respected researchers, and must be free from commercial interests so that companies cannot mark their own homework. As has been said, it mirrors the provisions in the EU Digital Services Act and ensures comparable research opportunities. That is an opportunity for the UK to maintain its status as one of the top places in the world for expertise on the impact of online harms.
Since the Online Safety Act was passed, the Information Commissioner has been carrying out further work on the children’s code of practice. The latest update report says:
“There has been significant progress and many organisations have started to assess and mitigate the potential privacy risks to children on their platforms”.
That is all well and good but the ICO and other regulators are still reliant on the information provided by the tech companies on how their data is used and stored and how they mitigate risk. Their responsibilities would be made much easier if they had access to properly approved and vetted independent research information that could inform their decisions.
I am grateful to noble Lords for tabling this amendment. I hope that the Minister hears its urgency and necessity and that he can assure us that the Government intend to table a similar amendment on Report—as the noble Baroness, Lady Kidron, said, no more “wait and see”. The time has come to stop talking about this issue and take action. Like the noble Lord, Lord Clement-Jones, I was in awe of the questions that the noble Baroness came up with and do not envy the Minister in trying to answer them all. She asked whether, if necessary, it could be done via a letter, but I think that the time has come on this and some other issues to roll up our sleeves, get round the table and thrash it out. We have waited too long for a solution and I am not sure that exchanges of letters will progress this in the way we would hope. I hope that the Minister will agree to convene some meetings of interested parties—maybe then we will make some real progress.
My Lords, as ever, many thanks to all noble Lords who spoke in the debate.
Amendment 135, tabled by my noble friend Lord Bethell, would enable researchers to access data from data controllers and processors in relation to systemic risks to the UK and non-compliance with regulatory law. The regime would be overseen by the ICO. Let me take this opportunity to thank both my noble friend for the ongoing discussions we have had and the honourable Members in the other place who are also interested in this measure.
Following debates during the passage of the Online Safety Act, the Government have been undertaking further work in relation to access to data for online safety researchers. This work is ongoing and, as my noble friend Lord Bethell will be aware, the Government are having ongoing conversations on this issue. As he knows, the online safety regime is very broad and covers issues that have an impact on national security and fraud. I intend to write to the Committee with an update on this matter, setting out our progress ahead of Report, which should move us forward.
While we recognise the benefits of improving researchers’ access to data—for example, using data to better understand the impact of social media on users—this is a highly complex issue with several risks that are not currently well understood. Further analysis has reiterated the complexities of the issue. My noble friend will agree that it is vital that we get this right and that any policy interventions are grounded in the evidence base. For example, there are risks in relation to personal data protection, user consent and the disclosure of commercially sensitive information. Introducing a framework to give researchers access to data without better understanding these risks could have significant consequences for data security and commercially sensitive information, and could potentially destabilise any data access regime as it is implemented.
In the meantime, the Online Safety Act will improve the information available to researchers by empowering Ofcom to require major providers to publish a broad range of online safety information through annual transparency reports. Ofcom will also be able to appoint a skilled person to undertake a report to assess compliance or to develop its understanding of the risk of non-compliance and how to mitigate it. This may include the appointment of independent researchers as skilled persons. Further, Ofcom is required to conduct research into online harms and has the power to require companies to provide information to support this research activity.
Moving on to the amendment specifically, it is significantly broader than online safety and the EU’s parallel Digital Services Act regime. Any data controllers and processors would be in scope if they have more than 1 million UK users or customers, if there is a large concentration of child users or if the service is high-risk. This would include not just social media platforms but any organisation, including those in financial services, broadcasting and telecoms as well as any other large businesses. Although we are carefully considering international approaches to this issue, it is worth noting that much of the detail about how the data access provisions in the Digital Services Act will work in practice is yet to be determined. Any policy interventions in this space should be predicated on a robust evidence base, which we are in the process of developing.
The amendment would also enable researchers to access data to research systemic risks to compliance with any UK regulatory law that is upheld by the ICO, Ofcom, the Competition and Markets Authority, and the Financial Conduct Authority. The benefits and risks of such a broad regime are not understood and are likely to vary across sectors. It is also likely to be inappropriate for the ICO to be the sole regulator tasked with vetting researchers across the remits of the other regulators. The ICO may not have the necessary expertise to make this determination about areas of law that it does not regulate.
Ofcom already has the power to gather information that it requires for the purpose of exercising its online safety functions. This power applies to companies in scope of the duties and, where necessary, to other organisations or persons who may have relevant information. Ofcom can also issue information request notices to overseas companies as well as to UK-based companies. The amendment is also not clear about the different types of information that a researcher may want to access. It refers to a data controller and processors—concepts that relate to the processing of personal data under data protection law—yet researchers may also be interested in other kinds of data, such as information about a service’s systems and processes.
Although the Government continue to consider this issue—I look forward to setting out our progress between now and Report—for the reasons I have set out, I am not able to accept this amendment. I will certainly write to the Committee on this matter and to the noble Baroness, Lady Kidron, with a more detailed response to her questions—there were more than four of them, I think—in particular those about Ofcom.
Perhaps I could encourage the Minister to say at least whether he is concerned that a lack of evidence might be impacting on the codes and powers that we have given to Ofcom in order to create the regime. I share his slight regret that Ofcom does not have this provision that is in front of us. It may be that more than one regulator needs access to research data but it is the independents that we are talking about. We are not talking about Ofcom doing things and the ICO doing things. We are talking about independent researchers doing things so that the evidence exists. I would like to hear just a little concern that the regime is suffering from a lack of evidence.
I am thinking very carefully about how best to answer. Yes, I do share that concern. I will set this out in more detail when I write to the noble Baroness and will place that letter in the House of Lords Library. In the meantime, I hope that my noble friend will withdraw his amendment.
I am enormously grateful to the Minister for his response. However, it falls short of my hopes. Obviously, I have not seen the letter that he is going to send us, but I hope that the department will have taken on board the commitments made by previous Ministers during discussions on the Online Safety Bill and the very clear evidence that the situation is getting worse, not better.
Any hope that the tech companies would somehow have heard the debate in the House of Lords and that it would have occurred to them that they needed to step up to their responsibilities has, I am afraid, been dashed by their behaviours in the last 18 months. We have seen a serious withdrawal of existing data-sharing provisions. As we approach even more use of AI, the excitement of the metaverse, a massive escalation in the amount of data and the impact of their technologies on society, it is extremely sobering to think that there is almost no access to the black box of their data.
My Lords, I am grateful to the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, for tabling these amendments and raising important points about the Information Commissioner’s independence and authority to carry out his role efficiently. The amendments from the noble Lord, Lord Clement-Jones, range widely, and I have to say that I have more sympathy with some of them than others.
I start by welcoming some of the things in the Bill—I am very pleased to be able to do this. It is important that we have an independent regulator that is properly accountable to Parliament, and this is vital for a properly functioning data protection regime. We welcome a number of the changes that have been made to the ICO’s role in the Bill. In particular, we think the move to have a board and a chief executive model, with His Majesty appointing the chair of the board, is the right way to go. We also welcome the strengthening of enforcement powers and the obligation to establish stakeholder panels to inform the content of codes of practice. The noble Baroness, Lady Kidron, also highlighted that.
However, we share the concern of the noble Lord, Lord Clement-Jones, about the Secretary of State’s requirement every three years to publish a statement of strategic priorities for the commissioner to consider, respond to and have regard to. We share his view, and that of many stakeholder groups, that this crosses the line into political involvement and exposes the ICO to unwarranted political direction and manipulation. We do not believe that this wording provides sufficient safeguards from that in its current form.
I have listened carefully to the explanation of the noble Lord, Lord Clement-Jones, of Amendment 138. I understand his concern, but we are going in a slightly different direction to him on this. We believe that the reality is that the ICO does not have the resources to investigate every complaint. He needs to apply a degree of strategic prioritisation in the public interest. I think that the original wording in the Bill, rather than the noble Lord’s amendment, achieved that objective more clearly.
Amendment 140, in the name of the noble Lord, Lord Clement-Jones, raises a significant point about businesses being given assured advice to ensure that they follow the procedures correctly, and we welcome that proposal. There is a role for leadership of the ICO in this regard. His proposal also addresses the Government’s concern that data controllers struggle to understand how they should be applying the rules. This is one of the reasons for many of the changes that we have considered up until now. I hope that the Minister will look favourably on this proposal and agree that we need to give more support to businesses in how they follow the procedures.
Finally, I have added my name to the amendment of the noble Baroness, Lady Kidron, which rightly puts a deadline on the production of any new codes of practice, and a deadline on the application of any transitional arrangements which apply in the meantime. We have started using the analogy of the codes losing their champions, and in general terms she is right. Therefore, it is useful to have a deadline, and that is important to ensure delivery. This seems eminently sensible, and I hope the Minister agrees with this too.
Amendment 150 from the noble Baroness, Lady Kidron, also requires the ICO annual report to spell out specifically the steps being taken to roll out the age-appropriate design code and to specifically uphold children’s data rights. Going back to the codes losing their champions, I am sure that the Minister got the message from the noble Baronesses, Lady Kidron and Lady Harding, that in this particular case, this is not going to happen, and that this code and the drive to deliver it will be with us for some time to come.
The noble Baroness, Lady Kidron, raised concerns about the approach of the ICO, which need to be addressed. We do not want a short-term approach but a longer-term approach, and we want some guarantees that the ICO is going to address some of the bigger issues that are being raised by the age-appropriate design code and other codes. Given the huge interest in the application of children’s data rights in this and other Bills, I am sure that the Information Commissioner will want to focus his report on his achievements in this space. Nevertheless, for the avoidance of doubt, it is useful to have it in the Bill as a specific obligation, and I hope the Minister agrees with the proposal.
We have a patchwork of amendments here. I am strongly in support of some; on others, perhaps the noble Lord and I can debate further outside this Room. In the meantime, I am interested to hear what the Minister has to say.
I thank the noble Lord, Lord Clement-Jones, the noble Baroness, Lady Kidron, and other noble Lords who have tabled and signed amendments in this group. I also observe what a pleasure it is to be on a Committee with Batman and Robin—which I was not expecting to say, and which may be Hansard’s first mention of those two.
The reforms to the Information Commissioner’s Office within the Bill introduce a strategic framework of objectives and duties to provide context and clarity on the commissioner’s overarching objectives. The reforms also put best regulatory practice on to a statutory footing and bring the ICO’s responsibilities into line with that of other regulators.
With regard to Amendment 138, the principal objective upholds data protection in an outcomes-focused manner that highlights the discretion of the Information Commissioner in securing those objectives, while reinforcing the primacy of data protection. The requirement to promote trust and confidence in the use of data will encourage innovation across current and emerging technologies.
I turn now to the question of Clause 32 standing part. As part of our further reforms, the Secretary of State can prepare a statement of strategic priorities for data protection, which positions these aims within its wider policy agenda, thereby giving the commissioner helpful context for its activities. While the commissioner must take the statement into account when carrying out functions, they are not required to act in accordance with it. This means that the statement will not be used in a way to direct what the commissioner may and may not do when carrying out their functions.
Turning to Amendment 140, we believe that the commissioner should have full discretion to enforce data protection in an independent, flexible, risk-based and proportionate manner. This amendment would tie the hands of the regulator and force them to give binding advice and proactive assurance without necessarily full knowledge of the facts, undermining their regulatory enforcement role.
In response to the amendments concerning Clauses 33 to 35 standing part, I can say that we are introducing a series of measures to increase accountability, robustness and transparency in the codes of practice process, while safeguarding the Information Commissioner’s role. The requirements for impact assessments and a panel of experts mean that the codes will consider the application to, and impact on, all potential use cases. Given that the codes will have the force of law, the Secretary of State must have the ability to give her or his comments. The Information Commissioner is required to consider but not to act on those comments, preserving the commissioner’s independence. It remains for Parliament to give approval for any statutory code produced.
Amendments 142 and 143 impose a requirement on the ICO to prepare codes and for the Secretary of State to lay them in Parliament as quickly as practicable. They also limit the time that transitional provisions can be in place to a maximum of 12 months. This could mean that drafting processes are truncated or valid concerns are overlooked to hit a statutory deadline, rather than the codes being considered properly to reflect the relevant perspectives.
Given the importance of ensuring that any new codes are robust, comprehensive and considered, we do not consider imposing time limits on the production of codes to be a useful tool.
Finally, Amendment 150—
We had this debate during the passage of the Online Safety Act. In the end, we all agreed—the House, including the Government, came to the view—that two and a half years, which is 18 months plus a transition period, was an almost egregious amount of time considering the rate at which the digital world moves. So, to consider that more than two and a half years might be required seems a little bit strange.
I absolutely recognise the need for speed, and my noble friend Lady Harding made this point very powerfully as well, but what we are trying to do is juggle that need with the need to go through the process properly to design these things well. Let me take it away and think about it more, to make sure that we have the right balancing point. I very much see the need; it is a question of the machinery that produces the right outcome in the right timing.
Before the Minister sits down, I would very much welcome a meeting, as the noble Baroness, Lady Harding, suggested. I do not think it is useful for me to keep standing up and saying, “You are watering down the code”, and for the Minister to stand up and say, “Oh no, we’re not”. We are not in panto here, we are in Parliament, and it would be a fantastic use of all our time to sit down and work it out. I would like to believe that the Government are committed to data protection for children, because they have brought forward important legislation in this area. I would also like to believe that the Government are proud of a piece of legislation that has spread so far and wide—and been so impactful—and that they would not want to undermine it. On that basis, I ask the Minister to accede to the noble Baroness’s request.
I am very happy to try to find a way forward on this. Let me think about how best to take this forward.
My Lords, I thank the Minister for his response and, in particular, for that exchange. There is a bit of a contrast here—the mood of the Committee is probably to go with the grain of these clauses and to see whether they can be improved, rather than throw out the idea of an information commission and revert to the ICO on the basis that perhaps the information commission is a more logical way of setting up a regulator. I am not sure that I personally agree, but I understand the reservations of the noble Baroness, Lady Jones, and I welcome her support on the aspect of the Secretary of State power.
We keep being reassured by the Minister, in all sorts of different ways. I am sure that the spirit is willing, but whether it is all in black and white is the big question. Where are the real safeguards? The proposals in this group from the noble Baroness, Lady Kidron, to which she has spoken so well, along with the noble Baroness, Lady Harding, are very modest, to use the phrase from the noble Baroness, Lady Kidron. I hope those discussions will take place because they fit entirely with the architecture of the Bill, which the Government have set out, and it would be a huge reassurance to those who believe that the Bill is watering down data subject rights and is not strengthening children’s rights.
I am less reassured by other aspects of what the Minister had to say, particularly about the Secretary of State’s powers in relation to the codes. As the noble Baroness, Lady Kidron, said, we had a lot of discussion about that in relation to the Ofcom codes, under the Online Safety Bill, and I do not think we got very far on that either. Nevertheless, there is disquiet about whether the Secretary of State should have those powers. The Minister said that the ICO is not required to act in accordance with the advice of the Secretary of State so perhaps the Minister has provided a chink of light. In the meantime, I beg leave to withdraw the amendment.
My Lords, I have added my name to Amendment 146 in the name of the noble Baroness, Lady Kidron, and I thank all noble Lords who have spoken.
These days, most children learn to swipe an iPad long before they learn to ride a bike. They are accessing the internet at ever younger ages on a multitude of devices. Children are choosing to spend more time online, browsing social media, playing games and using apps. However, we also force children to spend an increasing amount of time online for their education. A growing trend over the last decade or more, this escalated during the pandemic. Screen time at home became lesson time; it was a vital educational lifeline for many in lockdown.
Like other noble Lords, I am not against edtech, but the reality is that the necessary speed of the transition meant that insufficient regard was paid to children’s rights and the data practices of edtech. The noble Baroness, Lady Kidron, as ever, has given us a catalogue of abuses of children’s data which have already taken place in schools, so there is a degree of urgency about this, and Amendment 146 seeks to rectify the situation.
One in five UK internet users are children. Schools are assessing their work online; teachers are using online resources and recording enormous amounts of sensitive data about every pupil. Edtech companies have identified that such a large and captive population is potentially profitable. This amendment reinforces that children are also a vulnerable population and that we must safeguard their data and personal information on this basis. Their rights should not be traded in as the edtech companies chase profits.
The code of practice proposed in this amendment establishes standards for companies to follow, in line with the fundamental rights and freedoms as set out in the UN Convention on the Rights of the Child. It asserts that they are entitled to a higher degree of protection than adults in the digital realm. It would oblige the commissioner to prepare a code of practice which ensures this. It underlines that consultation with individuals and organisations who have the best interests of children at heart is vital, so that the enormous edtech companies cannot bamboozle already overstretched teachers and school leaders.
In education, data has always been processed from children in school. It is necessary for the school’s functioning and to monitor the educational development of individual children. Edtech is now becoming a permanent fixture in children’s schooling and education, but it is largely untested, unregulated and unaccountable. Currently, it is impossible to know what data is collected by edtech providers and how they are using it. This blurs the boundaries between the privacy-preserving and commercial parts of services profiting from children’s data.
Why is this important? First, education data can reveal particularly sensitive and protected characteristics about children: their ethnicity, religion, disability or health status. Such data can also be used to create algorithms that profile children and predict or assess their academic ability and performance; it could reinforce prejudice, create siloed populations or entrench low expectations. Secondly, there is a risk that data-profiling children can lead to deterministic outcomes, defining too early what subjects a child is good at, how creative they are and what they are interested in. Safeguards must be put in place in relation to the processing of children’s personal data in schools to protect those fundamental rights. Thirdly, of course, is money. Data is appreciating in value, resulting in market pressure for data to be collected, processed, shared and reused. Increasingly, such data processed from children in schools is facilitated by edtech, an already major and expanding sector with a projected value of £3.4 billion.
The growth of edtech’s use in schools is promoted by the Department for Education’s edtech strategy, which sets out a vision for edtech to be an
“inseparable thread woven throughout the processes of teaching and learning”.
Yet the strategy gives little weight to data protection beyond noting the importance of preventing data breaches. Tech giants have become the biggest companies in the world because they own data on us. Schoolchildren have little choice as to their involvement with these companies in the classroom, so we have a moral duty to ensure that they are protected, not commodified or exploited, when learning. It must be a priority for the Government to keep emerging technologies in education under regular review.
Equally important is that the ICO should invest in expertise specific to the domain of education. By regularly reviewing emerging technologies—those already in use and those proposed for use—in education, and their potential risks and impacts, such experts could provide clear and timely guidance for schools to protect individual children and entire cohorts. Amendment 146 would introduce a new code of practice on the processing and use of children’s data by edtech providers. It would also ensure that edtech providers met their legal obligations under the law, protected children’s data and empowered schools.
I was pleased to hear that the noble Baroness, Lady Kidron, has had constructive discussions with the Education Minister, the noble Baroness, Lady Barran. The way forward on this matter is some sort of joint work between the two departments. The noble Baroness, Lady Kidron, said that she hopes the Minister today will respond with equal positivity; he could start by supporting the principles of this amendment. Beyond that, I hope that he will agree to liaise with the Department for Education and embrace the noble Baroness’s request for more meetings to discuss this issue on a joint basis.
I am grateful, as ever, to the noble Baroness, Lady Kidron, for both Amendment 146 and her continued work in championing the protection of children.
Let me start by saying that the Government strongly agree with the noble Baroness that all providers of edtech services must comply with the law when collecting and making decisions about the use of children’s data throughout the duration of their processing activities. That said, I respectfully submit that this amendment is not necessary, for the reasons I shall set out.
The ICO already has existing codes and guidance for children and has set out guidance about how the children’s code, data protection and e-privacy legislation apply to edtech providers. Although the Government recognise the value that ICO codes can have in promoting good practice and improving compliance, they do not consider that it would be appropriate to add these provisions to the Bill without further detailed consultation with the ICO and the organisations likely to be affected by them.
The guidance covers broad topics, including choosing a lawful basis for the processing; rules around information society services; targeting children with marketing; profiling children or making automated decisions about them; data sharing; children’s data rights; and exemptions relating to children’s data. Separately, as we have discussed throughout this debate, the age-appropriate design code deals specifically with the provision of online services likely to be accessed by children in the UK; this includes online edtech services. I am pleased to say that the Department for Education has begun discussions with commercial specialists to look at strengthening the contractual clauses relating to the procurement of edtech resources to ensure that they comply with the standards set out in the UK GDPR and the age-appropriate design code.
On the subject of requiring the ICO to develop a report with the edtech sector, with a view to creating a certification scheme and assessing compliance and conformity with data protection, we believe that such an approach should be at the discretion of the independent regulator.
The issues that have been raised in this very good, short debate are deeply important. Edtech is an issue that the Government are considering carefully—especially the Department for Education, given the increasing time spent online for education. I note that the DPA 2018 already contains a power for the Secretary of State to request new codes of practice, which could include one on edtech if the evidence warranted it. I would be happy to return to this in future but consider the amendment unnecessary at this time. For the reasons I have set out, I am not able to accept the amendment and hope that the noble Baroness will withdraw it.
I thank everyone who spoke, particularly for making it absolutely clear that not one of us, including myself, is against edtech. We just want it to be fair and want the rules to be adequate.
I am particularly grateful to the noble Baroness, Lady Jones, for detailing what education data includes. It might feel as though it is just about someone’s exam results or something that might already be public but it can include things such as how often they go to see the nurse, what their parents’ immigration status is or whether they are late. There is a lot of information quite apart from this personalised education provision, to which the noble Baroness referred. In fact, we have a great deal of emerging evidence that it has no pedagogical background to it. There is also the question of huge investment right across the sector in things where we do not know what they are. I thank the noble Baroness for that.
As to the Minister’s response, I hope that he will forgive me for being disappointed. I am grateful to him for reminding us that the Secretary of State has that power under the DPA 2018. I would love for her to use that power but, so far, it has not been forthcoming. The evidence we saw from the freedom of information request is that the scheme the department wanted to put in place has been totally retracted—and clearly for resource reasons rather than because it is not needed. I find it quite surprising that the Minister can suggest that it is all gung ho here in the UK but that Germany, Holland, France, et cetera are being hysterical in regard to this issue. Each one of them has found it to be egregious.
Finally, the AADC applies only to internet society services; there is an exception for education. Where they are joint controllers, they are outsourcing the problems to the schools, which have no level of expertise in this and just take default settings. It is not good enough, I am afraid. I feel bound to say this: I understand the needs of parliamentary business, which puts just a handful of us in this Room to discuss things out of sight, but, if the Government are not willing to protect children’s data at school, when they are in loco parentis to our children, I am really bewildered as to what this Bill is for. Education is widely understood to be a social good but we are downgrading the data protections for children and rejecting every single positive move that anybody has made in Committee. I beg leave to withdraw my amendment but I will bring this back on Report.
(8 months, 3 weeks ago)
Grand Committee
My Lords, I thank all noble Lords who have contributed to this debate. We have had a major common theme, which is that any powers exercised by the Secretary of State in Clause 14 should be to enhance, rather than diminish, the protections for a data subject affected by automated decision-making. We have heard some stark and painful examples of the way in which this can go wrong if it is not properly regulated. As noble Lords have said, this seems to be regulation on automated decision-making by the backdoor, but with none of the protections and promises that have been made on this subject.
Our Amendment 59 goes back to our earlier debate about rights at work when automated decision-making is solely or partly in operation. It provides an essential underpinning of the Secretary of State’s powers. The Minister has argued that ADM is a new development and that it would be wrong to be too explicit about the rules that should apply as it becomes more commonplace, but our amendment cuts through those concerns by putting key principles in the Bill. They are timeless principles that should apply regardless of advances in the adoption of these new technologies. They address the many concerns raised by workers and their representatives, about how they might be disfranchised or exploited by machines, and put human contact at the heart of any new processes being developed. I hope that the Minister sees the sense of this amendment, which will provide considerable reassurance for the many people who fear the impact of ADM in their working lives.
I draw attention to my Amendments 58 and 73, which implement the recommendations of the Delegated Powers and Regulatory Reform Committee. In the Bill, the new Articles 22A to 22D enable the Secretary of State to make further provisions about safeguards when automated decision-making is in place. The current wording of new Article 22D makes it clear that regulations can be amended
“by adding or varying safeguards”.
The Delegated Powers Committee quotes the department saying that
“it does not include a power to remove safeguards provided in new Article 22C and therefore cannot be exercised to weaken the protections”
afforded to data subjects. The committee is not convinced that the department is right about this, and we agree with its analysis. Surely “vary” means that the safeguards can move in either direction—to improve or reduce protection.
The committee also flags up concerns that the Bill’s amendments to Sections 49 and 50 of the Data Protection Act make specific provision about the use of automated decision-making in the context of law enforcement processing. In this new clause, there is an equivalent wording, which is that the regulations may add or vary safeguards. Again, we agree with its concerns about the application of these powers to the Secretary of State. It is not enough to say that these powers are subject to the affirmative procedure because, as we know and have discussed, the limits on effective scrutiny of secondary legislation are manifest.
We have therefore tabled Amendments 58 and 73, which make it much clearer that the safeguards cannot be reduced by the Secretary of State. The noble Lord, Lord Clement-Jones, has a number of amendments with a similar intent, which is to ensure that the Secretary of State can add new safeguards but not remove them. I hope the Minister is able to commit to taking on board the recommendations of the Delegated Powers Committee in this respect.
The noble Baroness, Lady Kidron, once again made the powerful point that the Secretary of State’s powers to amend the Data Protection Act should not be used to reduce the hard-won standards and protections for children’s data. As she says, safeguards do not constitute a right, and having regard to the issues is a poor substitute for putting those rights back into the Bill. So I hope the Minister is able to provide some reassurance that these hard-won rights will be put back into the Bill, where they belong.
I am sorry that the noble Lord, Lord Holmes, is not here. His amendment raises an important point about the need to build in the views of the Information Commissioner, which is a running theme throughout the Bill. He makes the point that we need to ensure, in addition, that a proper consultation of a range of stakeholders goes into the Secretary of State’s deliberations on safeguards. We agree that full consultation should be the hallmark of the powers that the Secretary of State is seeking, and I hope the Minister can commit to taking those amendments on board.
I echo the specific concerns of the noble Lord, Lord Clement-Jones, about the impact assessment and the supposed savings from changing the rules on subject access requests. This is not specifically an issue for today’s debate but, since it has been raised, I would like to know whether he is right that the savings are estimated to be 50% and not 1%, which the Minister suggested when we last debated this. I hope the Minister can clarify this discrepancy on the record, and I look forward to his response.
I thank the noble Lords, Lord Clement-Jones and Lord Knight, my noble friend Lord Holmes and the noble Baronesses, Lady Jones, Lady Kidron and Lady Bennett—
I apologise to my noble friend. I cannot be having a senior moment already—we have only just started. I look forward to reading that part in Hansard.
I can reassure noble Lords that data subjects still have the right to object to solely automated decision-making. It is not an absolute right in all circumstances, but I note that it never has been. The approach taken in the Bill complements the UK’s AI regulation framework, and the Government are committed to addressing the risks that AI poses to data protection and wider society. Following the publication of the AI regulation White Paper last year, the Government started taking steps to establish a central AI risk function that brings together policymakers and AI experts with the objective of identifying, assessing and preparing for AI risks. To track identified risks, we have established an initial AI risk register, which is owned by the central AI risk function. The AI risk register lists individual risks associated with AI that could impact the UK, spanning national security, defence, the economy and society, and outlines their likelihood and impact. We have also committed to engaging on and publishing the AI risk register in spring this year.
I am processing what the Minister has just said. He said it complements the AI regulation framework, and then he went on to talk about the central risk function, the AI risk register and what the ICO is up to in terms of guidance, but I did not hear that the loosening of safeguards or rights under Clause 14 and Article 22 of the GDPR was heralded in the White Paper or the consultation. Where does that fit with the Government’s AI regulation strategy? There is a disjunct somewhere.
I reject the characterisation of Clause 14 or any part of the Bill as loosening the safeguards. It focuses on the outcomes and by being less prescriptive and more adaptive, its goal is to heighten the levels of safety of AI, whether through privacy or anything else. That is the purpose.
On Secretary of State powers in relation to ADM, the reforms will enable the Government to further describe what is and is not to be taken as a significant effect on a data subject and what is and is not to be taken as meaningful human—
I may be tired or just not very smart, but I am not really sure that I understand how being less prescriptive and more adaptive can heighten safeguards. Can my noble friend the Minister elaborate a little more and perhaps give us an example of how that can be the case?
Certainly. Being prescriptive and applying one-size-fits-all measures for all processes covered by the Bill encourages organisations to follow a process, but focusing on outcomes encourages organisations to take better ownership of the outcomes and pursue the optimal privacy and safety mechanisms for those organisations. That is guidance that came out very strongly in the Data: A New Direction consultation. Indeed, in the debate on a later group we will discuss the use of senior responsible individuals rather than data protection officers, which is a good example of removing prescriptiveness to enhance adherence to the overall framework and enhance safety.
This seems like a very good moment to ask whether, if the variation is based on outcome and necessity, the Minister agrees that the higher bar of safety for children should be specifically required as an outcome.
I absolutely agree about the outcome of higher safety for children. We will come to debate whether the mechanism for determining or specifying that outcome is writing that down specifically, as suggested.
I am sure the Minister knew I was going to stand up to say that, if it is not part of the regulatory instruction, it will not be part of the outcome. The point of regulation is to determine a floor—never a ceiling—below which people cannot go. Therefore, if we wish to safeguard children, we must have that floor as part of the regulatory instruction.
Indeed. That may well be the case, but how that regulatory instruction is expressed can be done in multiple ways. Let me continue; otherwise, I will run out of time.
I am having a senior moment as well. Where are the outcomes written? What are we measuring this against? I like the idea; it sounds great—management terminology—but I presume that it is written somewhere and that we could easily add children’s rights to the outcomes as the noble Baroness suggests. Where are they listed?
I am sorry, but I just do not accept that intervention. This is one of the most important clauses in the whole Bill and we have to spend quite a bit of time teasing it out. The Minister has just electrified us all in what he said about the nature of this clause, what the Government are trying to achieve and how it fits within their strategy, which is even more concerning than previously. I am very sorry, but I really do not believe that this is the right point for the Whip to intervene. I have been in this House for 25 years and have never seen an intervention of that kind.
Let me make the broad point that there is no single list of outcomes for the whole Bill but, as we go through clause by clause, I hope the philosophy behind it, of being less prescriptive about process and more prescriptive about the results of the process that we desire, should emerge—not just on Clause 14 but as the overall philosophy underlying the Bill. Regulation-making powers can also be used to vary the existing safeguards, add additional safeguards and remove additional safeguards added at a later date.
On the point about having regard, it is important that the law is drafted in a way that allows it to adapt as technology advances. Including prescriptive requirements in the legislation reduces this flexibility and undermines the purpose of this clause and these powers to provide additional legal clarity when it is deemed necessary and appropriate in the light of the fast-moving advances in and adoption of technologies relevant to automated decision-making. I would like to reassure noble Lords that the powers can be used only to vary the existing safeguards, add additional ones and remove those added by regulation. They cannot remove any of the safeguards written into the legislation.
Amendments 53 to 55 and 69 to 71 concern the Secretary of State powers relating to the terms “significant decisions” and “meaningful human involvement”. These powers enable the Secretary of State to provide a description of decisions that do or do not have a significant effect on data subjects, and describe cases that can be taken to have, or not to have, meaningful human involvement. As technology adoption grows and new technologies emerge, these powers will enable the Government to provide legal clarity, if and when deemed necessary, to ensure that people are protected and have access to safeguards when they matter most. In respect of Amendment 59A, Clause 50 already provides for an overarching requirement for the Secretary of State to consult the ICO and other persons the Secretary of State considers appropriate before making regulations under the UK GDPR, including for the measures within Article 22.
Also, as has been observed—I take the point about the limitations of this, but I would like to make the point anyway—any changes to the regulations are subject to the affirmative procedure and so must be approved by both Houses. As with other provisions of the Bill, the ICO will seek to provide organisations with timely guidance and support to assist them in interpreting and applying the legislation. As such, I would ask the noble Lord, Lord Clement-Jones, and my noble friend Lord Holmes—were he here—not to press their amendments.
Amendment 57 in the name of the noble Baroness, Lady Kidron, seeks to ensure that, when exercising regulation-making powers in relation to the safeguards in Article 22 of the UK GDPR, the Secretary of State should uphold the level of protection that children are entitled to in the Data Protection Act 2018. As I have said before, Clause 50 requires the Secretary of State to consult the ICO and other persons he or she considers appropriate. The digital landscape and its technologies evolve rapidly, presenting new challenges in safeguarding children. Regular consultations with the ICO and stakeholders ensure that regulations remain relevant and responsive to emerging risks associated with solely automated decision-making. The ICO has a robust position on the protection of children, as evidenced through its guidance and, in particular, the age-appropriate design code. As such, I ask the noble Baroness not to press her amendment.
Amendments 58, 72 and 73 seek to prevent the Secretary of State varying any of the safeguards mentioned in the reformed clauses. As I assured noble Lords earlier, the powers in this provision can be used only to vary the existing safeguards, add additional safeguards and remove additional safeguards added by regulation in future; there is not a power to remove any of the safeguards.
I apologise for breaking the Minister’s flow, especially as he had moved on a little, but I have a number of questions. Given the time, perhaps he can write to me to answer them specifically. They are all designed to show the difference between what children now have and what they will have under the Bill.
I have to put on the record that I do not accept what the Minister just said—that, without instruction, the ICO can use its old instruction to uphold the current safety for children—if the Government are taking the instruction out of the Bill and leaving it with the old regulator. I ask the Minister to tell the Committee whether it is envisaged that the ICO will have to rewrite the age-appropriate design code to marry it with the new Bill, rather than it being the reason why it is upheld. I do not think the Government can have it both ways where, on the one hand, the ICO is the keeper of the children, and, on the other, they take out things that allow the ICO to be the keeper of the children in this Bill.
I absolutely recognise the seriousness and importance of the points made by the noble Baroness. Of course, I would be happy to write to her and meet her, as I would be for any Member in the Committee, to give—I hope—more satisfactory answers on these important points.
As an initial clarification before I write, it is perhaps worth me saying that the ICO has a responsibility to keep guidance up to date but, because it is an independent regulator, it is not for the Government to prescribe this, only to allow it to do so for flexibility. As I say, I will write and set out that important point in more detail.
Amendment 59 relates to workplace rights. I reiterate that the existing data protection legislation and our proposed reforms—
Has the Minister moved on from our Amendments 58 and 59? He was talking about varying safeguards. I am not quite sure where he is.
It is entirely my fault; when I sit down and stand up again, I lose my place.
We would always take the views of the DPRRC very seriously on that. Clearly, the Bill is being designed without the idea in mind of losing or diminishing any of those safeguards; otherwise, it would have simply said in the Bill that we could do that. I understand the concern that, by varying them, there is a risk that they would be diminished. We will continue to find a way to take into account the concerns that the noble Baroness has set out, along with the DPRRC. In the interim, let me perhaps provide some reassurance that that is, of course, not the intention.
I feel under amazing pressure to get the names right, especially given the number of hours we spend together.
I thank the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, for tabling Amendments 74 to 78, 144 and 252 in this group. I also extend my thanks to noble Lords who have signed the amendments and spoken so eloquently in this debate.
Amendments 74 to 78 would place a legislative obligation on public authorities and all persons in the exercise of a public function to publish reports under the Algorithmic Transparency Recording Standard—ATRS—or to publish algorithmic impact assessments. These would provide information on algorithmic tools and algorithm-assisted decisions that process personal data in the exercise of a public function or those that have a direct or indirect public effect or directly interact with the general public. I remind noble Lords that the UK’s data protection laws will continue to apply throughout the processing of personal data.
The Government are already taking action to establish the necessary guard-rails for AI, including to promote transparency. In the AI regulation White Paper response, we announced that the use of the ATRS will now become a requirement for all government departments and the broader public sector. The Government are phasing this in as we speak and will check compliance accordingly, as DSIT has been in contact with every department on this issue.
In making this policy, the Government are taking an approach that provides increasing degrees of mandation of the ATRS, with appropriate exemptions, allowing them to monitor compliance and effectiveness. The announcement in the White Paper response has already led to more engagement from across government, and more records are under way. The existing process focuses on the importance of continuous improvement and development. Enshrining the standard into law prematurely, amid exponential technological change, could hinder its adaptability.
More broadly, our AI White Paper outlined a proportionate and adaptable framework for regulating AI. As part of that, we expect AI development and use to be fair, transparent and secure. We set out five key principles for UK regulators to interpret and apply within their remits. This approach reflects the fact that AI systems are not unregulated and need to be compliant with existing regulatory frameworks, including employment, human rights, health and safety and data protection law.
For instance, the UK’s data protection legislation imposes obligations on data controllers, including providers and users of AI systems, to process personal data fairly, lawfully and transparently. Our reforms in this Bill will ensure that, where solely automated decision-making is undertaken—that is, ADM without any meaningful human involvement that has significant effects on data subjects—data subjects will have a right to the relevant safeguards. These safeguards include being provided with information on the ADM that has been carried out and the right to contest those decisions and seek human review, enabling controllers to take suitable measures to correct those that have produced wrongful outcomes.
My Lords, I wonder whether the Minister can comment on this; he can write if he needs to. Is he saying that, in effect, the ATRS is giving the citizen greater rights than are ordinarily available under Article 22? Is that the actual outcome? If, for instance, every government department adopted ATRS, would that, in practice, give citizens a greater degree of what he might put as safeguards but, in this context, he is describing as rights?
I am very happy to write to the noble Lord, but I do not believe that the existence of an ATRS-generated report in and of itself confers more rights on anybody. Rather, it makes it easier for citizens to understand how their rights are being used, what rights they have, or what data about them is being used by the department concerned. The existence of data does not in and of itself confer new rights on anybody.
I understand that, but if he rewinds the reel he will find that he was talking about the citizen’s right of access, or something of that sort, at that point. Once you know what data is being used, the citizen has certain rights. I do not know whether that follows from the ATRS or he was just describing that at large.
As I said, I will write. I do not believe that follows axiomatically from the ATRS’s existence.
On Amendment 144, the Government are sympathetic to the idea that the ICO should respond to new and emerging technologies, including the use of children’s data in the development of AI. I assure noble Lords that this area will continue to be a focus of the ICO’s work and that it already has extensive powers to provide additional guidance or make updates to the age-appropriate design code, to ensure that it reflects new developments, and a responsibility to keep it up to date. The ICO has a public task under Article 57(1)(b) of the UK GDPR to
“promote public awareness and understanding of the risks, rules, safeguards and rights in relation to processing”.
It is already explicit that:
“Activities addressed specifically to children shall receive specific attention”.
That code already includes a chapter on profiling and provides guidance on fairness and transparency requirements around automated decision-making.
Taking the specific point made by the noble Baroness, Lady Kidron, on the contents of the ICO’s guidance, while I cannot speak to the ICO’s decisions about the drafting of its guidance, I am content to undertake to speak to it about this issue. I note that it is important to be careful to avoid a requirement for the ICO to duplicate work. The creation of an additional children’s code focused on AI could risk fragmenting approaches to children’s protections in the existing AADC—a point made by the noble Baroness and by my noble friend Lady Harding.
I have a question on this. If the Minister is arguing that this should be by way of amendment of the age-related code, would there not be an argument for giving that code some statutory effect?
On that point, I think that the Minister said—forgive me if I am misquoting him—risk, rules and rights, or some list to that effect. While the intention of what he said was that we have to be careful where children are using it, and the ICO has to make them aware of the risks, the purpose of a code—whether it is part of the AADC or stand-alone—is to put those responsibilities on the designers of service products and so on by default. It is upstream where we need the action, not downstream, where the children are.
Yes, I entirely agree with that, but I add that we need it upstream and downstream.
For the reasons I have set out, the Government do not believe that it would be appropriate to add these provisions to the Bill at this time without further detailed consultation with the ICO and the other organisations involved in regulating AI in the United Kingdom. Clause 33—
Can we agree that there will be some discussions with the ICO between now and Report? If those take place, I will not bring this point back on Report unnecessarily.
Yes, I am happy to commit to that. As I said, we look forward to talking with the noble Baroness and others who take an interest in this important area.
Clause 33 already includes a measure that would allow the Secretary of State to request the ICO to publish a code on any matter that she sees fit, so this is an issue that we could return to in the future, if the evidence supports it, but, as I said, we consider the amendments unnecessary at this time.
Finally, Amendment 252 would place a legislative obligation on the Secretary of State regularly to publish address data maintained by local authorities under open terms—that is, accessible by anyone for any purpose and for free. High-quality, authoritative address data for the UK is currently used by more than 50,000 public and private sector organisations, which demonstrates that current licensing arrangements are not prohibitive. This data is already accessible for a reasonable fee from local authorities and Royal Mail, with prices starting at 1.68p per address or £95 for national coverage.
Some 50,000 organisations access that information, but do the Government have any data on it? I am not asking for it now, but maybe the Minister could go away and have a look at this. We have heard that other countries have opened up this data. Are they seeing an increase? That is just a number; it does not tell us how many people are denied access to the data.
We have some numbers that I will come to, but I am very happy to share deeper analysis of that with all noble Lords.
There is also free access to this data for developers to innovate in the market. The Government also make this data available for free at the point of use to more than 6,000 public sector organisations, as well as postcode, unique identifier and location data available under open terms. The Government explored opening address data in 2016. At that time, it became clear that the Government would have to pay to make this data available openly or to recreate it. That was previously attempted, and the resulting dataset had, I am afraid, critical quality issues. As such, it was determined at that time that the changes would result in significant additional cost to taxpayers and represent low value for money, given the current widespread accessibility of the data. For the reasons I have set out, I hope that the noble Lords will withdraw their amendments.
My Lords, I thank the Minister for his response. There are a number of different elements to this group.
The one bright spot in the White Paper consultation is the ATRS. That was what the initial amendments in this group were designed to give a fair wind to. As the noble Lord, Lord Bassam, said, this is designed to assist in the adoption of the ATRS, and I am grateful for his support on that.
My Lords, I thank all noble Lords who have contributed to this very wide-ranging debate. Our amendments cover a lot of common ground, and we are in broad agreement on most issues, so I hope noble Lords will bear with me if I primarily focus on the amendments that I have tabled, although I will come back to other points.
We have given notice of our intention to oppose Clause 16 standing part of the Bill, which is similar to Amendment 80 tabled by the noble Lord, Lord Clement-Jones, which probes why the Government have found it necessary to remove the requirement that companies outside the UK should appoint a representative within the UK. The current GDPR rules apply to all those active in the UK market, regardless of whether their organisation is based or located in the UK. The intention is that the representative will ensure UK compliance and act as a primary source of contact for data subjects. Without this clause, data subjects will be forced to deal with overseas data handlers, with all the cultural and language barriers that might ensue. There is no doubt that this will limit their rights to apply UK data standards.
In addition, as my colleagues in the Commons identified, the removal of the provisions in Clause 16 was not included in the Government’s consultation, so stakeholders have not had the chance to register some of the many practical concerns that they feel will arise from this change. There is also little evidence that compliance with Article 27 is an unnecessary barrier to responsible data use by reputable overseas companies. Again, this was a point made by the noble Lord, Lord Clement-Jones. In fact, the international trend is for more countries to add a representative obligation to their data protection laws, so we are becoming outriders on the global stage.
Not only is this an unnecessary change but, compared to other countries, it will send a signal that our data protection rights are being eroded in the UK. Of course, this raises the spectre of the EU revisiting whether our UK adequacy status should be retained. It also has implications for the different rules that might apply north and south of the border in Ireland so, again, if we are moving away from the standard rules applied by other countries, this has wider implications that we need to consider.
For many reasons, I challenge the Government to explain why this change was felt to be necessary. The noble Lord, Lord Clement-Jones, talked about whether the cost was really a factor. It did not seem that there were huge costs, compared to the benefits of maintaining the current system, and I would like to know in more detail why the Government are doing this.
Our Amendments 81 and 90 seek to ensure that there is a definition of “high-risk processing” in the Bill. The current changes in Clauses 17 and 20 have the effect of watering down data controllers’ responsibilities, from carrying out data protection impact assessments to assessing high-risk processing on the basis of whether it was necessary and what risks are posed. But nowhere does it say what constitutes high-risk processing—it is left to individual organisations to make that judgment—and nowhere does it explain what “necessary” means in this context. Is it also expected to be proportionate, as in the existing standards? This lack of clarity has caused some consternation among stakeholders.
The Equality and Human Rights Commission argues that the proposed wording means that
“data controllers are unlikely to go beyond minimum requirements”,
so the wording needs to be more explicit. It also recommends that
“the ICO be required to provide detailed guidance on how ‘the rights and freedoms of individuals’ are to be considered in an Assessment of High Risk Processing”.
More crucially, the ICO has written to Peers, saying that the Bill should contain a list of
“activities that government and Parliament view as high-risk processing, similar to the current list set out at Article 35(3) of the UK GDPR”.
This is what our Amendments 81 and 90 aim to achieve. I hope the Minister can agree to take these points on board and come back with amendments to achieve this.
The ICO also makes the case for future-proofing the way in which high-risk processing is regulated by making a provision in the Bill for the ICO to further designate high-risk processing activities with parliamentary approval. This would go further than the current drafting of Clause 20, which contains powers for the ICO to give examples of high-risk profiling, but only for guidance. Again, I hope that the Minister can agree to take these points on board and come back with suitable amendments.
Our Amendments 99, 100 and 102 specify the need for wider factors in the proposed risk assessment list to ensure that it underpins our equality laws. Again, this was an issue about which stakeholders have raised concerns. The TUC and the Institute for the Future of Work make the point that data protection impact assessments are a crucial basis for consultation with workers and trade unions about the use of technology at work, and this is even more important as the complexities of AI come on stream. The Public Law Project argues that, without rigorous risk and impact analysis, disproportionate and discriminatory processes could be carried out before the harm comes to light.
The Equality and Human Rights Commission argues that data protection impact assessments
“provide a key mechanism for ensuring equality impacts are assessed when public and private sector organisations embed AI systems in their operations”.
It specifically recommends that express references in Article 35(7) of GDPR to “legitimate interests” and
“the rights and freedoms of data subjects”,
as well as the consultation obligations in Article 35(2), should be retained. I hope that the Minister can agree to take these recommendations on board and come back with suitable amendments to ensure that our equalities legislation is protected.
Our Amendments 106 and 108 focus on the particular responsibilities of data controllers to handle health data with specific obligations. This is an issue that we know, from previous debates, is a major cause for concern among the general public, who would be alarmed if they thought that the protections were being weakened.
The BMA has raised concerns that Clauses 20 and 21 will water down our high standards of data governance, which are necessary when organisations are handling health data. As it says,
“Removing the requirement to conduct a thorough assessment of risks posed to health data is likely to lead to a less diligent approach to data protection for individuals”.
It also argues that removing the requirement for organisations to consult the ICO on high-risk processing is,
“a backward step from good governance … when organisations are processing large quantities of sensitive health data”.
Our amendments aim to address these concerns by specifying that, with regard to specific cases, such as the handling of health data, prior consultation with the ICO should remain mandatory. I hope that the Minister will see the sense in these amendments and recognise that further action is needed in this Bill to maintain public trust in how health data is managed for individual care and system-wide scientific development.
I realise that we have covered a vast range of issues, but I want to touch briefly on those raised by the noble Baroness, Lady Kidron. She is right that, in particular, applications of risk assessments by public bodies should be maintained, and we agree with her that Article 35’s privacy-by-design requirements should be retained. She once again highlighted the downgrading of children’s rights in this Bill, whether by accident or intent, and we look forward to seeing the exchange of letters with the Minister on this. I hope that we will all be copied in and that the Minister will take on board the widespread view that we should have more engagement on this before Report, because there are so many outstanding issues to be resolved. I look forward to the Minister’s response.
I thank the noble Baronesses, Lady Kidron and Lady Jones, and the noble Lord, Lord Clement-Jones, for their amendments, and I look forward to receiving the letter from the noble Baroness, Lady Kidron, which I will respond to as quickly as I can. As everybody observed, this is a huge group, and it has been very difficult for everybody to do justice to all the points. I shall do my best, but these are points that go to the heart of the changes we are making. I am very happy to continue engaging on that basis, because we need plenty of time to review them—but, that said, off we go.
The changes the Government are making to the accountability obligations are intended to make the law clearer and less prescriptive. They will enable organisations to focus on areas that pose high risks to people resulting, the Government believe, in improved outcomes. The new provisions on assessments of high-risk processing are less prescriptive about the precise circumstances in which a risk assessment would be required, as we think organisations are best placed to judge whether a particular activity poses a high risk to individuals in the context of the situation.
However, the Government are still committed to high standards of data protection, and there are many similarities between our new risk assessment measures and the previous provisions. When an organisation is carrying out processing activities that are likely to pose a high risk to individuals, it will still be expected to document that processing, assess risks and identify mitigations. As before, no such document would be required where organisations are carrying out low-risk processing activities.
One of the main aims of the Bill is to remove some of the UK GDPR’s unnecessary compliance burdens. That is why organisations will be required to designate senior responsible individuals, keep records of processing and carry out the risk assessments above only when their activities pose high risks to individuals.
The noble Viscount is very interestingly unpacking a risk-based approach to data protection under the Bill. Why are the Government not taking a risk-based approach to their AI regulation? After all, the AI Act approaches it in exactly that way.
That is a very interesting question, but I am not sure that there is a read-across between the AI Act and our approach here. The fundamental starting point was that, although the provisions of the original GDPR are extremely important, the burdens of compliance were not proportionate to the results. The overall foundation of the DPDI is, while at least maintaining existing levels of protection, to reduce the burdens of demonstrating or complying with that regulation. That is the thrust of it—that is what we are trying to achieve—but noble Lords will have different views about how successful we are being at either of those. It is an attempt to make it easier to be safe and to comply with the regulations of the DPDI and the other Acts that govern data protection. That is where we are coming from and the thrust of what we are trying to achieve.
I note that, as we have previously discussed, children need particular protection when organisations are collecting and processing their personal data.
I did not interrupt before because I thought that the Minister would say more about the difference between high-risk and low-risk processing, but he is going on to talk about children. One of my points was about the request from the Information Commissioner—it is very unusual for him to intervene. He said that a list of high-risk processing activities should be set out in the Bill. I do not know whether the Minister was going to address that important point.
I will briefly address it now. Based on that letter, the Government’s view is to avoid prescription and I believe that the ICO’s view—I cannot speak for it—is generally the same, except for a few examples where prescription needs to be specified in the Bill. I will continue to engage with the ICO on where exactly to draw that line.
My Lords, I can see that there is a difference of opinion, but it is unusual for a regulator to go into print with it. Not only that, but he has set it all out in an annexe. What discussion is taking place directly between the Minister and his team and the ICO? There seems to be quite a gulf between them. This is number 1 among his “areas of ongoing concern”.
I do not know whether it is usual or unusual for the regulator to engage in this way, but the Bill team engages with the Information Commissioner frequently and regularly, and, needless to say, it will continue to do so on this and other matters.
Children need particular protection when organisations are collecting and processing their personal data, because they may be less aware of the risks involved. If organisations process children’s personal data, they should think about the need to protect them from the outset and design their systems and processes with this in mind.
Before I turn to the substance of what the Bill does with the provisions on high-risk processing, I will deal with the first amendment in this group: Amendment 79. It would require data processors to consider data protection-by-design requirements in the same way that data controllers do, because there is a concern that controllers may not always be able to foresee what processors do with people’s data for services such as AI and cloud computing.
However, under the current legislation, it should not be for the processor to determine the nature or purposes of the processing activity, as it will enter a binding controller-processor agreement or contract to deliver a specific task. Processors also have specific duties under the UK GDPR to keep personal data safe and secure, which should mean that this amendment is not necessary.
I turn to the Clause 16 stand part notice, which seeks to remove Clause 16 from the Bill and reinstate Article 27, and Amendment 80, which seeks to do the same but just in respect of overseas data controllers, not processors. I assure the noble Lord, Lord Clement-Jones, that, even without the Article 27 representative requirement, controllers and processors will still have to maintain contact and co-operation with UK data subjects and the ICO to comply with the UK GDPR provisions. These include Articles 12 to 14, which, taken together, require controllers to provide their contact details in a concise, transparent, intelligible and easily accessible form, using clear and plain language, particularly for any information addressed specifically to a child.
By offering firms a choice on whether to appoint a representative in the UK to help them with UK GDPR compliance and no longer mandating organisations to appoint a representative, we are allowing organisations to decide for themselves the best way to comply with the existing requirements for effective communication and co-operation. Removing the representative requirement will also reduce unnecessary burdens on non-UK controllers and processors while maintaining data subjects’ safeguards and rights. Any costs associated with appointing a representative are a burden on and a barrier to trade. Although the variety of packages made available by representative provider organisations differ, our assessments show that the cost of appointing representatives increases with the size of a firm. Furthermore, there are several jurisdictions that do not have a mandatory or equivalent representative requirement in their data protection law, including other countries in receipt of EU data adequacy decisions.
Nevertheless, does the Minister accept that quite a lot of countries have now begun the process of requiring representatives to be appointed? How does he account for that? Does he accept that what the Government are doing is placing the interests of business over those of data subjects in this context?
No, I do not accept that at all. I would suggest that we are saying to businesses, “You must provide access to the ICO and data subjects in a way that is usable by all parties, but you must do so in the manner that makes the most sense to you”. That is a good example of going after outcomes but not insisting on any particular process or methodology in a one-size-fits-all way.
The Minister mentioned the freedom to choose the best solution. Would it be possible for someone to be told that their contact was someone who spoke a different language to them? Do they have to be able to communicate properly with the data subjects in this country?
Yes—if the person they were supposed to communicate with did not speak English or was not available during reasonable hours, that would be in violation of the requirement.
I apologise if we briefly revisit some of our earlier discussion here, but Amendment 81 would reintroduce a list of high-risk processing activities drawn from Article 35 of the UK GDPR, with a view to helping data controllers comply with the new requirements around designating a senior responsible individual.
The Government have consulted closely with the ICO throughout the development of all the provisions in the Bill, and we welcome its feedback as it upholds data subjects’ rights. We recognise and respect that the ICO’s view on this issue is different to the Government’s, but the Government feel that adding a prescriptive list to the legislation would not be appropriate for the reasons we have discussed. However, as I say, we will continue to engage with it over the course of the passage of the Bill.
Some of the language in Article 35 of the UK GDPR is unclear and confusing, which is partly why we removed it in the first place. We believe organisations should have the ability to make a judgment of risk based on the specific nature, scale and context of their own processing activities. We do not need to provide prescriptive examples of high-risk processing on the face of legislation because any list could quickly become out of date. Instead, to help data controllers, Clause 20 requires the ICO to produce a document with examples of what the commissioner considers to be high-risk processing activities.
I turn to Clause 17 and Amendment 82. The changes we are making in the Bill will reduce prescription by removing the requirement to appoint a data protection officer in certain circumstances. Instead, public bodies and other organisations carrying out high-risk processing activities will have to designate a senior responsible individual to ensure that data protection risks are managed effectively within their organisations. That person will have flexibility about how they manage data protection risks. They might decide to delegate tasks to independent data protection experts or upskill existing staff members, but they will not be forced to appoint data protection officers if suitable alternatives are available.
The primary rationale for moving to a senior responsible individual model is to embed data protection at the heart of an organisation by ensuring that someone in senior management takes responsibility and accountability for it if the organisation is a public body or is carrying out high-risk processing. If organisations have already appointed data protection officers and want to keep an independent expert to advise them, they will be free to do so, providing that they also designate a senior manager to take overall accountability and provide sufficient support, including resources.
Amendment 83, tabled by the noble Baroness, Lady Kidron, would require the senior responsible individual to specifically consider the risks to children when advising the controller on its responsibilities. As drafted, Clause 17 of the Bill requires the senior responsible individual to perform a number of tasks or, if they cannot do so themselves, to make sure that they are performed by another person. They include monitoring the controller’s compliance with the legislation, advising the controller of its obligations and organising relevant training for employees who carry out the processing of personal data. Where the organisation is processing children’s data, all these requirements will be relevant. The senior responsible individual will need to make sure that any guidance and training reflects the type of data being processed and any specific obligations the controller has in respect of that data. I hope that this goes some way to convincing the noble Baroness not to press her amendment.
The Minister has not really explained the reason for the switch from the DPO to the new system. Is it another one of his “We don’t want a one-size-fits-all approach” arguments? What is the underlying rationale for it? Looking at compliance costs, which the Government seem to be very keen on, we will potentially have a whole new cadre of people who will need to be trained in compliance requirements.
The data protection officer—I speak as a recovering data protection officer—is tasked with certain specific outcomes but does not necessarily have to be a senior person within the organisation. Indeed, in many cases, they can be an external adviser to the organisation. On the other hand, the senior responsible individual is a senior or board-level representative within the organisation and can take overall accountability for data privacy and data protection for that organisation. Once that accountable person is appointed, he or she can of course appoint a DPO or equivalent role or separate the role among other people as they see fit. That gives everybody the flexibility to meet the needs of privacy as they see fit, but not necessarily in a one-size-fits-all way. That is the philosophical approach.
Does the Minister accept that the SRI will have to cope with having at least a glimmering of an understanding of what will be a rather large Act?
Yes, the SRI will absolutely have to understand all the organisation’s obligations under this Act and indeed other Acts. As with any senior person in any organisation responsible for compliance, they will need to understand the laws that they are complying with.
Amendment 84, tabled by the noble Lord, Lord Clement-Jones, is about the advice given to senior responsible individuals by the ICO. We believe that the commissioner should have full discretion to enforce data protection in an independent, flexible, risk-based and proportionate manner. The amendment would tie the hands of the regulator and force them to give binding advice and proactive assurance without full knowledge of the facts, undermining their regulatory enforcement role.
The Minister has reached his 20 minutes. We nudged him at 15 minutes.
My Lords, just for clarification, because a number of questions were raised, if the Committee feels that it would like to hear more from the Minister, it can. It is for the mood of the Committee to decide.
As long as that applies to us on occasion as well.
I apologise for going over. I will try to be as quick as possible.
I turn now to the amendments on the new provisions on assessments of high-risk processing in Clause 20. Amendments 87, 88, 89, 91, 92, 93, 94, 95, 97, 98 and 101 seek to reinstate requirements in new Article 35 of the UK GDPR on data protection impact assessments, and, in some areas, make them even more onerous for public authorities. Amendment 90 seeks to reintroduce a list of high-risk processing activities drawn from new Article 35, with a view to helping data controllers comply with the new requirements on carrying out assessments of high-risk processing.
Amendment 96, tabled by the noble Baroness, Lady Kidron, seeks to amend Clause 20, so that, where an internet service is likely to be accessed by children, the processing is automatically classed as high risk and the controller must do a children’s data protection impact assessment. Of course, I fully understand why the noble Baroness would like those measures to apply automatically to organisations processing children’s data, and particularly to internet services likely to be accessed by children. It is highly likely that many of the internet services that she is most concerned about will be undertaking high-risk activities, and they would therefore need to undertake a risk assessment.
Under the current provisions in Clause 20, organisations will still have to undertake risk assessments where their processing activities are likely to pose high risks to individuals, but they should have the ability to assess the level of risk based on the specific nature, scale and context of their own processing activities. Data controllers do not need to be directed by government or Parliament about every processing activity that will likely require a risk assessment, but the amendments would reintroduce a level of prescriptiveness that we were seeking to remove.
Clause 20 requires the ICO to publish a list of examples of the types of processing activities that it considers would pose high risks for the purposes of these provisions, which will help controllers to determine whether a risk assessment is needed. This will provide organisations with more contemporary and practical help than a fixed list of examples in primary legislation could. The ICO will be required to publish a document with a list of examples that it considers to be high-risk processing activities, and we fully expect the vulnerability and age of data subjects to be a feature of that. The commissioner’s current guidance on data protection impact assessments already describes the use of the personal data of children or other vulnerable individuals for marketing purposes, profiling or offering internet services directly to children as examples of high-risk processing, although the Government cannot of course tell the ICO what to include in its new guidance.
Similarly, in relation to Amendments 99, 100 and 102 from the noble Baroness, Lady Jones, it should not be necessary for this clause to specifically require organisations to consider risks associated with automated decision-making or obligations under equalities legislation. That is because the existing clause already requires controllers to consider any risks to individuals and to describe
“how the controller proposes to mitigate those risks”.
I am being asked to wrap up and so, in the interests of time, I shall write with my remaining comments. I have no doubt that noble Lords are sick of the sound of my voice by now.
My Lords, I hope that no noble Lord expects me to pull all that together. However, I will mention a couple of things.
With this group, the Minister finally has said all the reasons why everything will be different and less. Those responsible for writing the Minister’s speeches should be more transparent about the Government’s intention, because “organisations are best placed to determine what is high-risk”—not the ICO, not Parliament, not existing data law. Organisations also act in their own interests. They are “best placed to decide on their representation”, whether it is here or there and whether it speaks English or not, and they “get to decide whether they have a DPO or a senior responsible individual”. Those are three quotes from the Minister’s speech. If organisations are in charge of the bar of data protection and the definition of data protection, I do believe that this is a weakening of the data protection regime. He also said that organisations are responsible for the quality of their risk assessment. Those are four places in this group alone.
At the beginning, the noble Baroness, Lady Harding, talked about the trust of consumers and citizens. I do not think that this engenders trust. The architecture is so keen to get rid of ways of accessing rights that some organisations may have to have a DPO and a DPIA—a doubling rather than a reducing of burden. Very early on—it feels a long time ago—a number of noble Lords talked about the granular detail. I tried in my own contribution to show how very different it is in detail. So I ask the Minister to reflect on the assertion that you can take out the detail and have the same outcome. All the burden being removed is on one side of the equation, just as we enter into a world in which AI, which is built on people’s data, is coming in the other direction.
I will of course withdraw my amendment, but I believe that Clauses 20, 18 and the other clauses we just discussed are deregulation measures. That should be made clear from the Dispatch Box, and that is a choice that the House will have to make.
Before I sit down, I do want to recognise one thing, which is that the Minister said that he would work alongside us between now and Report; I thank him for that, and I accept that. I also noted that he said that it was a responsibility to take care of children by default. I agree with him; I would like to see that in the Bill. I beg leave to withdraw my amendment.
As the noble Lord, Lord Clement-Jones, explained, his intention to oppose the question that Clause 19 stands part seeks to retain the status quo. As I read Section 62 of the Data Protection Act 2018, it obliges competent authorities to keep logs of their processing activities, whether they be for collection, alteration, consultation, disclosure, combination or the erasing of personal data. The primary purpose is for self-monitoring purposes, largely linked to disciplinary proceedings, as the noble Lord said, where an officer has become a suspect by virtue of inappropriately accessing PNC-held data.
Clause 19 removes the requirement for a competent authority to record a justification in the logs only when consulting or disclosing personal data. The Explanatory Note to the Bill explains this change as follows:
“It is … technologically challenging for systems to automatically record the justification without manual input”.
That is not a sufficiently strong reason for removing the requirement, not least because the remaining requirements of Section 62 of the Data Protection Act 2018 relating to the logs of consultation and disclosure activity will be retained and include the need to record the date and time and the identity of the person accessing the log. Presumably those details can still be manually input, so why remove the one piece of data that might, in an investigation of abuse or misuse of the system, be useful in terms of evidence and self-incrimination? I do not understand the logic behind that at all.
I rather think the noble Lord, Lord Clement-Jones, has an important point. He has linked it to those who have been unfortunate enough to be AIDS sufferers, and I am sure that there are other people who have become victims where cases would be brought forward. I am not convinced that the clause should stand part, and we support the noble Lord in seeking its deletion.
This is a mercifully short group on this occasion. I thank the noble Lord, Lord Clement-Jones, for the amendment, which seeks to remove Clause 19 from the Bill. Section 62 of the Data Protection Act requires law enforcement agencies to record when personal data has been accessed and why. Clause 19 does not remove the need for police to justify their processing; it simply removes the ineffective administrative requirement to record that justification in a log.
The justification entry was intended to help to monitor and detect unlawful access. However, the reality is that anyone accessing data unlawfully is very unlikely to record an honest justification, making this in practice an unreliable means of monitoring misconduct or unlawful processing. Records of when data was accessed and by whom can be automatically captured and will remain, thereby continuing to ensure accountability.
In addition, the National Police Chiefs’ Council’s view is that this change will not hamper any investigations to identify the unlawful processing of data. That is because it is unlikely that an individual accessing data unlawfully would enter an honest justification, so capturing this information is unlikely to be useful in any investigation into misconduct. The requirements to record the time, date and, as far as possible, the identity of the person accessing the data will remain, as will the obligation that there is lawful reason for the access, ensuring that accountability and protection for data subjects is maintained.
Police officers inform us that the current requirement places an unnecessary burden on them as they have to update the log manually. The Government estimate that the clause could save approximately 1.5 million policing hours, representing a saving in the region of £46.5 million per year.
I understand that the amendment relates to representations made by the National AIDS Trust concerning the level of protection for people’s HIV status. As I believe I said on Monday, the Government agree that the protection of people’s HIV status is vital. We have met the National AIDS Trust to discuss the best solutions to the problems it has raised. For these reasons, I hope the noble Lord will not oppose Clause 19 standing part.
I thank the Minister for his response, but he has left us tantalised about the outcome of his meeting. What is the solution that he has suggested? We are none the wiser as a result of his response.
This pudding has been well over-egged by the National Police Chiefs’ Council. Already, only certain senior officers and the data protection leads in police forces have access to this functionality. There will continue to be a legal requirement to record the time and date of access. They are required to follow a College of Policing code of practice. Is the Minister really saying that recording a justification for accessing personal data is such an onerous requirement that £46.5 million in police time will be saved as a result of this? Over what period? That sounds completely disproportionate.
The fact is that the recording of the justification, even where it is false and cannot be relied upon as evidence, is rather useful because it can itself be evidence of police misconduct in relation to inappropriately accessing personal data. They are actually saying: “We did it for this purpose”, when it clearly was not. I am not at all surprised that the National AIDS Trust is worried about this. The College of Policing code of practice does not mention logging requirements in detail. It references them just once in relation to automated systems that process data.
I am extremely grateful to the noble Lord, Lord Bassam, for what he had to say. It seems to me that we do not have any confidence on this side of the House that removing this requirement provides enough security that officers will be held to account if they share an individual’s special category data inappropriately. I do not think the Minister has really answered the concerns, but I beg leave to withdraw my objection to the clause standing part.
My Lords, I, too, will be relatively brief. I thank the noble Baroness, Lady Kidron, for her amendments, to which I was very pleased to add my name. She raised an important point about the practice of web scrapers, who take data from a variety of sources to construct large language models without the knowledge or permission of web owners and data subjects. This is a huge issue that should have been a much more central focus of the Bill. Like the noble Baroness, I am sorry that the Government did not see fit to use the Bill to bring in some controls on this increasingly prevalent practice, because that would have been a more constructive use of our time than debating the many unnecessary changes that we have been debating so far.
As the noble Baroness said, large language models are built on capturing text, data and images from infinite sources without the permission of the original creator of the material. As she also said, it is making a mockery of our existing data rights. It raises issues around copyright and intellectual property, and around personal information that is provided for one purpose and commandeered by web scrapers for another. That process often happens in the shadows, whereby the owner of the information finds out only much later that their content has been repurposed.
What is worse is that the application of AI means that material provided in good faith can be distorted or corrupted by the bots scraping the internet. The current generation of LLMs are notorious for hallucinations in which good quality research or journalistic copy is misrepresented or misquoted in its new incarnation. There are also numerous examples of bias creeping into the LLM output, which includes personal data. As the noble Baroness rightly said, the casual scraping of children’s images and data is undermining the very essence of our existing data protection legislation.
It is welcome that the Information Commissioner has intervened on this. He argued that LLMs should be compliant with the Data Protection Act and should evidence how they are complying with their legal obligations. This includes individuals being able to exercise their information rights. Currently, we are a long way from that being a reality in practice. This is about enforcement as much as giving guidance.
I am pleased that the noble Baroness tabled these amendments. They raise important issues about individuals giving prior permission for their data to be used unless there is an easily accessible opt-out mechanism. I would like to know what the Minister thinks about all this. Does he think that the current legislation is sufficient to regulate the rise of LLMs? If it is not, what are the Government doing to address the increasingly widespread concerns about the legitimacy of web scraping? Have the Government considered using the Bill to introduce additional powers to protect against the misuse of personal and creative output?
In the meantime, does the Minister accept the amendments in the name of the noble Baroness, Lady Kidron? As we have said, they are only a small part of a much bigger problem, but they are a helpful initiative to build in some basic protections in the use of personal data. This is a real challenge to the Government to step up to the mark and be seen to address these important issues. I hope the Minister will say that he is happy to work with the noble Baroness and others to take these issues forward. We would be doing a good service to data citizens around the country if we did so.
I thank the noble Baroness, Lady Kidron, for tabling these amendments. I absolutely recognise their intent. I understand that they are motivated by a concern about invisible types of processing or repurposing of data when it may not be clear to people how their data is being used or how they can exercise their rights in respect of the data.
On the specific points raised by noble Lords about intellectual property rather than personal data, I note that, in their response to the AI White Paper consultation, the Government committed soon to provide a public update on their approach to AI and intellectual property, noting the importance of greater transparency in the use of copyrighted material to train models, as well as labelling and attribution of outputs.
Amendment 103 would amend the risk-assessment provisions in Clause 20 so that any assessment of high-risk processing would always include an assessment of how the data controller would comply with the purpose limitation principle and how any new processing activity would be designed so that people could exercise their rights in respect of the data at the time it was collected and at any subsequent occasion.
I respectfully submit that this amendment is not necessary. The existing provisions in Clause 20, on risk assessments, already require controllers to assess the potential risks their processing activities pose to individuals and to describe how those risks would be mitigated. This would clearly include any risk that the proposed processing activities would not comply with the data protection principles—for example, because they lacked transparency—and would make it impossible for people to exercise their rights.
Similarly, any assessment of risk would need to take account of any risks related to difficulties in complying with the purpose limitation principle—for example, if the organisation had no way of limiting who the data would be shared with as a result of the proposed processing activity.
According to draft ICO guidance on generative AI, the legitimate interests lawful ground under Article 6(1)(f) of the UK GDPR can be a valid lawful ground for training generative AI models on web-scraped data, but only when the model’s developer can ensure that they pass the three-part test—that is, they identify a legitimate interest, demonstrate that the processing is necessary for that purpose and demonstrate that the individual’s interests do not override the interest being pursued by the controller.
Controllers must consider the balancing test particularly carefully when they do not or cannot exercise meaningful control over the use of the model. The draft guidance further notes that it would be very difficult for data controllers to carry out their processing activities in reliance on the legitimate interests lawful ground if those considerations were not taken into account.
My Lords, UK law enforcement authorities processing personal data for law enforcement purposes currently use internationally based companies for data processing services, including cloud storage. The use of international processors is critical for modern organisations and law enforcement is no exception. The use of these international processors enhances law enforcement capabilities and underpins day-to-day functions.
Transfers from a UK law enforcement authority to an international processor are currently permissible under the Data Protection Act 2018. However, there is currently no bespoke mechanism for these transfers in Part 3, which has led to confusion and ambiguity as to how law enforcement authorities should approach the use of such processors. The aim of this amendment is to provide legal certainty to law enforcement authorities in the UK, as well as transparency to the public, so that they can use internationally based processors with confidence.
I have therefore tabled Amendments 110, 117 to 120, 122 to 129 and 131 to provide a clear, bespoke mechanism in Part 3 of the Data Protection Act 2018 for UK law enforcement authorities to use when transferring data to their contracted processors based outside the UK. This will bring Part 3 into line with the UK GDPR while clarifying the current law, and give UK law enforcement authorities greater confidence when making such transfers to their contracted processors for law enforcement purposes.
We have amended Section 73—the general principles for transfer—to include a specific reference to processors, ensuring that international processors can be a recipient of data transfers. In doing so, we have ensured that the safeguards within Chapter 5 that UK law enforcement authorities routinely apply to transfers of data to their international operational equivalents are equally applicable to transfers to processors. We are keeping open all the transfer mechanisms so that data can be transferred on the basis of an applicable adequacy regulation, the appropriate safeguards or potentially the special circumstances.
We have further amended Section 75—the appropriate safeguards provision—to include a power for the ICO to create, specifically for Part 3, an international data transfer agreement, or IDTA, to complement the IDTA which it has already produced to facilitate transfers using Article 46(2)(d) of the UK GDPR.
In respect of transfers to processors, we have disapplied the duty to inform the Information Commissioner about international transfers made subject to appropriate safeguards. Such a requirement would be out of line with equivalent provisions in the UK GDPR. There is no strong rationale for complying with the provision, given that processors are limited in what they can do with data because of the nature of their contracts and that it would be unlikely to contribute to the effective functioning of the ICO.
Likewise, we have also disapplied the duty to document such transfers and to provide the documentation to the commissioner on request. This is because extending these provisions would duplicate requirements that already exist elsewhere in legislation, including in Section 61, which has extensive recording requirements that enable full accountability to the ICO.
We have also disapplied the majority of Section 78. While it provides a useful function in the context of UK law enforcement authorities transferring to their international operational equivalents, in the law enforcement to international processor context it is not appropriate because processors cannot decide to transfer data onwards of their own volition. They can only do so under instruction from the UK law enforcement authority controller.
Instead, we have retained the general prohibition on any further transfers to processors based in a separate third country by requiring UK law enforcement authority controllers to make it a condition of a transfer to its processor that data is only to be further transferred in line with the terms of the contract with or authorisation given by the controller, and where the further transfer is permitted under Section 73. We have also taken the opportunity to tidy up Section 77 which governs transfers to non-relevant authorities, relevant international organisations or international processors.
In respect of Amendment 121, tabled by the noble Lord, Lord Clement-Jones, on consultation with the Information Commissioner, I reassure the noble Lord that there is a memorandum of understanding between the Home Office and the Information Commissioner regarding international transfers approved by regulations, which sets out the role and responsibilities of the ICO. As part of this, the Home Office consults the Information Commissioner at various stages in the process. The commissioner, in turn, provides independent assurance and advice on the process followed and on the factors taken into consideration.
I understand that this amendment also relates to representations made by the National AIDS Trust. Perhaps the simplest thing is merely to reference my earlier remarks and my commitment to continue engaging with the National AIDS Trust. I beg to move that the government amendments which lead this group stand part of the Bill.
My Lords, very briefly, I thank the Minister for unpacking his amendments with some care, and for giving me the answer to my amendment before I spoke to it—that saves time.
Obviously, we all understand the importance of transfers of personal data between law enforcement authorities, but perhaps the crux of this, and the one question in our mind is, what is—perhaps the Minister could remind us—the process for making sure that the country that we are sending it to is data adequate? Amendment 121 was tabled as a way of probing that. It would be extremely useful if the Minister can answer that. This should apply to transfers between law enforcement authorities just as much as it does for other, more general transfers under Schedule 5. If the Minister can give me the answer, that would be useful, but if he does not have the answer to hand, I am very happy to suspend my curiosity until after Easter.
I thank the noble Lord, Lord Clement-Jones, for his amendment and his response, and I thank the noble Lord, Lord Bassam. The mechanism for monitoring international transfers was intended to be the subject for the next group in any case, and I would have hoped to give a full answer. I know we are all deeply disappointed that it looks as if we may not get to that group but, if the noble Lord is not willing to wait until we have that debate, I am very happy to write.
(8 months, 4 weeks ago)
Grand Committee

My Lords, I rise to speak to Amendments 11, 12, 13, 14, 15, 16, 17 and 18 and to whether Clauses 5 and 7 should stand part of the Bill. In doing so, I thank the noble Lord, Lord Clement-Jones, and the noble Baronesses, Lady Jones and Lady Kidron, for their amendments. The amendments in the group, as we have heard, relate to Clauses 5 and 7, which make some important changes to Article 6 of the UK GDPR on the lawfulness of processing.
The first amendment in the group, Amendment 11, would create a new lawful ground, under Article 6(1) of UK GDPR, to enable the use of personal data published by public bodies with a person’s consent and to enable processing by public bodies for the benefit of the wider public. The Government do not believe it would be necessary to create additional lawful grounds for processing in these circumstances. The collection and publication of information on public databases, such as the list of company directors published by Companies House, should already be permitted by existing lawful grounds under either Article 6(1)(c), in the case of a legal requirement to publish information, or Article 6(1)(e) in the case of a power.
Personal data published by public bodies can already be processed by other non-public body controllers where their legitimate interests outweigh the rights and interests of data subjects. However, they must comply with their obligations in relation to that personal data, including requirements to process personal data fairly and transparently. I am grateful to the noble Lord, Lord Clement-Jones, for setting out where he thinks the gaps are, but I hope he will accept my reassurances that this should already be possible under the existing legislation and will agree to withdraw the amendment.
Turning to Clause 5, its main objective is to introduce a new lawful ground under Article 6(1) of the UK GDPR, known as “recognised legitimate interests”. It also introduces a new annexe to the UK GDPR, in Schedule 1 to the Bill, that sets out an exhaustive list of processing activities that may be undertaken by data controllers under this new lawful ground. If an activity appears on the list, processing may take place without a person’s consent and without balancing the controller’s interests against the rights and interests of the individual: the so-called legitimate interests balancing test.
The activities in the annexe are all of a public interest nature, for example, processing of data where necessary to prevent crime, safeguarding national security, protecting children, responding to emergencies or promoting democratic engagement. They also include situations where a public body requests a non-public body to share personal data with it to help deliver a public task sanctioned by law.
The clause was introduced as a result of stakeholders’ concerns raised in response to the public consultation Data: A New Direction in 2021. Some informed us that they were worried about the legal consequences of getting the balancing test in Article 6(1)(f) wrong. Others said that undertaking the balancing test can lead to delays in some important processing activities taking place.
As noble Lords will be aware, many data controllers have important roles in supporting activities that have a public interest nature. It is vital that data is shared without delay where necessary in areas such as safeguarding, prevention of crime and responding to emergencies. Of course, controllers who share data while relying on this new lawful ground would still have to comply with wider requirements of data protection legislation where relevant, such as data protection principles which ensure that the data is used fairly, lawfully and transparently, and is collected and used for specific purposes.
In addition to creating a new lawful ground of recognised legitimate interests, Clause 5 also clarifies the types of processing activities that may be permitted under the existing legitimate interests lawful ground under Article 6(1)(f) of the UK GDPR. Even if a processing activity does not appear on the new list of recognised legitimate interests, data controllers may still have grounds for processing people’s data without consent if their interests in processing the data are not outweighed by the rights and freedoms that people have in relation to privacy. Clause 5(9) and (10) make it clear that this might be the case in relation to many common commercial activities, such as intragroup transfers.
My Lords, may I just revisit that with the Minister? I fear that he is going to move on to another subject. The Delegated Powers Committee said that it thought that the Government had not provided strong enough reasons for needing this power. The public interest list being proposed, which the Minister outlined, is quite broad, so it is hard to imagine the Government wanting something not already listed. I therefore return to what the committee said. Normally, noble Lords like to listen to recommendations from such committees. There is no strong reason for needing that extra power, so, to push back a little on the Minister, why, specifically, is it felt necessary? If it were a public safety interest, or one of the other examples he gave, it seems to me that that would come under the existing list of public interests.
Indeed. Needless to say, we take the recommendations of the DPRRC very seriously, as they deserve. However, because this is an exhaustive list, and because the technologies and practices around data are likely to evolve very rapidly in ways we are unable currently to predict, it is important to retain as a safety measure the ability to update that list. That is the position the Government are coming from. We will obviously continue to consider the DPRRC’s recommendations, but that has to come with a certain amount of adaptiveness as we go. Any addition to the list would of course be subject to parliamentary debate, via the affirmative resolution procedure, as well as the safeguards listed in the provision itself.
Clause 50 ensures that the ICO and any other interested persons should be consulted before making regulations.
Amendments 15, 16, 17 and 18 would amend the part of Clause 5 that is concerned with the types of activities that might be carried out under the current legitimate interest lawful ground, under Article 6(1)(f). Amendment 15 would prevent direct marketing organisations relying on the legitimate interest lawful ground under Article 6(1)(f) if the personal data being processed related to children. However, the age and general vulnerability of data subjects are already important factors for direct marketing organisations when considering whether the processing is justified. The ICO already provides specific guidance for controllers carrying out this balancing test in relation to children’s data. The fact that a data subject is a child, and the age of the child in question, will still be relevant factors to take into account in this process. For these reasons, the Government consider this amendment unnecessary.
My Lords, am I to take it from that that none of the changes currently in the Bill will expose children on a routine basis to direct marketing?
As is the case today and will be going forward, direct marketing organisations will be required to perform the balancing test; and as in the ICO guidance today and, no doubt, going forward—
I am sorry if I am a little confused—I may well be—but the balancing test that is no longer going to be there allows a certain level of processing, which was the subject of the first amendment. The suggestion now is that children will be protected by a balancing test. I would love to know where that balancing test exists.
The balancing test remains there for legitimate interests, under Article 6(1)(f).
Amendment 16 seeks to prevent organisations that undertake third-party marketing relying on the legitimate interest lawful ground under Article 6(1)(f) of the UK GDPR. As I have set out, organisations can rely on that ground for processing personal data without consent when they are satisfied that they have a legitimate interest to do so and that their commercial interests are not outweighed by the rights and interests of data subjects.
Clause 5(4) inserts in Article 6 new paragraph (9), which provides some illustrative examples of activities that may constitute legitimate interests, including direct marketing activities, but it does not mean that they will necessarily be able to process personal data for that purpose. Organisations will need to assess on a case-by-case basis where the balance of interest lies. If the impact on the individual’s privacy is too great, they will not be able to rely on the legitimate interest lawful ground. I should emphasise that this is not a new concept created by this Bill. Indeed, the provisions inserted by Clause 5(4) are drawn directly from the recitals to the UK GDPR, as incorporated from the EU GDPR.
I recognise that direct marketing can be a sensitive—indeed, disagreeable—issue for some, but direct marketing information can be very important for businesses as well as individuals and can be dealt with in a way that respects people’s privacy. The provisions in this Bill do not change the fact that direct marketing activities must be compliant with the data protection and privacy legislation and continue to respect the data subject’s absolute right to opt out of receiving direct marketing communications.
Amendment 17 would make sure that the processing of employee data for “internal administrative purposes” is subject to heightened safeguards, particularly when it relates to health. I understand that this amendment relates to representations made by the National AIDS Trust concerning the level of protection afforded to employees’ health data. We agree that the protection of people’s HIV status is vital and that it is right that it is subject to extra protection, as is the case for all health data and special category data. We have committed to further engagement and to working with the National AIDS Trust to explore solutions in order to prevent data breaches of people’s HIV status, which we feel is best achieved through non-legislative means given the continued high data protection standards afforded by our existing legislation. As such, I hope that the noble Lord, Lord Clement-Jones, will agree not to press this amendment.
Amendment 18 seeks to allow businesses more confidently to rely on the existing legitimate interest lawful ground for the transmission of personal data within a group of businesses affiliated by contract for internal administrative purposes. In Clause 5, the list of activities in proposed new paragraphs (9) and (10) is intended to be illustrative of the types of activities that may be legitimate interests for the purposes of Article 6(1)(f). They are focused on processing activities that are currently listed in the recitals to the EU GDPR but are simply examples. Many other processing activities may be legitimate interests for the purposes of Article 6(1)(f) of the UK GDPR. It is possible that the transmission of personal data for internal administrative purposes within a group affiliated by contract may constitute a legitimate interest, as may many other commercial activities. It would be for the controller to determine this on a case-by-case basis after carrying out a balancing test to assess the impact on the individual.
Finally, I turn to the clause stand part debate that seeks to remove Clause 7 from the Bill. I am grateful to the noble Lord, Lord Clement-Jones, for this amendment because it allows me to explain why this clause is important to the success of the UK-US data access agreement. As noble Lords will know, that agreement helps the law enforcement agencies in both countries tackle crime. Under the UK GDPR, data controllers can process personal data without consent on public interest grounds if the basis for the processing is set out in domestic law. Clause 7 makes it clear that the processing of personal data can also be carried out on public interest grounds if the basis for the processing is set out in a relevant international treaty such as the UK-US data access agreement.
The agreement permits telecommunications operators in the UK to disclose data about serious crimes with law enforcement agencies in the US, and vice versa. The DAA has been operational since October 2022 and disclosures made by UK organisations under it are already lawful under the UK GDPR. Recent ICO guidance confirms this, but the Government want to remove any doubt in the minds of UK data controllers that disclosures under the DAA are permitted by the UK GDPR. Clause 7 makes it absolutely clear to telecoms operators in the UK that disclosures under the DAA can be made in reliance on the UK GDPR’s public tasks processing grounds; the clause therefore contributes to the continued, effective functioning of the agreement and to keeping the public in both the UK and the US safe.
For these reasons, I hope that the noble Lord, Lord Clement-Jones, will agree to withdraw his amendment.
My first reaction is “Phew”, my Lords. We are all having to keep to time limits now. The Minister did an admirable job within his limit.
I wholeheartedly support what the noble Baronesses, Lady Kidron and Lady Harding, said about Amendments 13 and 15 and what the noble Baroness, Lady Jones, said about her Amendment 12. I do not believe that we have yet got to the bottom of children’s data protection; there is still quite some way to go. It would be really helpful if the Minister could bring together the elements of children’s data about which he is trying to reassure us and write to us saying exactly what needs to be done, particularly in terms of direct marketing directed towards children. That is a real concern.
My Lords, it is a pleasure to follow the noble Baronesses, Lady Harding and Lady Bennett, after the excellent introduction to the amendments in this group by the noble Baroness, Lady Jones. The noble Baroness, Lady Harding, used the word “trust”, and this is another example of a potential hidden agenda in the Bill. Again, it is destructive of any public trust in the way their data is curated. This is a particularly egregious example, without, fundamentally, any explanation. Sir John Whittingdale said that a future Government
“may want to encourage democratic engagement in the run up to an election by temporarily ‘switching off’ some of the direct marketing rules”.—[Official Report, Commons, 29/11/2023; col. 885.]
Nothing to see here—all very innocuous; but, as we know, in the past the ICO has been concerned about even the current rules on the use of data by political parties. It seems to me that, without being too Pollyannaish about this, we should be setting an example in the way we use the public’s data for campaigning. The ICO, understandably, is quoted as saying during the public consultation on the Bill that this is
“an area in which there are significant potential risks to people if any future policy is not implemented very carefully”.
That seems an understatement, but that is how regulators talk. It is entirely right to be concerned about these provisions.
Of course, they are hugely problematic, but they are particularly problematic given that it is envisaged that young people aged 14 and older should be able to be targeted by political parties when they cannot even vote, as we have heard. This would appear to contravene one of the basic principles of data protection law: that you should not process more personal data than you need for your purposes. If an individual cannot vote, it is hard to see how targeting them with material relating to an election is a proportionate interference with their privacy rights, particularly when they are a child. The question is, should we be soliciting support from 14 to 17 year-olds during elections when they do not have votes? Why do the rules need changing so that people can be targeted online without having consented? One of the consequences of these changes would be to allow a Government to switch off—the words used by Sir John Whittingdale—direct marketing rules in the run-up to an election, allowing candidates and parties to rely on “soft” opt-in to process data and make other changes without scrutiny.
Exactly as the noble Baroness, Lady Jones, said, respondents to the original consultation on the Bill wanted political communications to be covered by existing rules on direct marketing. Responses were very mixed on the soft opt-in, and there were worries that people might be encouraged to part with more of their personal data. More broadly, why are the Government changing the rules on democratic engagement if they say they will not use these powers? What assessment have they made of the impact of the use of the powers? Why are the powers not being overseen by the Electoral Commission? If anybody is going to have the power to introduce the ability to market directly to voters, it should be the Electoral Commission.
All this smacks of taking advantage of financial asymmetry. We talked about competition asymmetry with big tech when we debated the digital markets Bill; similarly, this seems a rather sneaky way of taking advantage of the financial resources one party might have versus others. It would allow it to do things other parties cannot, because it has granted itself permission to do that. The provisions should not be in the hands of any Secretary of State or governing party; if anything, they should be in entirely independent hands; but, even then, they are undesirable.
My Lords, I thank the noble Baroness, Lady Jones, for tabling her amendments. Amendment 19 would remove processing which is necessary for the purposes of democratic engagement from the list of recognised legitimate interests. It is essential in a healthy democracy that registered political parties, elected representatives and permitted participants in referendums can engage freely with the electorate without being impeded unnecessarily by data protection legislation.
The provisions in the Bill will mean that these individuals and organisations do not have to carry out legitimate interest assessments or look for a separate legal basis. They will, however, still need to comply with other requirements of data protection legislation, such as the data protection principles and the requirement for processing to be necessary.
On the question posed by the noble Baroness about the term “democratic engagement”, it is intended to cover a wide range of political activities inside and outside election periods. These include but are not limited to democratic representation; communicating with electors and interested parties; surveying and opinion gathering; campaigning activities; activities to increase voter turnout; supporting the work of elected representatives, prospective candidates and official candidates; and fundraising to support any of these activities. This is reflected in the drafting, which incorporates these concepts in the definition of democratic engagement and democratic engagement activities.
The ICO already has guidance on the use of personal data by political parties for campaigning purposes, which the Government anticipate it will update to reflect the changes in the Bill. We will of course work with the ICO to make sure that it is familiar with our plans for commencement and that commencement does not benefit any party over another.
On the point made about the appropriate age for the provisions, in some parts of the UK the voting age is 16 for some elections, and children can join the electoral register as attainers at 14. An attainer is someone who is registered to vote in advance of being able to do so, so that they are on the electoral roll as soon as they turn the required age. The age of 14 therefore reflects the variations in voting age across the nation; in Scotland, for example, a person can register to vote at 14 as an attainer. Children aged 14 and over are often politically engaged and are approaching voting age. The Government consider it important that political parties and elected representatives can engage freely with this age group—
I am interested in what the Minister says about the age of attainers. Surely it would be possible to remove attainers from those who could be subject to direct marketing. Given how young attainers could be, it would protect them from the unwarranted attentions of campaigning parties and so on. I do not see that as a great difficulty.
Indeed. It is certainly worth looking at, but I remind noble Lords that such communications have to be necessary, and the test of their being necessary for someone of that age is obviously more stringent.
But what is the test of necessity at that age?
The processor has to determine whether it is necessary to the desired democratic engagement outcome to communicate with someone at that age. But I take the point: for the vast majority of democratic engagement communications, 14 would be far too young to make that a worthwhile or necessary activity.
As I recall, the ages are on the electoral register.
I am not aware one way or the other, but I will happily look into that to see what further safeguards we can add so that we are not bombarding people who are too young with this material.
May I make a suggestion to my noble friend the Minister? It might be worth asking the legal people to get the right wording, but if there are different ages at which people can vote in different parts of the United Kingdom, surely it would be easier just to relate it to the age at which they are able to vote in those elections. That would address a lot of the concerns that many noble Lords are expressing here today.
I agree with the noble Baroness, but with one rider. We will keep coming back to the need for children to have a higher level of data protection than adults, and this is but one of many examples we will debate. However, I agree with her underlying point. The reason why I support removing both these clauses is the hubris of believing that you will engage the electorate by bombarding them with things they did not ask to receive.
A fair number of points were made there. I will look at ages under 16 and see what further steps, in addition to being necessary and proportionate, we can think about to provide some reassurance. Guidance would need to be in effect before any of this is acted on by any of the political parties. I and my fellow Ministers will continue to work with the ICO—
I am sorry to press the Minister, but does the Bill state that guidance will be in place before this comes into effect?
I am not sure whether it is written in the Bill. I will check, but the Bill would not function without the existence of the guidance.
I am sorry to drag this out but, on the guidance, can we be assured that the Minister will involve the Electoral Commission? It has a great deal of experience here; in fact, it has opined in the past on votes for younger cohorts of the population. It seems highly relevant to seek out its experience and the benefits of that.
I would of course be very happy to continue to engage with the Electoral Commission.
We will continue to work with the ICO to make sure that it is familiar with the plans for commencement and that its plans for guidance fit into that. In parts of the UK where the voting age is 18 and the age of attainment is 16, it would be more difficult for candidates and parties to show that it was necessary or proportionate to process the personal data of 14 and 15 year-olds in reliance on the new lawful ground. In this context, creating an arbitrary distinction between children at or approaching voting age and adults may not be appropriate; in particular, many teenagers approaching voting age may be more politically engaged than some adults. These measures will give parties and candidates a clear lawful ground for engaging them in the process. Accepting this amendment would remove the benefits of greater ease of identification of a lawful ground for processing by elected representatives, candidates and registered political parties, which is designed to improve engagement with the electorate. I therefore hope that the noble Baroness, Lady Jones, will withdraw her amendment.
I now come to the clause stand part notice that would remove Clause 114, which gives the Secretary of State a power to make exceptions to the direct marketing rules for communications sent for the purposes of democratic engagement. As Clause 115 defines terms for the purposes of Clause 114, the noble Baroness, Lady Jones, is also seeking for that clause to be removed. Under the current law, many of the rules applying to electronic communications sent for commercial marketing apply to messages sent by registered political parties, elected representatives and others for the purposes of democratic engagement. It is conceivable that, after considering the risks and benefits, a future Government might want to treat communications sent for the purposes of democratic engagement differently from commercial marketing. For example, in areas where voter turnout is particularly low or there is a need to increase engagement with the electoral process, a future Government might decide that the direct marketing rules should be modified. This clause stand part notice would remove that option.
We have incorporated several safeguards that must be met prior to regulations being laid under this clause. They include the Secretary of State having specific regard to the effect the exceptions could have on an individual’s privacy; a requirement to consult the Information Commissioner and other interested parties, as the Secretary of State considers appropriate; and the regulations being subject to parliamentary approval via the affirmative procedure.
For these reasons, I hope that the noble Baroness will agree to withdraw or not press her amendments.
My Lords, I am pleased that I have sparked such a lively debate. When I tabled these amendments, it was only me and the noble Lord, Lord Clement-Jones, so I thought, “This could be a bit sad, really”, but it has not been. Actually, it has been an excellent debate and we have identified some really good issues.
As a number of noble Lords said, the expression “democratic engagement” is weasel words: what is not to like about democratic engagement? We all like it. Only when you drill down into the proposals do you realise the traps that could befall us. As noble Lords and the noble Baroness, Lady Bennett, rightly said, we have to see this in the context of some of the other moves the Government are pursuing in trying to skew the electoral rules in their favour. I am not convinced that this is as saintly as the Government are trying to pretend.
The noble Baroness, Lady Harding, is absolutely right: this is about trust. It is about us setting an example. Of all the things we can do on data protection that we have control over, we could at least show the electorate how things could be done, so that they realise that we, as politicians, understand how precious their data is and that we do not want to misuse it.
I hope we have all knocked on doors, and I must say that I have never had a problem engaging with the electorate, and actually they have never had a problem engaging with us. This is not filling a gap that anybody has identified. We are all out there and finding ways of communicating that, by and large, I would say the electorate finds perfectly acceptable. People talk to us, and they get the briefings through the door. That is what they expect an election campaign to be about. They do not expect, as the noble Baroness, Lady Harding, said, to go to see their MP about one thing and then suddenly find that they are being sent information about something completely different or that assumptions are being made about them which were never the intention when they gave the information in the first place. I just feel that there is something slightly seedy about all this. I am sorry that the Minister did not pick up a little more on our concerns about all this.
There are some practical things that I think it was helpful for us to have talked about, such as the Electoral Commission. I do not think that it has been involved up to now. I would like to know in more detail what its views are on all this. It is also important that we come back to the Information Commissioner and check in more detail what his view is on all this. It would be nice to have guidance, but I do not think that that will be enough to satisfy us in terms of how we proceed with these amendments.
The Minister ultimately has not explained why this has been introduced at this late stage. He is talking about this as though conceivably, in the future, a Government might want to adopt these rules. If that is the case, I respectfully say that we should come back at that time with a proper set of proposals that go right through the democratic process that we have here in Parliament, scrutinise it properly and make a decision then, rather than being bounced into something at a very late stage.
I have to say that I am deeply unhappy at what the Minister has said. I will obviously look at Hansard, but I may well want to return to this.
My Lords, I rise to speak to a series of minor and technical, yet necessary, government amendments which, overall, improve the functionality of the Bill. I hope the Committee will be content if I address them together. Amendments 20, 42, 61 and 63 are minor technical amendments to references to special category data in Clauses 6 and 14. All are intended to clarify that references to special category data mean references to the scope of Article 9(1) of the UK GDPR. They are simply designed to improve the clarity of the drafting.
I turn now to the series of amendments that clarify how time periods within the data protection legal framework are calculated. For the record, these are Amendments 136, 139, 141, 149, 151, 152, 176, 198, 206 to 208, 212 to 214, 216, 217, 253 and 285. Noble Lords will be aware that the data protection legislation sets a number of time periods or deadlines for certain things to happen, such as responding to subject access requests; in other words, at what day, minute or hour the clock starts and stops ticking in relation to a particular procedure. The Data Protection Act 2018 expressly applies the EU-derived rules on how these time periods should be calculated, except in a few instances where it is more appropriate for the UK domestic approach to apply, for example time periods related to parliamentary procedures. I shall refer to these EU-derived rules as the time periods regulation.
In response to the Retained EU Law (Revocation and Reform) Act 2023, we are making it clear that the time periods regulation continues to apply to the UK GDPR and other regulations that form part of the UK’s data protection and privacy framework, for example, the Privacy and Electronic Communications (EC Directive) Regulations 2003. By making such express provision, our aim is to ensure consistency and continuity and to provide certainty for organisations, individuals and the regulator. We have also made some minor changes to existing clauses in the Bill to ensure that application of the time periods regulation achieves the correct effect.
Secondly, Amendment 197 clarifies that the requirement to consult before making regulations that introduce smart data schemes may be satisfied by a consultation before the Bill comes into force. The regulations must also be subject to affirmative parliamentary scrutiny to allow Members of both Houses to scrutinise legislation. This will facilitate the rapid implementation of smart data schemes, so that consumers and businesses can start benefiting as soon as possible. The Government are committed to working closely with business and wider stakeholders in the development of smart data.
Furthermore, Clause 96(3) protects data holders from the levy that may be imposed to meet the expenses of persons and bodies performing functions under smart data regulations. This levy cannot be imposed on data holders that do not appear capable of being directly affected by the exercise of those functions.
Amendment 196 extends that protection to authorised persons and third-party recipients on whom the levy may also be imposed. Customers will not have to pay to access their data, only for the innovative services offered by third parties. We expect that smart data schemes will deliver significant time and cost savings for customers.
The Government are committed to balancing the incentives for businesses to innovate and provide smart data services with ensuring that all customers are empowered through their data use and do not face undue financial barriers or digital exclusion. Any regulations providing for payment of the levy or fees will be subject to consultation and to the affirmative resolution procedure in Parliament.
Amendments 283 and 285 to Schedule 15 confer a general incidental power on the information commission. It will have the implied power to do things incidental to or consequential upon the exercise of its functions, for example, to hold land and enter into agreements. This amendment makes those implicit powers explicit for the avoidance of doubt and in line with standard practice. It does not give the commission substantive new powers. I beg to move.
My Lords, I know that these amendments were said to be technical amendments, so I thought I would just accept them, but when I saw the wording of Amendment 283 some alarm bells started ringing. It says:
“The Commission may do anything it thinks appropriate for the purposes of, or in connection with, its functions”.
I know that the Minister said that this is stating what the commission is already able to do, but I am concerned whenever I see those words anywhere. They give a blank cheque to any authority or organisation.
Many noble Lords will know that I have previously spoken about the principal-agent theory in politics, in which certain powers are delegated to an agency or regulator, but what accountability does it have? I worry when I see that it “may do anything … appropriate” to fulfil its tasks. I would like some assurance from the Minister that there is a limit to what the information commission can do and some accountability. At a time when many of us are asking who regulates the regulators and when we are looking at some of the arm’s-length bodies—need I mention the Post Office?—there is some real concern about accountability.
I understand the reason for wanting to clarify or formalise what the Minister believes the information commission is doing already, but I worry about this form of words. I would like some reassurance that it is not wide-ranging and that there is some limit and accountability to future Governments. I have seen this sentiment across the House; people are asking who regulates the regulators and to whom are they accountable.
My Lords, I have been through this large group and, apart from my natural suspicion that there might be something dastardly hidden away in it, I am broadly content, but I have a few questions.
On Amendment 20, can the Minister confirm that the new words “further processing” have the same meaning as the reuse of personal data? Can he confirm that Article 5(1)(b) will prohibit this further processing when it is not in line with the original purpose for which the data was collected? How will the data subject know that is the case?
On Amendment 196, to my untutored eye it looks like the regulation-making power is being extended away from the data holder to include authorised persons and third-party recipients. My questions are simple enough: was this an oversight on the part of the original drafters of that clause? Is the amendment an extension of those captured by the effect of the clause? Is it designed to achieve consistency across the Bill? Finally, can I assume that an authorised person or third party would usually be someone acting on behalf of an agent of the data holder?
I presume that Amendments 198, 212 and 213 are needed because of a glitch in the drafting—similarly with Amendment 206. I can see that Amendments 208, 216 and 217 clarify when time periods begin, but why are the Government seeking to disapply time periods in Amendment 253 when surely some consistency is required?
Finally—I am sure the Minister will be happy about this—I am all in favour of flexibility, but Amendment 283 states that the Information Commissioner has the power to do things to facilitate the exercise of his functions. The noble Lord, Lord Kamall, picked up on this. We need to understand what those limits are. On the face of it, one might say that the amendment is sensible, but it seems rather general and broad in its application. As the noble Lord, Lord Kamall, rightly said, we need to see what the limits of accountability are. This is one of those occasions.
I thank the noble Lords, Lord Kamall and Lord Bassam, for their engagement with this group. On the questions from the noble Lord, Lord Kamall, these are powers that the ICO would already have in common law. As I am given to understand, it is now best practice with all Bills to put such implied powers on a statutory footing, and that is what the Bill does here. It does not confer substantive new powers but clarifies the powers that the regulator already has. I can also confirm that the ICO was and remains accountable to Parliament.
I am sorry to intervene as I know that noble Lords want to move on to other groups, but the Minister said that the ICO remains accountable to Parliament. Will he clarify how it is accountable to Parliament for the record?
The Information Commissioner is directly accountable to Parliament in that he makes regular appearances in front of Select Committees that scrutinise the regulator’s work, including progress against objectives.
The noble Lord, Lord Bassam, made multiple important and interesting points. I hope he will forgive me if I undertake to write to him about those; there is quite a range of topics to cover. If there are any on which he requires answers right away, he is welcome to intervene.
I want to be helpful to the Minister. I appreciate that these questions are probably irritating but I carefully read through the amendments and aligned them with the Explanatory Notes. I just wanted some clarification to make sure that we are clear on exactly what the Government are trying to do. “Minor and technical” covers a multitude of sins; I know that from my own time as a Minister.
Indeed. I will make absolutely sure that we provide a full answer. By the way, I sincerely thank the noble Lord for taking the time to go through what is perhaps not the most rewarding of reads but is useful none the less.
On the question of the ICO being responsible to Parliament, in the then Online Safety Bill and the digital markets Bill we consistently asked for regulators to be directly responsible to Parliament. If that is something the Government believe they are, we would like to see an expression of it.
I would be happy to provide such an expression. I will be astonished if that is not the subject of a later group of amendments. I have not yet prepared for that group, I am afraid, but yes, that is the intention.
My Lords, it is a pleasure to follow the noble Lord, Lord Sikka. He raised even more questions about Clause 9 than I ever dreamed of. He has illustrated the real issues behind the clause and why it is so important to debate its standing part, because, in our view, it should certainly be removed from the Bill. It would seriously limit people’s ability to access information about how their personal data is collected and used. We are back to the dilution of data subject rights, within which the rights of data subject access are, of course, vital. This includes limiting access to information about automated decision-making processes to which people are subject.
A data subject is someone who can be identified directly or indirectly by personal data, such as a name, an ID number, location data, or information relating to their physical, economic, cultural or social identity. Under existing law, data subjects have a right to request confirmation of whether their personal data is being processed by a controller, to access that personal data and to obtain information about how it is being processed. The noble Lord, Lord Sikka, pointed out that there is ample precedent for how the controller can refuse a request from a data subject only if it is manifestly unfounded or excessive. The meaning of that phrase is well established.
There are three main ways in which Clause 9 limits people’s ability to access information about how their personal data is being collected and used. First, it would lower the threshold for refusing a request from “manifestly unfounded or excessive” to “vexatious or excessive”. This is an inappropriately low threshold, given the nature of a data subject access request—namely, a request by an individual for their own data.
Secondly, Clause 9 would insert a new mandatory list of considerations for deciding whether the request is vexatious or excessive. This includes vague considerations, such as
“the relationship between the person making the request (the ‘sender’) and the person receiving it (the ‘recipient’)”.
The very fact that the recipient holds data relating to the sender means that there is already some form of relationship between them.
Thirdly, the weakening of an individual’s right to obtain information about how their data is being collected, used or shared is particularly troubling given the simultaneous effect of the provisions in Clause 10, which means that data subjects are less likely to be informed about how their data is being used for additional purposes other than those for which it was originally collected, in cases where the additional purposes are for scientific or historical research, archiving in the public interest or statistical purposes. Together, the two clauses mean that an individual is less likely to be proactively told how their data is being used, while it is harder to access information about their data when requested.
In the Public Bill Committee in the House of Commons, the Minister, Sir John Whittingdale, claimed that:
“The new parameters are not intended to be reasons for refusal”,
but rather to give
“greater clarity than there has previously been”.—[Official Report, Commons, Data Protection and Digital Information Bill Committee, 16/5/23; cols. 113-14.]
But it was pointed out by Dr Jeni Tennison of Connected by Data in her oral evidence to the committee that the impact assessment for the Bill indicates that a significant proportion of the savings predicted would come from lighter burdens on organisations dealing with subject access requests as a result of this clause. This suggests that, while the Government claim that this clause is a clarification, it is intended to weaken obligations on controllers and, correspondingly, the rights of data subjects. Is that where the Secretary of State’s £10 billion of benefit from this Bill comes from? On these grounds alone, Clause 9 should be removed from the Bill.
We also oppose the question that Clause 12 stand part of the Bill. Clause 12 provides that, in responding to subject access requests, controllers are required only to undertake a
“reasonable and proportionate search for the personal data and other information”.
This clause also appears designed to weaken the right of subject access and will lead to confusion for organisations about what constitutes a reasonable and proportionate search in a particular circumstance. The right of subject access is central to individuals’ fundamental rights and freedoms, because it is a gateway to exercising other rights, either within the data subject rights regime or in relation to other legal rights, such as the rights to equality and non-discrimination. Again, the lowering of rights compared with the EU creates obvious risks, and this is a continuing theme of data adequacy.
Clause 12 does not provide a definition for reasonable and proportionate searches, but when introducing the amendment, Sir John Whittingdale suggested that a search for information may become unreasonable or disproportionate
“when the information is of low importance or of low relevance to the data subject”.—[Official Report, Commons, 29/11/23; col. 873.]
Those considerations diverge from those provided in the Information Commissioner’s guidance on the right of access, which states that, when determining whether searches may be unreasonable or disproportionate, the data controller must consider the circumstances of the request, any difficulties involved in finding the information and the fundamental nature of the right of access.
We also continue to be concerned about the impact assessment for the Bill and the Government’s claims that the new provisions in relation to subject access requests are for clarification only. Again, Clause 12 appears to have the same impact as Clause 9 in the kinds of savings that the Government seem to imagine will emerge from the lowering of subject access rights. This is a clear dilution of subject access rights, and this clause should also be removed from the Bill.
We always allow for belt and braces and, if our urging does not lead to the Minister agreeing to remove Clauses 9 and 12, at the very least we should have the new provisions set out either in Amendment 26, in the name of the noble Baroness, Lady Jones of Whitchurch, or in Amendment 25, which proposes that a data controller who refuses a subject access request must give reasons for their refusal and tell the subject about their right to seek a remedy. That is absolutely the bare minimum, but I would far prefer to see the deletion of Clauses 9 and 12 from the Bill.
As ever, I thank noble Lords for raising and speaking to these amendments. I start with the stand part notices on Clauses 9 and 36, introduced by the noble Lord, Lord Clement-Jones. Clauses 9 and 36 clarify the new threshold to refuse or charge a reasonable fee for a request that is “vexatious or excessive”. Clause 36 also clarifies that the Information Commissioner may charge a fee for dealing with, or refuse to deal with, a vexatious or excessive request made by any persons and not just data subjects, providing necessary certainty.
I apologise for intervening, but the Minister referred to resources. By that, he means the resources for the controller but, as I said earlier, there is no consideration of what the social cost may be. If this Bill had already become law, how would the victims of the Post Office scandal have been able to secure any information? Under this Bill, the threshold for providing information will be much lower than it is under the current legislation. Can the Minister say something about how the controllers will take social cost into account or how the Government have taken that into account?
First, on the point made by the noble Lord, Lord Bassam—this is not to be argumentative; I am sure that there is much discussion to be had—the intention is absolutely not to lower the standard for a well-intended request.
Sadly, a number of requests are made that are not well intended but are cynical and designed to disrupt. I can give a few examples. Some requests are deliberately made with minimal time between them. Some are made to circumvent the process of legal disclosure in a trial. Some are made for other reasons designed to disrupt an organisation. The intent of using “vexatious” is not in any way to reduce well-founded, or even partially well-founded, attempts to secure information; it is to filter out these less desirable, more cynical attempts.
But the two terms have a different legal meaning, surely.
The actual application of the terms will be set out in guidance by the ICO but the intention is to filter out the more disruptive and cynical ones. Designing these words is never an easy thing but there has been considerable consultation on this in order to achieve that intention.
My Lords—sorry; it may be that the Minister was just about to answer my question. I will let him do so.
I will have to go back to the impact assessment but I would be astonished if that was a significant part of the savings promised. By the way, the £10.6 billion—or whatever it is—in savings was given a green rating by the body that assesses these things; its name eludes me. It is a robust calculation. I will check and write to the noble Lord, but I do not believe that a significant part of that calculation leans on the difference between “vexatious” and “manifestly unfounded”.
It would be very useful to have the Minister respond on that but, of course, as far as the impact assessment is concerned, a lot of this depends on the Government’s own estimates of what this Bill will produce—some of which are somewhat optimistic.
The noble Baroness, Lady Jones, has given me an idea: if an impact assessment has been made, clause by clause, it would be extremely interesting to know just where the Government believe the golden goose is.
I am not quite sure what is being requested because the impact assessment has been not only made but published.
I see—so noble Lords would like an analysis of the different components of the impact assessment. It has been green-rated by the independent Regulatory Policy Committee. I have just been informed by the Box that the savings from these reforms to the wording of SARs are valued at less than 1% of the benefit of more than £10 billion that this Bill will bring.
That begs the question of where on earth the rest is coming from.
Which I will be delighted to answer. With this interesting exchange, I have lost in my mind the specific questions that the noble Lord, Lord Sikka, asked but I am coming on to some of his other ones; if I do not give satisfactory answers, no doubt he will intervene and ask again.
I appreciate the further comments made by the noble Lord, Lord Sikka, about the Freedom of Information Act. I hope he will be relieved to know that this Bill does nothing to amend that Act. On his accounting questions, he will be aware that most SARs are made by private individuals to private companies. The Government are therefore not involved in that process and do not collect the kind of information that he described.
Following the DPDI Bill, the Government will work with the ICO to update guidance on subject access requests. Guidance plays an important role in clarifying what a controller should consider when relying on the new “vexatious or excessive” provision. The Government are also exploring whether a code of practice on subject access requests can best address the needs of controllers and data subjects.
On whether Clause 12 should stand part of the Bill, Clause 12 is only putting on a statutory footing what has already been established—
My apologies. The Minister just said that the Government do not collect the data. Therefore, what is the basis for changing the threshold? No data, no reasonable case.
The Government do not collect details of private interactions between those raising SARs and the companies they raise them with. The business case is based on extensive consultation—
I hope that the Government have some data about government departments and the public bodies over which they have influence. Can he provide us with a glimpse of how many requests are received, how many are rejected at the outset, how many go to the commissioners, what the cost is and how the cost is computed? At the moment, it sounds like the Government want to lower the threshold without any justification.
As I say, I do not accept that the threshold is being lowered. On the other hand, I will undertake to find out what information can be reasonably provided. Again, as I said, the independent Regulatory Policy Committee gave the business case a green rating; that is a high standard and gives credibility to the business case calculations, which I will share.
The reforms keep reasonable requests free of charge and instead seek to ensure that controllers can refuse or charge a reasonable fee for requests that are “vexatious or excessive”, which can consume a significant amount of time and resources. However, the scope of the current provision is unclear and, as I said, there are a variety of circumstances where controllers would benefit from being able confidently to refuse or charge the fee.
The Minister used the phrase “reasonable fee”. Can he provide some clues on that, especially for the people who may request information? We have around 17.8 million individuals living on less than £12,570. So, from what perspective is the fee reasonable and how is it determined?
“Reasonable” would be set out in the guidance to be created by the ICO but it would need to reflect the costs and affordability. The right of access remains of paramount importance in the data protection framework.
Lastly, as I said before on EU data adequacy, the Government maintain an ongoing dialogue with the EU and believe that our reforms are compatible with maintaining our data adequacy decisions.
For the reasons I have set out, I am not able to accept these amendments. I hope that noble Lords will therefore agree to withdraw or not press them.
My Lords, I can also be relatively brief. I thank all noble Lords who have spoken and the noble Baroness, Lady Harding, and the noble Lord, Lord Clement-Jones, for their amendments, to many of which I have added my name.
At the heart of this debate is what constitutes a disproportionate or impossibility exemption for providing data to individuals when the data is not collected directly from data subjects. Amendments 29 to 33 provide further clarity on how exemptions on the grounds of disproportionate effort should be interpreted—for example, by taking into account whether there would be a limited impact on individuals, whether they would be caused any distress, what the exemptions were in the first place and whether the information had been made publicly available by a public body. All these provide some helpful context, which I hope the Minister will take on board.
I have also added my name to Amendments 27 and 28 from the noble Baroness, Lady Harding. They address the particular concerns about those using the open electoral register for direct marketing purposes. As the noble Baroness explained, the need for this amendment arises from the legal ruling that companies using the OER must first notify individuals at their postal addresses whenever their data is being used. As has been said, given that individuals already have an opt-out when they register on the electoral roll, it would seem unnecessary and impractical for companies using the register to follow up with individuals each time they want to access their data. These amendments seek to close that loophole and return the arrangements back to the previous incarnation, which seemed to work well.
All the amendments provide useful forms of words but, as the noble Baroness, Lady Harding, said, if the wording is not quite right, we hope that the Minister will help us to craft something that is right and that solves the problem. I hope that he agrees that there is a useful job of work to be done on this and that he provides some guidance on how to go about it.
I thank my noble friend Lady Harding for moving this important amendment. I also thank the cosignatories—the noble Lords, Lord Clement-Jones and Lord Black, and the noble Baroness, Lady Jones. As per my noble friend’s request, I acknowledge the importance of this measure and the difficulty of judging it quite right. It is a difficult balance and I will do my best to provide some reassurance, but I welcomed hearing the wise words of all those who spoke.
I turn first to the clarifying Amendments 27 and 32. I reassure my noble friend Lady Harding that, in my view, neither is necessary. Clause 11 amends the drafting of the list of cases when the exemption under Article 14(5) applies but the list closes with “or”, which makes it clear that you need to meet only one of the criteria listed in paragraph (5) to be exempt from the transparency requirements.
I turn now to Amendments 28 to 34, which collectively aim to expand the grounds of disproportionate effort to exempt controllers from providing certain information to individuals. The Government support the use of public data sources, such as the OER, which may be helpful for innovation and may have economic benefits. Sometimes, providing this information is simply not possible or is disproportionate. Existing exemptions apply when the data subject already has the information or in cases where personal data has been obtained from someone other than the data subject and it would be impossible to provide the information or disproportionate effort would be required to do so.
We must strike the right balance between supporting the use of these datasets and ensuring transparency for data subjects. We also want to be careful about protecting the integrity of the electoral register, open or closed, to ensure that it is used within the data subject’s reasonable expectations. The exemptions that apply when the data subject already has the information or when there would be a disproportionate effort in providing the information must be assessed on a case-by-case basis, particularly if personal data from public registers is to be combined with other sources of personal data to build a profile for direct marketing.
These amendments may infringe on transparency—a key principle in the data protection framework. The right to receive information about what is happening to your data is important for exercising other rights, such as the right to object. This could be seen as going beyond what individuals might expect to happen to their data.
The Government are not currently convinced that these amendments would be sufficient to prevent negative consequences to data subject rights and confidence in the open electoral register and other public registers, given the combination of data from various sources to build a profile—that was the subject of the tribunal case being referenced. Furthermore, the Government’s view is that there is no need to amend Article 14(6) explicitly to include the “reasonable expectation of the data subjects” as the drafting already includes reference to “appropriate safeguards”. This, in conjunction with the fairness principle, means that data controllers are already required to take this into account when applying the disproportionate effort exemption.
The above notwithstanding, the Government understand that the ICO may explore this question as part of its work on guidance in the future. That seems a better way of addressing this issue in the first instance, ensuring the right balance between the use of the open electoral register and the rights of data subjects. We will continue to work closely with the relevant stakeholders involved and monitor the situation.
I wonder whether I heard my noble friend correctly. He said “may”, “could” and “not currently convinced” several times, but, for the companies concerned, there is a very real, near and present deadline. How is my noble friend the Minister suggesting that deadline should be considered?
On the first point, I used the words carefully because the Government cannot instruct the ICO specifically on how to act in any of these cases. The question about the May deadline is important. With the best will in the world, none of the provisions in the Bill are likely to be in effect by the time of that deadline in any case. That being the case, I would feel slightly uneasy about advising the ICO on how to act.
My Lords, I am not quite getting from the Minister whether he has an understanding of and sympathy with the case that is being made or whether he is standing on ceremony on its legalities. Is he saying, “No, we think that would be going too far”, or that there is a good case and that guidance or some action by the ICO would be more appropriate? I do not get the feeling that somebody has made a decision about the policy on this. It may be that conversations with the Minister between Committee and Report would be useful, and it may be early days yet until he hears the arguments made in Committee; I do not know, but it would be useful to get an indication from him.
Yes. I repeat that I very much recognise the seriousness of the case. There is a balance to be drawn here. In my view, the best way to identify the most appropriate balancing point is to continue to work closely with the ICO, because I strongly suspect that, at least at this stage, it may be very difficult to draw a legislative dividing line that balances the conflicting needs. That said, I am happy to continue to engage with noble Lords on this really important issue between Committee and Report, and I commit to doing so.
On the question of whether Clause 11 should stand part of the Bill, Clause 11 extends the existing disproportionate effort exemption to cases where the controller collected the personal data directly from the data subject and intends to carry out further processing for research purposes, subject to the research safeguards outlined in Clause 26. This exemption is important to ensure that life-saving research can continue unimpeded.
Research holds a privileged position in the data protection framework because, by its nature, it is viewed as generally being in the public interest. The framework has various exemptions in place to facilitate and encourage research in the UK. During the consultation, we were informed of various longitudinal studies, such as those into degenerative neurological conditions, where it is impossible or nearly impossible to recontact data subjects. To ensure that this vital research can continue unimpeded, Clause 11 provides a limited exemption that applies only to researchers who are complying with the safeguards set out in Clause 26.
The noble Lord, Lord Clement-Jones, raised concerns that Clause 11 would allow unfair processing. I assure him that this is not the case, as any processing that uses the disproportionate effort exemption in Article 13 must comply with the overarching data protection principles, including lawfulness, fairness and transparency, so that even if data controllers rely on this exemption they should consider other ways to make the processing they undertake as fair and transparent as possible.
Finally, returning to EU data adequacy, the Government recognise its importance and, as I said earlier, are confident that the proposals in Clause 11 are complemented by robust safeguards, which reinforces our view that they are compatible with EU adequacy. For the reasons that I have set out, I am unable to accept these amendments, and I hope that noble Lords will not press them.
My Lords, I am not quite sure that I understand where my noble friend the Minister is on this issue. The noble Lord, Lord Clement-Jones, summed it up well in his recent intervention. I will try to take at face value my noble friend’s assurances that he is happy to continue to engage with us on these issues, but I worry that he sees this as two sides of an issue—I hear from him that there may be some issues and there could be some problems—whereas we on all sides of the Committee have set out a clear black and white problem. I do not think they are the same thing.
I appreciate that the wording might create some unintended consequences, but I have not really understood what my noble friend’s real concerns are, so we will need to come back to this on Report. If anything, this debate has made it even clearer to me that it is worth pushing for clarity on this. I look forward to ongoing discussions with a cross-section of noble Lords, my noble friend and the ICO to see if we can find a way through to resolve the very real issues that we have identified today. With that, and with thanks to all who have spoken in this debate, I beg leave to withdraw my amendment.
As ever, I thank the noble Baroness, Lady Jones, and the noble Lord, Lord Clement-Jones, for their detailed consideration of Clause 14, and all other noble Lords who spoke so well. I carefully note the references to the DWP’s measure on fraud and error. For now, I reassure noble Lords that a human will always be involved in all decision-making relating to that measure, but I note that this Committee will have a further debate specifically on that measure later.
The Government recognise the importance of solely automated decision-making to the UK’s future success and productivity. These reforms ensure that it can be responsibly implemented, while any such decisions with legal or similarly significant effects have the appropriate safeguards in place, including the right to request a human review of the decision. These reforms clarify and simplify the rules related to solely automated decision-making without watering down any of the protections for data subjects or the fundamental data protection principles. In doing so, they will provide confidence to organisations looking to use these technologies in a responsible way while driving economic growth and innovation.
The Government also recognise that AI presents huge opportunities for the public sector. It is important that AI is used responsibly and transparently in the public sector; we are already taking steps to build trust and transparency. Following a successful pilot, we are making the Algorithmic Transparency Reporting Standard—the ATRS—a requirement for all government departments, with plans to expand this across the broader public sector over time. This will ensure that there is a standardised way for government departments proactively to publish information about how and why they are using algorithms in their decision-making. In addition, the Central Digital and Data Office—the CDDO—has already published guidance on the procurement and use of generative AI for the UK Government and, later this year, DSIT will launch the AI management essentials scheme, setting a minimum good practice standard for companies selling AI products and services.
My Lords, could I just interrupt the Minister? It may be that he can get an answer from the Box to my question. One intriguing aspect is that, as the Minister said, the pledge is to bring the Algorithmic Transparency Reporting Standard into each government department and there will be an obligation to use that standard. However, what compliance mechanism will there be to ensure that that is happening? Does the accountable Permanent Secretary have a duty to make sure that that is embedded in the department? Who has the responsibility for that?
That is a fair question. I must confess that I do not know the answer. There will be mechanisms in place, department by department, I imagine, but one would also need to report on it across government. Either it will magically appear in my answer or I will write to the Committee.
The CDDO has already published guidance on the procurement and use of generative AI for the Government. We will consult on introducing this as a mandatory requirement for public sector procurement, using purchasing power to drive responsible innovation in the broader economy.
I turn to the amendments in relation to meaningful involvement. I will first take together Amendments 36 and 37, which aim to clarify that the safeguards mentioned under Clause 14 are applicable to profiling operations. New Article 22A(2) already clearly sets out that, in cases where profiling activity has formed part of the decision-making process, controllers have to consider the extent to which a decision about an individual has been taken by means of profiling when establishing whether human involvement has been meaningful. Clause 14 makes clear that a solely automated significant decision is one without meaningful human involvement and that, in these cases, controllers are required to provide the safeguards in new Article 22C. As such, we do not believe that these amendments are necessary; I therefore ask the noble Baroness, Lady Jones, not to press them.
Turning to Amendment 38, the Government are confident that the existing reference to “data subject” already captures the intent of this amendment. The existing definition of “personal data” makes it clear that a data subject is a person who can be identified, directly or indirectly. As such, we do not believe that this amendment is necessary; I ask the noble Lord, Lord Clement-Jones, whether he would be willing not to press it.
Amendments 38A and 40 seek to clarify that, for human involvement to be considered meaningful, the review must be carried out by a competent person. We feel that these amendments are unnecessary as meaningful human involvement may vary depending on the use case and context. The reformed clause already introduces a power for the Secretary of State to provide legal clarity on what is or is not to be taken as meaningful human involvement. This power is subject to the affirmative procedure in Parliament and allows the provision to be future-proofed in the wake of technological advances. As such, I ask the noble Baronesses, Lady Jones and Lady Bennett, not to press their amendments.
I am not sure I agree with that characterisation. The ATRS is a relatively new development. It needs time to bed in and needs to be bedded in on an agile basis in order to ensure not only quality but speed of implementation. That said, I ask the noble Lord to withdraw his amendment.
The Minister has taken us through what Clause 14 does and rebutted the need for anything other than “solely”. He has gone through the sensitive data and the special category data aspects, and so on, but is he reiterating his view that this clause is purely for clarification; or is he saying that it allows greater use of automated decision-making, in particular in public services, so that greater efficiencies can be found and therefore it is freeing up the public sector at the expense of the rights of the individual? Where does he sit in all this?
As I said, the intent of the Government is: yes to more automated data processing to take advantage of emerging technologies, but also yes to maintaining appropriate safeguards. The safeguards in the present system consist—if I may characterise it in a slightly blunt way—of providing quite a lot of uncertainty, so that people do not take the decision to positively embrace the technology in a safe way. By bringing in this clarity, we will see an increase not only in the safety of their applications but in their use, driving up productivity in both the public and private sectors.
My Lords, I said at the outset that I thought this was the beginning of a particular debate, and I was right, looking at the amendments coming along. The theme of the debate was touched on by the noble Baroness, Lady Bennett, when she talked about these amendments, in essence, being about keeping humans in the loop and the need for them to be able to review decisions. Support for that came from the noble Baroness, Lady Kidron, who made some important points. The point the BMA made about risking eroding trust cut to what we have been talking about all afternoon: trust in these processes.
The noble Lord, Lord Clement-Jones, talked about this effectively being the watering down of Article 22A, and the need for some core ethical principles in AI use and for the Government to ensure a right to human review. Clause 14 reverses the presumption of that human reviewing process, other than where solely automated decision-making exists, where it will be more widely allowed, as the Minister argued.
However, I am not satisfied by the responses, and I do not think other Members of your Lordships’ Committee will be either. We need more safeguards. We have moved from one clear position to another, which can be described as watering down or shifting the goalposts; I do not mind which, but that is how it seems to me. Of course, we accept that there are huge opportunities for AI in the delivery of public services, particularly in healthcare and the operation of the welfare system, but we need to ensure that citizens in this country have a higher level of protection than the Bill currently affords them.
At one point I thought the Minister said that a solely automated decision was a rubber-stamped decision. To me, that gave the game away. I will have to read carefully what he said in Hansard, but that is how it sounded, and it really gets our alarm bells ringing. I am happy to withdraw my amendment, but we will come back to this subject from time to time and throughout our debates on the rest of the Bill.
My Lords, I will speak to my Amendment 48. By some quirk of fate, I failed to sign up to the amendments that the noble Lord, Lord Bassam, so cogently introduced. I would have signed up if I had realised that I had not, so to speak.
It is a pleasure to follow the noble Baroness, Lady Kidron. She has a track record of being extremely persuasive, so I hope the Minister pays heed in what happens between Committee and Report. I very much hope that there will be some room for manoeuvre and that there is not just permanent push-back, with the Minister saying that everything is about clarifying and us saying that everything is about dilution. There comes a point when we have to find some accommodation on some of these areas.
Amendments 48 and 49 are very similar—I was going to say, “Great minds think alike”, but I am not sure that my brain feels like much of a great mind at the moment. “Partly” or “predominantly” rather than “solely”, if you look at it the other way round, is really the crux of what I think many of us are concerned about. It is easy to avoid the terms of Article 22 just by slipping in some sort of token human involvement. Defining “meaningful” is so difficult in these circumstances. I am concerned that we are opening the door to something that could be avoided. Even then, the terms of the new clause—we will have a clause stand part debate on Wednesday, obviously—put all the onus on the data subject, whereas that was not the case previously under Article 22. The Minister has not really explained why that change has been made.
I conclude by saying that I very much support Amendment 41. This whole suite of amendments is well drafted. The point about the Equality Act is extremely well made. The noble Lord, Lord Holmes, also has a very good amendment here. It seems to me that involving the ICO right in the middle of this will be absolutely crucial—and we are back to public trust again. If nothing else, I would like explicitly to include that under Clause 14 in relation to Article 22 by the time this Bill goes through.
I thank noble Lords and the noble Baroness for their further detailed consideration of Clause 14.
Let me take first the amendments that deal with restrictions on and safeguards for ADM and degree of ADM. Amendment 41 aims to make clear that solely automated decisions that contravene any part of the Equality Act 2010 are prohibited. We feel that this amendment is unnecessary for two reasons. First, this is already the case under the Equality Act, which is reinforced by the lawfulness principle under the present data protection framework, meaning that controllers are already required to adhere to the Equality Act 2010. Secondly, explicitly stating in the legislation that contravening one type of legislation is prohibited—in this case, the Equality Act 2010—and not referring to other legislation that is also prohibited will lead to an inconsistent approach. As such, we do not believe that this amendment is necessary; I ask the noble Baroness, Lady Jones, to withdraw it.
Amendment 44 seeks to limit the conditions for special category data processing for this type of automated decision-making. Again, we feel that this is not needed given that a set of conditions already provides enhanced levels of protection for the processing of special category data, as set out in Article 9 of the UK GDPR. In order to lawfully process special category data, you must identify both a lawful basis under Article 6 of the UK GDPR and a separate condition for processing under Article 9. Furthermore, where an organisation seeks to process special category data under solely automated decision-making on the basis that it is necessary for contract, in addition to the Articles 6 and 9 lawful bases, they would also have to demonstrate that the processing was necessary for substantial public interest.
Similarly, Amendment 45 seeks to apply safeguards when processing special category data; however, these are not needed as the safeguards in new Article 22C already apply to all forms of processing, including the processing of special category data, by providing sufficient safeguards for data subjects’ rights, freedoms and legitimate interests. As such, we do not believe that these amendments are necessary; I ask the noble Baroness, Lady Jones, not to press them.
Can the Minister give me an indication of the level at which that kicks in? For example, say there is a child in a classroom and a decision has been made about their ability in a particular subject. Is it automatic that the parent and the child get some sort of read-out on that? I would be curious to know where the Government feel that possibility starts.
In that example, where a child was subject to a solely ADM decision, the school would be required to inform the child of the decision and the reasons behind it. The child and their parent would have the right to seek a human review of the decision.
We may come on to this when we get to edtech but a lot of those decisions are happening automatically right now, without any kind of review. I am curious as to why it is on the school whereas the person actually doing the processing may well be a technology company.
It may be either the controller or the processor but for any legal or similarly significant decision right now—today—there is a requirement before the Bill comes into effect. That requirement is retained by the Bill.
In line with ICO guidance, children need particular protection when organisations collect and process their personal data because they may be less aware of the risks involved. If organisations process children’s personal data they should think about the need to protect them from the outset and should design their systems and processes with this in mind. This is the case for organisations processing children’s data during solely automated decision-making, just as it is for all processing of children’s data.
Building on this, the Government’s view is that automated decision-making has an important role to play in protecting children online, for example with online content moderation. The current provisions in the Bill will help online service providers understand how they can use these technologies and strike the right balance between enabling the best use of automated decision-making technology while continuing to protect the rights of data subjects, including children. As such, we do not believe that the amendment is necessary; I ask the noble Baroness if she would be willing not to press it.
Amendments 48 and 49 seek to extend the Article 22 provisions to “predominantly” and “partly” automated decision-making. These types of processing already involve meaningful human involvement. In such instances, other data protection requirements, including transparency and fairness, continue to apply and offer relevant protections. As such, we do not believe that these amendments are necessary; I ask the noble Baroness, Lady Jones, and the noble Lord, Lord Clement-Jones, if they would be willing not to press them.
Amendment 50 seeks to ensure that the Article 22C safeguards will apply alongside, rather than instead of, the transparency obligations in the UK GDPR. I assure the noble Baroness, Lady Jones, that the general transparency obligations in Articles 12 to 15 will continue to apply and thus will operate alongside the safeguards in the reformed Article 22. As such, we do not believe that this amendment is necessary; I ask the noble Baroness if she would be willing not to press it.
The changes proposed by Amendment 52A are unnecessary as Clause 50 already provides for an overarching requirement for the Secretary of State to consult the ICO and other persons that the Secretary of State considers appropriate before making regulations under the UK GDPR, including for the measures within Article 22. Also, any changes to the regulations are subject to the affirmative procedure so must be approved by both Houses of Parliament. As with other provisions of the Bill, the ICO will seek to provide organisations with timely guidance and support to assist them in interpreting and applying the legislation. As such, we do not believe that this amendment is necessary and, if he were here, I would ask my noble friend Lord Holmes if he would be willing not to press it.
Amendments 98A and 104A are related to workplace rights. Existing data protection legislation and our proposed reforms provide sufficient safeguards for automated decision-making where personal data is being processed, including in workplaces. The UK’s human rights law, and existing employment and equality laws, also ensure that employees are informed and consulted about any workplace developments, which means that surveillance of employees is regulated. As such, we do not believe that these amendments are necessary and I ask the noble Baroness not to move them.
I hear what the Minister said about the workplace algorithmic assessment. However, if the Government believe it is right to have something like an algorithmic recording standard in the public sector, why is it not appropriate to have something equivalent in the private sector?
I would not say it is not right, but if we want to make the ATRS a standard, we should make it a standard in the public sector first and then allow it to be adopted as a means for all private organisations using ADM and AI to meet the transparency principles that they are required to adopt.
So would the Minister not be averse to it? It is merely so that the public sector is ahead of the game, allowing it to show the way and then there may be a little bit of regulation for the private sector.
I am not philosophically averse to such regulation. As to implementing it in the immediate future, however, I have my doubts about that possibility.
My Lords, this has been an interesting and challenging session. I hope that we have given the Minister and his team plenty to think about—I am sure we have. A lot of questions remain unanswered, and although the Committee Room is not full this afternoon, I am sure that colleagues reading the debate will be studying the responses that we have received very carefully.
I am grateful to the noble Baroness, Lady Kidron, for her persuasive support. I am also grateful to the noble Lord, Lord Clement-Jones, for his support for our amendments. It is a shame the noble Lord, Lord Holmes, was not here this afternoon, but I am sure we will hear persuasively from him on his amendment later in Committee.
The Minister is to be congratulated for his consistency. I think I heard the phrase “not needed” or “not necessary” pretty constantly this afternoon, but particularly with this group of amendments. He probably topped the lot with his response on the Equality Act on Amendment 41.
I want to go away with my colleagues to study the responses to the amendments very carefully. That being said, however, I am happy to withdraw Amendment 41 at this stage.