Satellites: Adverse Effects on Astronomy

Lord Clement-Jones Excerpts
Wednesday 20th November 2024

Lords Chamber
Lord Vallance of Balham (Lab)

The cost of launch has come down by something like 95%. The UK remains committed to getting a launch and remains committed to the space strategy as laid out.

Lord Clement-Jones (LD)

My Lords, in that National Space Strategy, the previous Government focused on encouraging low Earth orbit satellites, which are increasingly contributing to the loss of dark skies, as we have heard. Will this Government focus on incentives for the development of higher-orbit satellites, such as geostationary satellites, particularly the micro versions, of which far fewer are needed? They offer the best cost economics, compared to LEO systems, and have a lower impact on the night sky.

Lord Vallance of Balham (Lab)

The noble Lord makes an extremely important point about the size of satellites, which is one of the problems with the interference from both radio and optical imaging. The smaller satellites, which the UK is extremely good at making, will become an increasing part of the solution. On orbit, we have a commitment to low orbit through the OneWeb approach—where there are about 700 in low orbit—and to higher orbit where it is appropriate to do so.

Specialised Research Units: Closures

Monday 28th October 2024

Lords Chamber
Lord Vallance of Balham (Lab)

The noble Lord knows that I know that unit extremely well. It is a very important unit globally and it was given an award of £30 million recently. The new model will allow for a longer period of funding—seven years plus seven years’ funding, so a total of 14 years—with a different process of evaluation, which is a lighter-touch, less bureaucratic process. There is no reason why there cannot be a similar number of trainees going through the new system.

Lord Clement-Jones (LD)

My Lords, I declare an interest as chair of a university governing council. To some extent the Minister’s responses are reassuring, but is this part of a wider trend towards centralising decisions on research funding through UKRI? Are we moving towards a situation where the Government will fund research only within particular sectors set out in their industrial strategy? If that is the case, will that not stifle new research talent and innovation?

Lord Vallance of Balham (Lab)

As the noble Lord may be aware, I have been very clear about the need for supporting basic curiosity-driven, investigator-led research, and I will remain resolute in that determination. Some of these new centres have specified areas, such as mental health and multi-morbidity, but there is a whole round which is unspecified, allowing for people to put forward ideas of their own for units of the future, which I believe will be important for the very reason the noble Lord says.

King’s Speech (4th Day)

Monday 22nd July 2024

Lords Chamber
Lord Clement-Jones (LD)

My Lords, I refer to my interests in the register. I join in congratulating all the new Government Ministers and Whips on their appointments. As the DSIT spokesperson on these Benches, I give a particularly warm welcome to the noble Lord, Lord Vallance of Balham, and his excellent maiden speech. While he was the Government’s Chief Scientific Adviser, he was pivotal in setting up the Vaccine Taskforce and in organising the overall strategy for the UK’s development and distribution of Covid-19 vaccines, and we should all be eternally grateful for that.

I warmly welcome the noble Baroness, Lady Jones of Whitchurch, to her role. We have worked well together outside and then inside this House, and I very much want to constructively engage with both Ministers on the Government’s science and technology agenda. I also thank the noble Viscount, Lord Camrose, for his engagement when in the department, and for his courtesy and good humour throughout.

I welcome the Government’s agenda for growth through innovation, their mission to enhance public services through the deployment of new technology and DSIT’s central role in that, opening up what can be a blocked pipeline all the way from R&D to commercialisation—from university spin-out through start-up to scale-up and IPO. Crowding in and de-risking private investment through the national wealth fund, the British Business Bank and post-Mansion House pension reforms is crucial. Digital skills and digital literacy are also crucial but, to deploy digital tools successfully, we need a pipeline of creative critical thinking and collaboration skills as well.

In this context, I very much welcome the new Government’s tone on the value of universities, long-term financial settlements and resetting relations with Europe. I hope this means that we shall soon see whether spending plans for government R&D expenditure by 2030 and 2035 match their words. Disproportionately high overseas researcher visa costs must be lowered, as the noble Lord, Lord Vallance, knows.

But support for innovation should not be unconditional or at any cost, and I hope this Government will not fall into the trap of viewing regulation as necessarily the enemy of innovation. I therefore hope that the reference to AI legislation, but the failure to announce a Bill, is a mere timing issue. Perhaps we can hear later what the Government’s intention is in this respect. Before then, we are promised a product safety and metrology Bill, which could require alignment of AI-driven products with the EU AI Act. This seems to be putting the cart well in front of the regulatory horse.

We need to ensure that high-risk systems are mandated to adopt international ethical and safety standards. We all need to establish very clearly that generative AI systems need licences to ingest copyright material for training purposes, just as Mumsnet and the New York Times are asserting, and that there is an obligation of transparency in the use of datasets and original content. The Government in particular should lead the way in ensuring that there is a high level of transparency and opportunity for redress when algorithmic and automated systems are used in the public sector, and I commend my forthcoming Private Member’s Bill to them.

As regards the Bills in the King’s Speech, I look forward to seeing the details, but the digital information and smart data Bill seems to be heading in the right direction in the areas covered. I hope that other than a few clarifications, especially in research and on the constitution of the Information Commissioner’s Office, we are not going to exhume some of the worst areas of the old DPDI Bill, and that we have ditched the idea of a Brexit-EU divergence dividend by the watering down of so many data subjects’ rights. Will the Government give a firm commitment to safeguard our data adequacy with the EU? I hope that they will confirm that the intent of the reinstated digital verification provisions is not to have some form of compulsory national digital ID, but the creation of a genuine market in digital ID providers that give a choice to the citizen. I hope also that, in the course of that Bill, Ministers will meet LinesearchbeforeUdig and provide us all with much greater clarity around the proposals for the national underground asset register.

As for the cyber security and resilience Bill, events of recent days have demonstrated the need for cybersecurity, but have also made it clear that we are not just talking about threats from bad actors. There needs to be a rethink on critical national infrastructure such as cloud services and software, which are now essential public utilities.

Finally, I hope that we will see a long-awaited amendment of the Computer Misuse Act to include a statutory public defence, as called for by CyberUp, which was recommended by the Vallance report, as I recall. I very much hope that there will be no more Horizon scandals. I look forward to the Minister’s reply.

Lord Faulks (Non-Afl)

Many of your Lordships will be familiar with the arguments we have had on the Bill. The important point to stress is that there has been a general welcome of this legislation. I would also like to stress that a measure of cross-party co-operation was the hallmark of the scrutiny of the Bill during its passage through your Lordships’ House. Ministers and officials have given their time generously in meetings and have responded promptly and helpfully to the issues that scrutiny has thrown up.

At the heart of the Bill is the regulation of the internet in a way that should prevent market abuse, in particular by big tech. Helpful though the Government have been, they have not provided answers to some important questions, hence the amendments passed on Report. These have been sent back to us by the House of Commons without the Government—save in one respect—making concessions.

One of the areas that gave noble Lords particular concern is the inclusion of amendments in the House of Commons at a late stage, following lobbying of the Government by big tech. A prospective intervention by the regulator is unlikely to be welcomed by big tech companies and, given their enormous legal budgets, will inevitably be challenged. The change of wording from “appropriate” to “proportionate” will make such challenges easier. A reversion to the Bill’s original wording will help to restore balance, and it is hoped that the amendments in my name and those in the name of the noble Baroness, Lady Jones, on appeals against interventions, will achieve that. Our amendments on Motion C are intended to prevent a seepage of arguments on penalty, which involves a merits test, into the judicial review test, which applies to the intervention itself.

Why have the Government made this late change of “appropriate” to “proportionate”? They have been rather coy about this. There has been some waffle—I am afraid I must describe it as such—about increased clarity and the need for a regulator to act in a proportionate manner. That is quite so but, on further probing, the reasoning was revealed: it is intended to reflect the level of challenge derived from jurisprudence from the European Court of Human Rights and the CJEU, where human rights issues are engaged. I remain bewildered as to why big tech has human rights. This is not what the framers of the convention had in mind.

But if—and it is a big “if”—a convention right is engaged, proportionality is the test, or at least part of it. This is a much lower bar than the normal judicial review test. If the Bill remains unamended, this lower bar will apply to challenges whether or not a convention right is engaged. This is good news for big tech and its lawyers, but not for the Bill and its primary purpose.

I ask the Minister this specific question: if the convention right is engaged, proportionality comes into the analysis anyway, but what if a court were to decide that A1P1—the relevant “human right”—was not engaged? With the Bill unamended, proportionality would apply to a non-convention case, greatly to the advantage of big tech. Is my understanding correct?

It seems that big tech has got its way and that litigation wars can commence—a great pity, most specifically for the smaller players and for the ostensible rationale behind the legislation.

On Motion C1, the test for appeals on penalty is to be a merits-based one, rather than the higher bar that a judicial review standard would, or should, involve. The amendments before your Lordships’ House are intended to prevent seepage from one test to another. His Majesty’s Government say that the courts are well used, in different contexts, to applying different tests as part of an analysis. This is true—in theory. My concern is that if I were advising Meta or Google about an intervention and a consequent hefty fine—this is not an advertisement—it is inevitable that I would advise in favour of appealing both aspects of the intervention: against conviction and sentence, as it were.

It is relatively easy to insulate arguments in criminal cases. One question is, was the conviction unsafe? Another is, was the sentence too long? In the emerging world of internet regulation, however, it is likely to be far more difficult in practice. The question of whether an intervention was disproportionate—disproportionate to what?—will inevitably be closely allied to that of whether the penalty was excessive or disproportionate: another win for big tech, and a successful piece of lobbying on its part.

I look forward to words of reassurance from the Minister. In the meantime, I beg to move.

Lord Clement-Jones (LD)

My Lords, I will speak to Motion B1 and briefly in support of other motions in this group.

Last December, at Second Reading, I said that we on these Benches want to see the Bill and the new competition and consumer powers make a real difference, but that they can do so only with some key changes. On Third Reading, I pointed out that we were already seeing big tech take an aggressive approach to the EU’s Digital Markets Act, and we therefore believed that the Bill needed to be more robust and that it was essential to retain the four key competition amendments passed on Report. That remains our position, and I echo the words of the noble Lord, Lord Faulks: that the degree of cross-party agreement has been quite exemplary.

As we heard on Report, noble Lords made four crucial amendments to Part 1 of the digital markets Bill: first, an amendment whereby, when the Competition and Markets Authority seeks approval of its guidance, the Secretary of State is required within 40 days to approve the guidance or to refuse to approve it and refer it back to the CMA; secondly, an amendment reverting the countervailing benefits exemption to the version originally in the Bill, which included the “indispensable” standard; thirdly, amendments reverting the requirement for the CMA’s conduct requirement and pro-competitive interventions to be “proportionate” back to “appropriate”; and fourthly, amendments reverting the appeals standard to judicial review for penalties.

We welcome the fact that the Government have proposed, through Motion D, Amendment 38A in lieu, which effectively achieves the same aims, ensuring that the approval of the CMA guidance by the Secretary of State does not unduly hold up the operationalisation of the new regime. However, the Government’s Motions A, B and C disagree with the other Lords amendments.

--- Later in debate ---
Viscount Camrose (Con)

My Lords, I thank all noble Lords who have contributed to the debate today and, of course, throughout the development of this legislation. It has been a characteristically brilliant debate; I want to thank all noble Lords for their various and valuable views.

I turn first to the Motions tabled by the noble Lord, Lord Faulks, in relation to appeals and proportionality. I thank him for his continued engagement and constructive debate on these issues. We of course expect the CMA to behave in a proportionate manner at all times as it operates the digital market regime. However, today we are considering specifically the statutory requirement for proportionality in the Bill. We are making it clear that the DMU must design conduct requirements and PCIs to place as little burden as possible on firms, while still effectively addressing competition issues. The proposed amendments would not remove the reference to proportionality in Clause 21 and so, we feel, do not achieve their intended aim, but I shall set out the Government’s position on why proportionality is required.

On the question of the wording of “appropriate” versus “proportionate”, proportionality is a well-understood and precedented concept with a long history of case law. “Appropriate” would be a more subjective threshold, giving the CMA broader discretion. The Government’s position is that proportionality is the right threshold to be met in legislation due to the fact that it applies, in the vast majority of cases, because of ECHR considerations. It is the Government’s view that the same requirement for proportionality should apply whether or not ECHR rights are engaged.

As Article 1 of Protocol 1—A1P1—of the European Convention on Human Rights will apply to the vast majority of conduct requirements and PCIs imposed by the CMA, with the result that the courts will apply a proportionality requirement, we consider it important that it should be explicit that there is a statutory proportionality requirement for all conduct requirements and PCIs. We believe that proportionality should be considered beyond just those cases where A1P1 may apply, in particular when a conduct requirement or PCI would impact future contracts of an SMS firm.

The courts’ approach to proportionality in relation to consideration of ECHR rights has been set out by the Supreme Court, and we do not expect them to take a different approach here. Furthermore, the CAT will accord respect to the expert judgments of the regulator and will not seek to overturn its judgments lightly. I hope this answers the question put by the noble Lord, Lord Faulks.

On appeals, I thank noble Lords for their engagement on this matter, and in particular the noble Baroness, Lady Jones of Whitchurch, for setting out the rationale for her Amendments 32B and 32C, which seek to provide further clarity about where on the merits appeals apply. I want to be clear that the Government’s intention is that only penalty decisions will be appealable on the merits and that this should not extend to earlier decisions about whether an infringement occurred. I do not consider these amendments necessary, for the following reasons.

The Bill draws a clear distinction between penalty decisions and those about infringements, with these being covered by separate Clauses 89 and 103. There is a Court of Appeal precedent in BCL v BASF 2009 that, in considering a similar competition framework, draws a clear distinction between infringement decisions and penalty decisions. The Government consider that the CAT and the higher courts will have no difficulty in making this distinction for digital markets appeals to give effect to the legislation as drafted.

I now turn to the Motion tabled by the noble Lord, Lord Clement-Jones, in respect of the countervailing benefits exemption. I thank the noble Lord for his engagement with me and the Bill team on this important topic. The noble Lord has asked for clarification that the “indispensability” standard in Section 9 of the Competition Act 1998, and the wording,

“those benefits could not be realised without the conduct”,

are equivalent to each other. I want to be clear that the exemption within this regime and the exemption in Section 9 of the Competition Act 1998 are different. This is because they operate in wholly different contexts, with different criteria and processes. This would be the case however the exemption is worded in this Bill. That is why the Explanatory Notes refer to a “similar” exemption, because saying it is “equivalent” would be technically incorrect.

Having said that, the “indispensability” standard and the threshold of the Government’s wording,

“those benefits could not be realised without the conduct”,

are equally high. While the exemptions themselves are different, I hope I can reassure noble Lords that the Government’s view is that the standard—the height of the threshold—is, indeed, equivalent. The Government still believe that the clarity provided by simplifying the language provides greater certainty to all businesses, while ensuring that consumers get the best outcomes.

I thank the noble Lord, Lord Clement-Jones, for his question in relation to the Google privacy sandbox case. The CMA considers a range of consumer benefits under its existing consumer objective. This can include the privacy of consumers. It worked closely with the ICO to assess data privacy concerns in its Google privacy sandbox investigation and we expect it would take a similar approach under this regime.

I urge all noble Lords to consider carefully the Motions put forward by the Government and hope all Members will feel able—

Lord Clement-Jones (LD)

Will the Government update the Explanatory Notes?

Viscount Camrose (Con)

Indeed. In principle I am very happy to update the Explanatory Notes, but I need to engage with ministerial colleagues. However, I see no reason why that would not be possible.

Meanwhile, I hope all noble Lords will feel able to support the Government’s position.

--- Later in debate ---
Viscount Camrose (Con)

My Lords, I have already spoken to Motion B. I beg to move.

Motion B1 (as an amendment to Motion B)

Lord Clement-Jones

Tabled by

Leave out from “House” to end and insert “do not insist on its Amendment 12, to which the Commons have disagreed for their Reason 13A, and do insist on its Amendment 13.”

Lord Clement-Jones (LD)

My Lords, if this is not a non-parliamentary expression, I will say that the Minister has come within a gnat’s whisker of where we need to be. I rely on his assurances about Explanatory Notes, because they will be important, but I do not move Motion B1.

Motion B1 not moved.
--- Later in debate ---
Lord Clement-Jones (LD)

My Lords, I support Motion E1 and pay fulsome tribute to the noble Lord, Lord Moynihan, for his expertise and tenacity. Thanks to his efforts and those of Sharon Hodgson MP, and after a long campaign with the All-Party Group on Ticket Abuse, we were able to include certain consumer protections in the ticketing market in the Consumer Rights Act 2015. The noble Lord’s amendment on Report sought to introduce additional regulatory requirements on secondary ticketing sites for proof of purchase, ticket limits and the provision of information on the face of tickets. That would have secured greater protection for consumers and avoided market exploitation, which is currently exponentially growing on platforms such as viagogo.

As we have heard, the Ministers—the noble Lord, Lord Offord, and the noble Viscount, Lord Camrose—in their letter of 1 May to noble Lords, offered a review that would take place over nine months, which would make recommendations for Ministers to consider. But that is simply not enough, as the noble Lord, Lord Moynihan, has demonstrated. The Minister, the noble Lord, Lord Offord, seems to believe from his own experience—unlike the rest of us—that everything is fine with the secondary market and that the answer to any problem lies in the hands of the primary ticket sellers. However, the noble Lord, Lord Moynihan, in his brilliantly expert way, demonstrated extremely cogently how that is absolutely not the case for the Minister’s favourite sports of rugby and football, where the secondary resellers are flagrantly breaking the law.

Deepfakes: General Election

Wednesday 8th May 2024

Lords Chamber
Viscount Camrose (Con)

Indeed—and let me first thank my noble friend for bringing up this important matter. That sounds to me like something that would be likely to be applied under the false communications offence in the Online Safety Act—Section 179—although I would not be able to say for sure. The tests that it would need to meet are that the information would have to be knowingly false and cause non-trivial physical or psychological harm to those offended, but that would seem to be the relevant offence.

Lord Clement-Jones (LD)

My Lords, does not the Question from the noble Baroness, Lady Jones, highlight that we must hold to account with legal liability not only those who create this kind of deepfake content and facilitate its spread, but those who enable the production of deepfakes with software, such as by having standards and risk-based regulation for generative AI systems, which the Government in their White Paper have resolutely refused to do?

Viscount Camrose (Con)

The Government set out in their White Paper response that off-the-shelf AI software that can in part be used to create these kinds of deepfakes is not, in and of itself, something that we are considering placing any ban on. However, there are ranges of software, a sort of middle layer to the AI production, that can greatly facilitate the production of deepfakes of all kinds, not just political but other kinds of criminal deepfakes—and there the Government would be actively considering moving against those purpose-built criminal tools.

UK Finance also emphasises that the one-off aspect of these proposals is bad. It is unhappy that it is a one-off; this should be part of an overall strategy to deal with fraud and financial misunderstandings within the sector. Just picking it off as one particular aspect, when it is a much wider issue, is a matter of concern to it. It is also concerned—perhaps this is something I would urge my noble friend to think about when we come back to this issue on Report, which I am sure we will—that charities and social organisations that represent people who are less able because of income or background to cope with these issues will be involved in the consultation on this code of practice. I am totally in favour of my noble friend’s proposals, but I suggest that consultation needs to go somewhat wider than the list in the amendment.
Lord Clement-Jones (LD)

My Lords, it has been a privilege to be at the ringside during these three groups. I think the noble Baroness, Lady Sherlock, is well ahead on points and that, when we last left the Minister, he was on the ropes, so I hope that to avoid the knockout he comes up with some pretty good responses today, especially as we have been lucky enough to have the pleasure of reading Hansard between the second and third groups. I think the best phrase the noble Baroness had was the “astonishing breadth” of Clause 128 and Schedule 11 that we explored with horror last time. I very much support what she says.

The current provisions seem to make the code non-mandatory, yet we discovered they are without “reasonable suspicion”, the words that are in the national security legislation—fancy having the Home Office as our model in these circumstances. Does that not put the DWP to shame? If we have to base best practice on the Home Office, we are in deep trouble.

That aside, we talked about “filtering” and “signals” last time. The Minister used that phrase twice, I think, and we discovered about “test and learn”. Will all that be included in the code?

All this points to the fragility and breadth of this schedule. It has been dreamt up in an extraordinarily expansive way without considering all the points that the noble Lord, Lord Anderson, has mentioned, including the KC’s opinion, all of which point to the fact that this schedule is going to infringe Article 8 of the European Convention on Human Rights. I hope the Minister comes up with some pretty good arguments.

My final question relates to the impact assessment–or non-impact assessment. The Minister talked about the estimate of DWP fraud, which is £6.4 billion. What does the DWP estimate it will be after these powers are implemented, if they are ever implemented? Should we not have an idea of the DWP’s ambitions in this respect?

The Parliamentary Under-Secretary of State, Department for Work and Pensions (Viscount Younger of Leckie) (Con)

My Lords, this has been a somewhat shorter debate than we have been used to, bearing in mind Monday’s experience. As with the first two groups debated then, many contributions have been made today and I will of course aim to answer as many questions as I can. I should say that, on this group, the Committee is primarily focusing on the amendments brought forward by the noble Baroness, Lady Sherlock, and I will certainly do my very best to answer her questions.

From the debate that we have had on this measure, I believe that there is agreement in the Committee that we must do more to clamp down on benefit fraud. That is surely something on which we can agree. In 2022-23, £8.3 billion was overpaid due to fraud and error in the benefit system. We must tackle fraud and error and ensure that benefits are paid to those genuinely entitled to the help. These powers are key to ensuring that we can do this.

I will start by answering a question raised by the noble Lord, Lord Anderson—I welcome him to the Committee for the first time today. He described himself as a “surveillance nerd”, but perhaps I can entreat him to rename himself a “data-gathering nerd”. As I said on Monday, this is not a surveillance power and suggesting that it is simply causes unnecessary worry. This is a power that enables better data gathering; it is not a surveillance or investigation power.

The third-party data measure does not allow the DWP to see how claimants spend their money, nor does it give the DWP access to millions of people’s bank accounts, as has been inaccurately presented. When the DWP examines the data that it receives from third parties, this data may suggest that there is fraud or error and require a further review. This will be done through our normal, regular, business-as-usual processes to determine whether incorrect payments are indeed being made. This approach is not new. As alluded to in this debate, through the Finance Act 2011, Parliament has already determined that this type of power is proportionate and appropriate, as HMRC already owns similar powers regarding banking institutions and third parties in relation to all taxpayers.

I listened very carefully to the noble Lord and will, however, take back his points and refer again to our own legal team. I think the point was made about the legality of all this. It is a very important point that he has made with all his experience, and I will take it back and reflect on it.

--- Later in debate ---
Viscount Younger of Leckie (Con)

That is a very fair question, and I hope that I understand it correctly. I can say that the limit for the DWP is that it can gain only from what the third party produces. Whatever goes on behind the doors of the third party is for them and not us. Whether there is a related account and how best to operate is a matter for the bank to decide. We may therefore end up getting very limited information, in terms of the limits of our powers. I hope that helps, but I will add some more detail in the letter.

Lord Clement-Jones (LD)

My Lords, the Minister extolled the green-rated nature of this impact assessment. In the midst of all that, did he answer my question?

Viscount Younger of Leckie (Con)

I need to be reminded of the question.

Lord Clement-Jones (LD)

I asked about the amount of fraud that the Government plan to detect, on top of the £6.4 billion in welfare overpayments that was detected last year.

Viscount Younger of Leckie (Con)

The figure that we have is £600 million but, again, I will reflect on the actual question that we are looking to address—the actual amount of fraud in the system.

Lord Clement-Jones (LD)

The Minister is saying that that figure is not to be found in this green-rated impact assessment, which most of us find to be completely opaque.

Viscount Younger of Leckie (Con)

I will certainly take that back, but it is green rated.

--- Later in debate ---
Lord Clement-Jones (LD)

My Lords, we have talked about proportionality and disproportionality throughout the debate on this Bill. Is it not extraordinary that that figure is not on the table, given the extent of these powers?

Lord Vaux of Harrowden (CB)

My Lords, the Minister was kind enough to mention me a little earlier. Can I just follow up on that? In the impact assessment, which I have here, nowhere can I find the £600 million figure, nor can I find anywhere the costs related to this. There will be a burden on the banks and clearly quite a burden on the DWP, actually, if it has got to trawl through this information, as the noble Viscount says, using people rather than machines. The costs are going to be enormous to save, it would appear, up to £120 million per year out of £6.4 billion per year of fraud. It does seem odd. It would be really helpful to have those cost numbers and to understand in what document they are, because I cannot find in the impact assessment where these numbers are.

Viscount Younger of Leckie (Con)

I hope I can help both noble Lords. Although I must admit that I have not read every single page, I understand that the figure of £500 million is in the IA.

Lord Clement-Jones (LD)

Did the Minister say £500 million?

Viscount Younger of Leckie (Con)

Yes, £500 million. I mentioned £600 million altogether; that was mentioned by the OBR, which had certified this, and by the way, that figure was in the Autumn Statement.

Lord Clement-Jones (LD)

My Lords, has not that demonstrated the disproportionality of these measures?

Baroness Jones of Whitchurch (Lab)

The noble Viscount explained in response to the noble Lord, Lord Anderson, that at every stage where the powers are going to be expanded, it would come back as an affirmative regulation. I might have been a bit slow about this, but I have been having a look and I cannot see where it says that. Perhaps he could point that out to me, because that would provide some reassurance that each stage of this is coming back to us.

Viscount Younger of Leckie (Con)

I understand, very quickly, that it is in paragraph 1(1), but again, in the interests of time, maybe we could talk about that outside the Room.

Lord Clement-Jones (LD)

Could the Minister clarify: was that paragraph 1(1)?

Viscount Younger of Leckie (Con)

I can reassure the noble Lord that that is the case, yes.

Viscount Younger of Leckie (Con)

I reassure noble Lords that is correct—it is paragraph 1(1). It may be rather complex, but it is in there, just to reassure all noble Lords.

Lord Clement-Jones (LD)

I am sorry to keep coming back, but did the Minister give us the paragraph in the impact assessment that referred to £500 million?

Viscount Younger of Leckie (Con)

No, I did not, but that is something which surely we can deal with outside the Room. However, I can assure noble Lords that it is in there.

--- Later in debate ---
Lord Allan of Hallam (LD)

My Lords, I want briefly to contribute to this debate, which I think is somewhat less contentious than the previous group of amendments. As somebody, again, who was working on the Online Safety Act all the way through, I really just pay tribute to the tenacity of the noble Baroness, Lady Kidron, for pursuing this detail—it is a really important detail. We otherwise risk, having passed the legislation, ending up in scenarios where everyone would know that it was correct for the data-gathering powers to be implemented but, just because of the wording of the law, they would not kick in when it was necessary. I therefore really want to thank the noble Baroness, Lady Kidron, for being persistent with it, and I congratulate the Government on recognising that, when there is an irresistible force, it is better to be a movable object than an immovable one.

I credit the noble Viscount the Minister for tabling these amendments today. As I say, I think that this is something that can pass more quickly because there is broad agreement around the Committee that this is necessary. It will not take away the pain of families who are in those circumstances, but it will certainly help coroners get to the truth when a tragic incident has occurred, whatever the nature of that tragic incident.

Lord Clement-Jones (LD)

My Lords, having been involved in and seen the campaigning of the bereaved families and the noble Baroness, Lady Kidron, in particular in the Joint Committee on the Draft Online Safety Bill onwards, I associate myself entirely with the noble Baroness’s statement and with my noble friend Lord Allan’s remarks.

Lord Leong (Lab)

My Lords, I thank the Minister for setting out the amendment and all noble Lords who spoke. I am sure the Minister will be pleased to hear that we support his Amendment 236 and his Amendment 237, to which the noble Baroness, Lady Kidron, has added her name.

Amendment 236 is a technical amendment. It seeks the straightforward deletion of words from a clause, accounting for the fact that investigations by a coroner, or procurator fiscal in Scotland, must start upon them being notified of the death of a child. The words

“or are due to conduct an investigation”

are indeed superfluous.

We also support Amendment 237. The deletion of this part of the clause would bring into effect a material change. It would empower Ofcom to issue a notice to an internet service provider to retain information in all cases of a child’s death, not just cases of suspected suicide. Sadly, as many of us have discovered in the course of our work on this Bill, there is an increasing number of ways in which communication online can be directly or indirectly linked to a child’s death. These include areas of material that is appropriate for adults only; the inability to filter harmful information, which may adversely affect mental health and decision-making; and, of course, the deliberate targeting of children by adults and, in some cases, by other children.

There are adults who use the internet with the intention of doing harm to children through coercion, grooming or abuse. What initially starts online can lead to contact in person. Often, this will lead to a criminal investigation, but, even if it does not, the changes proposed by this amendment could help prevent additional tragic deaths of children, not just those caused by suspected child suicides. If the investigating authorities have access to online communications that may have been a contributing factor in a child’s death, additional areas of concern can be identified by organisations and individuals with responsibility for children’s welfare and action taken to save many other young lives.

Before I sit down, I want to take this opportunity to say a big thank you to the noble Baroness, Lady Kidron, the noble Lord, Lord Kennedy, and all those who have campaigned on this issue relentlessly and brought it to our attention.

--- Later in debate ---
Lord Clement-Jones (LD)

My Lords, I will be brief because we very much support these amendments. Interestingly, Amendment 239 from the noble Baroness, Lady Jones, follows closely on from a Private Member’s Bill presented in November 2021 by the Minister’s colleague, Minister Saqib Bhatti, and before that by the right honourable Andrew Mitchell, who is also currently a Minister. The provenance of this is impeccable, so I hope that the Minister will accept Amendment 239 with alacrity.

We very much support Amendment 250. The UK Commission on Bereavement’s Bereavement is Everyone’s Business is a terrific report. We welcome Clause 133 but we think that improvements can be made. The amendment from the noble Baroness, which I have signed, will address two of the three recommendations that the report made on the Tell Us Once service. It said that there should be a review, which this amendment reflects. It also said that

“regulators must make sure bereaved customers are treated fairly and sensitively”

by developing minimum standards. We very much support that. It is fundamentally a useful service but, as the report shows, it can clearly be improved. I congratulate the noble Baroness, Lady Jones, on picking up the recommendations of the commission and putting them forward as amendments to this Bill.

Lord Harlech (Con)

My Lords, I declare an interest as someone who has been through the paper death registration process and grant of probate, which has something to do with why I am in your Lordships’ House, so I absolutely understand where the noble Baroness, Lady Jones of Whitchurch, is coming from. I thank her for tabling these amendments to Clauses 133 and 142. They would require the Secretary of State to commission a review with a view to creating a single digital register for the registration of births and deaths and to conduct a review of the Government’s Tell Us Once scheme.

Clause 133 reforms how births and deaths are registered in England and Wales by enabling a move from a paper-based system of birth and death registration to registration in a single electronic register. An electronic register is already in use alongside the paper registers and has been since 2009. Well-established safety and security measures and processes are already in place with regard to the electronic infrastructure, which have proven extremely secure in practice. I assure noble Lords that an impact assessment has been completed to consider all the impacts relating to the move to an electronic register, although it should be noted that marriages and civil partnerships are already registered electronically.

The strategic direction is to progressively reduce the reliance on paper and the amount of paper in use, as it is insecure and capable of being tampered with or forged. The creation of a single electronic register will remove the risk of registrars having to transmit loose-leaf register pages back to the register office when they are registering births and deaths at service points across the district. It will also minimise the risk of open paper registers being stolen from register offices.

The Covid-19 pandemic had unprecedented impacts on the delivery of registration services across England and Wales, and it highlighted the need to offer more choice in how births and deaths are registered in the future. The provisions in the Bill will allow for more flexibility in how births and deaths are registered—for example, registering deaths by telephone, as was the case during the pandemic. Over 1 million deaths were successfully registered under provisions in the Coronavirus Act 2020. This service was well received by the public, registrars and funeral services.

Measures will be put in place to ensure that the identity of an informant is established in line with Cabinet Office good practice guidance. This will ensure that information provided by informants can be verified or validated for the purposes of registering by telephone. For example, a medical certificate of cause of death issued by a registered medical practitioner would need to have been received by the registrar before an informant could register a death by telephone. Having to conduct a review, as was proposed by the noble Baroness, Lady Jones, would delay moving to digital ways of working and the benefits this would introduce.

--- Later in debate ---
For these reasons, I am not able to accept these amendments. I hope the noble Lord will therefore not press them. I beg to move Amendment 240.
Lord Clement-Jones (LD)

My Lords, I thank the Minister for his exposition. He explained the purposes of Clauses 138 to 141 and extolled their virtues, and helpfully explained what my amendments are trying to do—not that he has shot any foxes in the process.

The purpose of my amendments is much more fundamental, and that is to question the methodology of the Government in all of this. The purpose of NUAR is to prevent accidental strikes where building works damage underground infrastructure. However, the Government seem to have ignored the fact that an equivalent service—LinesearchbeforeUdig, or LSBUD—already achieves these aims, is much more widely used than NUAR and is much more cost effective. The existing system has been in place for more than 20 years and now includes data from more than 150 asset owners. It is used by 270,000 UK digging contractors and individuals—and more every day. The fact is that, without further consultation and greater alignment with current industry best practice, NUAR risks becoming a white elephant, undermining the safe working practices that have kept critical national infrastructure in the UK safe for more than two decades.

However, the essence of these amendments is not to cancel NUAR but to get NUAR and the Government to work much more closely with the services that already exist and those who wish to help. They are designed to ensure that proper consultation and democratic scrutiny is conducted before NUAR is implemented in statutory form. Essentially, the industry says that NUAR could be made much better and much quicker if it worked more closely with the private sector services that already exist. Those who are already involved with LinesearchbeforeUdig say, first of all, that NUAR will create uncertainty and reduce safety, failing in its key aims.

The Government have been developing the NUAR since 2018. Claiming that it would drive a reduction in unexpected underground assets being damaged in roadworks, the impact assessment incorrectly states:

“No businesses currently provide a service that is the same or similar to the service that NUAR would provide”.
In fact, as I said, LSBUD has been providing a safe digging service in the UK for 20 years and has grown significantly over that time. Without a plan to work more closely with LSBUD as the key industry representative, NUAR risks creating more accidental strikes of key network infrastructure, increasing risks to workers’ safety through electrical fires, gas leaks, pollution and so on. The public at home or at work would also suffer more service outages and disruption.

Secondly, NUAR will add costs and stifle competition. The Government claim that NUAR will deliver significant benefits to taxpayers, reduce disruption and prevent damage to underground assets, but the impact assessment ignores the fact that NUAR’s core functions are already provided through the current system—so its expected benefits are vastly overstated. While asset owners, many of whom have not been consulted, will face costs of more than £200 million over the first 10 years, the wholesale publication of asset owners’ entire networks creates risks around commercially sensitive data, damaging innovation and competition. Combined with the uncertainties about how quickly NUAR can gain a critical mass of users and data, this again calls into question why NUAR does not properly align with and build on the current system but instead smothers competition and harms a successful, growing UK business.

Thirdly, NUAR risks undermining control over sensitive CNI data. Underground assets are integral to critical national infrastructure; protecting them is vital to the UK’s economic and national security. LSBUD deliberately keeps data separate and ensures that data owners remain in full control over who can access their data via a secure exchange platform. NUAR, however, in aiming to provide a single view of all assets, removes providers’ control over their own data—an essential security fail-safe. It would also expand opportunities for malicious actors to target sectors in a variety of ways—for instance, the theft of copper wires from telecom networks.

NUAR shifts control over data access to a centralised government body, with no clear plan for how the data is to be protected from unauthorised access, leading to serious concerns about security and theft. Safe digging is paramount; mandating NUAR will lead to uncertainty, present more health and safety dangers to workers and the public and put critical national infrastructure at risk. These plans require further review. There needs to be, as I have said, greater alignment with industry best practice. Without further consultation, NUAR risks becoming a white elephant that undermines safe digging in the UK and increases risk to infrastructure workers and the public.

I will not go through the amendments individually as the Minister has mentioned what their effect would be, but I will dispel a few myths. The Government have claimed that NUAR has the overwhelming support of asset owners. In the view of those who briefed me, that is not an accurate reflection of the broadband and telecoms sector in particular; ISPA members have raised a number of concerns with the NUAR team around cost and security that have yet to be addressed. This is borne out by the notable gaps among the major asset owners in the telecoms sector signed up to NUAR at this time.

Clearly, the noble Viscount is resisting changing the procedure by which these changes are made from negative to affirmative, but I hope I have gone some way to persuade the Committee of the importance of this change to how the NUAR system is put on a statutory footing. He talked about a “handful” of data; the comprehensive nature of the existing system is pretty impressive, and it is a free service, updated on a regular basis, which covers more than 150 asset owners and 98% of high-risk assets. NUAR currently covers only one-third of asset owners. The comparisons are already not to the advantage of NUAR.

I hope the Government will, even if they do not agree with these amendments, at least think twice before proceeding at the speed they seem to be, without the consent of, or taking on board the concerns of, those who are already heavily engaged with LinesearchbeforeUdig and find it pretty satisfactory for their purposes.

Lord Bassam of Brighton (Lab)

My Lords, the Minister really did big up this section of the Bill. He said that it would revolutionise this information service, that it would bring many benefits, that it has a green rating, that it would be the Formula 1 of data transfer in mapping, and so on. We were led to expect quite a lot from this part of the legislation. It is an important part of the Bill, because it signifies some government progress towards the goal of creating a comprehensive national underground asset register, as he put it, or NUAR. We are happy to support this objective, but we have concerns about the progress being made and the time it is taking.

To digress a bit here, it took me back 50 years to when I was a labourer working by the side of a bypass. One of the guys I was working with was operating our post hole borer; it penetrated the Anglian Water system and sent a geyser some 20 metres up into the sky, completely destroying my midday retreat to the local pub between the arduous exercise of digging holes. Had he had one of the services on offer, I suspect that we would not have been so detained. It was quite an entertaining incident, but it clearly showed the dangers of not having good mapping.

As I understand it, and as was outlined by the noble Lord, Lord Clement-Jones, since 2018 the Government have been moving towards this notion of somewhere recording what lies below the surface in our communities. We have had street works legislation going back several decades, from at least 1991. In general, progress towards better co-ordination of utilities excavations has not been helped by poor and low levels of mapping and knowledge of what and which utilities are located underground. This is despite the various legislative attempts to make that happen, most of which have attempted to bring better co-ordination of services.

--- Later in debate ---
Viscount Camrose (Con)

I start by thanking the noble Lords, Lord Clement-Jones and Lord Bassam, for their respective replies. As I have said, the Geospatial Commission has been engaging extensively with stakeholders, including the security services, on NUAR since 2018. This has included a call for evidence, a pilot project, a public consultation, focus groups, various workshops and other interactions. All major gas and water companies have signed up, as well as several large telecoms firms.

Lord Clement-Jones (LD)

While the Minister is speaking, maybe the Box could tell him whether the figure of only 33% of asset owners having signed up is correct? Both I and the noble Lord, Lord Bassam, mentioned that; it would be very useful to know.

Viscount Camrose (Con)

It did complete a pilot phase this year. As it operationalises, more and more will sign up. I do not know the actual number that have signed up today, but I will find out.

NUAR does not duplicate existing commercial services. It is a standardised, interactive digital map of buried infrastructure, which no existing service is able to provide. It will significantly enhance data sharing and access efficiency. Current services—

Viscount Camrose (Con)

I am not sure that there is doubt over the current scope of NUAR; it is meant to address all buried infrastructure in the United Kingdom. LSBUD does make extensive representations, as indeed it has to parliamentarians of both Houses, and has spoken several times to the Geospatial Commission. I am very happy to commit to continuing to do so.

Lord Clement-Jones (LD)

My Lords, the noble Lord, Lord Bassam, is absolutely right to be asking that question. We can go only on the briefs we get. Unlike the noble Lord, Lord Bassam, I have not been underground very recently, but we do rely on the briefings we get. LSBUD is described as a

“sustainably-funded UK success story”—

okay, give or take a bit of puff—that

“responds to most requests in 5 minutes or less”.

It has

“150+ asset-owners covering nearly 2 million km and 98% of high-risk assets—like gas, electric, and fuel pipelines”.

That sounds as though we are in the same kind of territory. How can the Minister just baldly state that NUAR is entirely different? Can he perhaps give us a paragraph on how they differ? I do not think that “completely different” can possibly characterise this relationship.

Viscount Camrose (Con)

As I understand it, LSBUD services are provided as a PDF, on request. It is not interactive; it is not vector-based graphics presented on a map, so it cannot be interrogated in the same way. Furthermore, as I understand it—and I am happy to be corrected if I am misstating—LSBUD has a great many private sector asset owners, but no public sector data is provided. All of it is provided on a much more manual basis. The two services simply do not brook comparison. I would be delighted to speak to LSBUD.

Lord Clement-Jones (LD)

My Lords, we are beginning to tease out something quite useful here. Basically, NUAR will be pretty much an automatic service, because it will be available online, I assume, which has implications for data protection, for who owns the copyright and so on. I am sure there are all kinds of issues there. It is the way the service is delivered, and then you have the public sector, which has not taken part in LSBUD. Are those the two key distinctions?

Viscount Camrose (Con)

Indeed, there are two key distinctions. One is the way that the information is provided online, in a live format, and the other is the quantity and nature of the data that is provided, which will eventually be all relevant data in the United Kingdom under NUAR, versus those who choose to sign up on LSBUD and equivalent services. I am very happy to write on the various figures. Maybe it would help if I were to arrange a demonstration of the technology. Would that be useful? I will do that.

Lord Clement-Jones (LD)

Unlike the noble Lord, Lord Bassam, I do not have that background in seeing what happens with the excavators, but I would very much welcome that. The Minister again is really making the case for greater co-operation. The public sector has access to the public sector information, and LSBUD has access to a lot of private sector information. Does that not speak to co-operation between the two systems? We seem to have warring camps, where the Government are determined to prove that they are forging ahead with their new service and are trampling on quite a lot of rights, interests and concerns in doing so—by the sound of it. The Minister looks rather sceptical.

Viscount Camrose (Con)

I am not sure whose rights are being trampled on by having a shared database of these things. However, I will arrange a demonstration, and I confidently state that nobody who sees that demonstration will have any cynicism any more about the quality of the service provided.

Lord Clement-Jones (LD)

All I can say is that, in that case, the Minister has been worked on extremely well.

Viscount Camrose (Con)

In addition to the situation that the noble Lord, Lord Bassam, described, I was braced for a really horrible situation, because these things very often lead to danger and death, and there is a very serious safety argument to providing this information reliably and rapidly, as NUAR will.

--- Later in debate ---
The NUAR includes a number of safeguards to ensure that data is accessed only for permitted purposes under controlled conditions. This includes access controls, the ability of asset owners to flag particularly sensitive or critical data for redaction, and owners’ ability to specify additional safe working requirements for hazardous sites and assets, such as site supervision. These have been developed in collaboration with asset owners, security experts and the security services.
Lord Clement-Jones (LD)

Before the Minister’s peroration, I just want to check something. He talked about the discovery project and contact with the industry; by that, I assume he was talking about asset owners as part of the project. What contact is proposed with the existing company, LinesearchbeforeUdig, and some of its major supporters? Can the Government assure us that they will have greater contact or try to align? Can they give greater assurance than they have been able to give today? Clearly, there is suspicion here of the Government’s intentions and how things will work out. If we are to achieve this safety agenda—I absolutely support it; it is the fundamental issue here—more work needs to be done in building bridges, to use another construction metaphor.

Viscount Camrose (Con)

As I said, LSBUD has met the Geospatial Commission many times. I would be happy to meet it in order to help it adapt its business model for the NUAR future. As I said, it has attended the last three discovery workshops, allowing this data.

I close by thanking noble Lords for their contributions. I hope they look forward to the demonstration.

--- Later in debate ---
Lord Arbuthnot of Edrom (Con)

My Lords, I congratulate the noble Baroness, Lady Kidron, on her amendment and thank her for allowing me to add my name to it. I agree with what she said. I, too, had the benefit of a meeting with the Lord Chancellor, which was most helpful. I am grateful to Mr Paul Marshall—whom the noble Baroness mentioned and who has represented several sub-postmasters in the Horizon scandal—for his help and advice in this matter.

My first short point is that evidence derived from a computer is hearsay. There is good reason for treating hearsay evidence with caution. Computer scientists know—although the general public do not—that only the smallest and least complex computer programs can be tested exhaustively. I am told that the limit for that testing is probably around 100 lines of a well-designed and carefully written program. Horizon, which Mr Justice Fraser said was not in the least robust, consisted of a suite of programs involving millions of lines of code. It will inevitably have contained thousands of errors because all computer programs do. Most computer errors do not routinely cause malfunctions. If they did, they would be spotted at an early stage and the program would be changed—but potentially with consequential changes to the program that might not be intended or spotted.

We are all aware of how frequently we are invited to accept software updates from our mobile telephone’s software manufacturers. Those updates are not limited to security chinks but are also required because bugs—or, as we learned yesterday from Paula Vennells’s husband, anomalies and exceptions—are inevitable in computer programs. That is why Fujitsu had an office dedicated not just to altering the sub-postmasters’ balances, shocking as that is, but to altering and amending a program that was never going to be perfect because no computer program is.

The only conclusion that one can draw from all this is that computer programs are, as the noble Baroness said, inherently unreliable, such that having a presumption in law that they are reliable is unsustainable. In the case of the DPP v McKeown and Jones—in 1997, I think—Lord Hoffmann said:

“It is notorious that one needs no expertise in electronics to be able to know whether a computer is working properly”.
One must always hesitate before questioning the wisdom of a man as clever as Lord Hoffmann, but he was wrong. The notoriety now attaches to his comment.

The consequence of the repeal of Section 69 of the Police and Criminal Evidence Act 1984 has been to reduce the burden of proof, so that Seema Misra was sent to prison in the circumstances set out by the noble Baroness. Further, this matter is urgent for two reasons; they slightly conflict with each other, but I will nevertheless set them out. The first is that, for every minute the presumption remains in place, there is a genuine risk that miscarriages of justice will continue to occur in other, non-Post Office cases, from as early as tomorrow. The second is that any defence lawyer will, in any event, be treating the presumption as having been fatally undermined by the Horizon issues. The presumption will therefore be questioned in every court where it might otherwise apply. It needs consideration by Parliament.

My noble friend the Minister will say, and he will be right, that the Horizon case was a disgraceful failure of disclosure by the Post Office. But it was permitted by the presumption of the correctness of computer evidence, which I hope we have shown is unsustainable. Part of the solution to the problem may lie in changes to disclosure and discovery, but we cannot permit a presumption that we know to be unfounded to continue in law.

My noble friend may also go on to say that our amendment is flawed in that it will place impossible burdens on prosecutors, requiring them to get constant certificates of proper working from Microsoft, Google, WhatsApp, and whatever Twitter is called nowadays. Again, he may be right. We do not seek to bring prosecutions grinding to a halt, nor do we seek to question the underlying integrity of our email or communications systems, so we may need another way through this problem. Luckily, my noble friend is a very clever man, and I look forward to hearing what he proposes.

Lord Clement-Jones (LD)

My Lords, we have heard two extremely powerful speeches; I will follow in their wake but be very brief. For many years now, I have campaigned on amending the Computer Misuse Act; the noble Lord, Lord Arbuthnot, has done similarly. My motivation did not start with the Horizon scandal but was more at large, because of underlying concerns about the nature of computer evidence.

I came rather late to this understanding about the presumption of the accuracy of computer evidence. The more one looks into the history of this, which has been so well set out by the noble Baroness, Lady Kidron, the more horrifying it becomes. I remember advising MPs at the time about the Police and Criminal Evidence Act. I was not really aware of what the Law Commission had recommended in terms of getting rid of Section 69, or indeed what the Youth Justice and Criminal Evidence Act did in 1999, a year after I came into this House.

The noble Baroness has set out the history of it, and how badly wrong the Law Commission got this. She set out extremely well the impact and illustration of Mrs Misra’s case, the injustice that has resulted through the Horizon cases—indeed, not just through those cases, but through other areas—and the whole aspect of the reliability of computer evidence. Likewise, we must all pay tribute to the tireless campaigning of the noble Lord, Lord Arbuthnot. I thought it was really interesting how he described computer evidence as hearsay, because that essentially is what it is, and there is the whole issue of updates and bug fixing.

The one area that I am slightly uncertain about after listening to the debate and having read some of the background to this is precisely what impact Mr Justice Fraser’s judgment had. Some people seem to have taken it as simply saying that the computer evidence was unreliable, but that it was a one-off. It seems to me that it was much more sweeping than that and was really a rebuttal of the original view the Law Commission took on the reliability of computer evidence.

--- Later in debate ---
Viscount Camrose (Con)

My Lords, I recognise the feeling of the Committee on this issue and, frankly, I recognise the feeling of the whole country with respect to Horizon. I thank all those who have spoken for a really enlightening debate. I thank the noble Baroness, Lady Kidron, for tabling the amendment and my noble friend Lord Arbuthnot for speaking to it and—if I may depart from the script—his heroic behaviour with respect to the sub-postmasters.

There can be no doubt that hundreds of innocent sub-postmasters and sub-postmistresses have suffered an intolerable miscarriage of justice at the hands of the Post Office. I hope noble Lords will indulge me if I speak very briefly on that. On 13 March, the Government introduced the Post Office (Horizon System) Offences Bill into Parliament, which is due to go before a Committee of the whole House in the House of Commons on 29 April. The Bill will quash relevant convictions of individuals who worked, including on a voluntary basis, in Post Office branches and who have suffered as a result of the Post Office Horizon IT scandal. It will quash, on a blanket basis, convictions for various theft, fraud and related offences during the period of the Horizon scandal in England, Wales and Northern Ireland. This is to be followed by swift financial redress delivered by the Department for Business and Trade.

On the amendment laid by the noble Baroness, Lady Kidron—I thank her and the noble Lords who have supported it—I fully understand the intent behind this amendment, which aims to address issues with computer evidence such as those arising from the Post Office cases. The common law presumption, as has been said, is that the computer which has produced evidence in a case was operating effectively at the material time unless there is evidence to the contrary, in which case the party relying on the computer evidence will need to satisfy the court that the evidence is reliable and therefore admissible.

This amendment would require a party relying on computer evidence to provide proof up front that the computer was operating effectively at the time and that there is no evidence of improper use. I and my fellow Ministers, including those at the MoJ, understand the intent behind this amendment, and we are considering very carefully the issues raised by the Post Office cases in relation to computer evidence, including these wider concerns. So I would welcome the opportunity for further meetings with the noble Baroness, alongside MoJ colleagues. I was pleased to hear that she had met with my right honourable friend the Lord Chancellor on this matter.

We are considering, for example, the way reliability of evidence from the Horizon system was presented, how failures of investigation and disclosure prevented that evidence from being effectively challenged, and the lack of corroborating evidence in many cases. These issues need to be considered carefully, with the full facts in front of us. Sir Wyn Williams is examining in detail the failings that led to the Post Office scandal. These issues are not straightforward. The prosecution of those cases relied on assertions that the Horizon system was accurate and reliable, which the Post Office knew to be wrong. This was supported by expert evidence, which it knew to be misleading. The issue was that the Post Office chose to withhold the fact that the computer evidence itself was wrong.

This amendment would also have a significant impact on the criminal justice system. Almost all criminal cases rely on computer evidence to some extent, so any change to the burden of proof would or could impede the work of the Crown Prosecution Service and other prosecutors.

Although I am not able to accept this amendment for these reasons, I share the desire to find an appropriate way forward along with my colleagues at the Ministry of Justice, who will bear the brunt of this work, as the noble Lord, Lord Clement-Jones, alluded to. I look forward to meeting the noble Baroness to discuss this ahead of Report. Meanwhile, I hope she will withdraw her amendment.

Lord Clement-Jones (LD)

Can the Minister pass on the following suggestion? Paul Marshall, who has been mentioned by all of us, is absolutely au fait with the exact procedure. He has experience of how it has worked in practice, and he has made some constructive suggestions. If there is not a full return to Section 69, there could be other, more nuanced, ways of doing this, meeting the Minister’s objections. But can I suggest that the MoJ has contact with him and discusses what the best way forward would be? He has been writing about this for some years now, and it would be extremely useful, if the MoJ has not already engaged with him, to do so.

--- Later in debate ---
Moved by
254: Schedule 15, page 278, line 17, leave out “Secretary of State” and insert “person who chairs the relevant Parliamentary committee”
Lord Clement-Jones (LD)

My Lords, I am afraid that I will speak to every single one of the amendments in this group but one, which is in the name of the noble Baroness, Lady Jones, and which I have signed. We have already debated the Secretary of State’s powers in relation to what will be the commission, in setting strategic priorities for the commissioner under Clause 32 and recommending the adoption of the ICO code of practice before it is submitted to Parliament for consideration under Clause 33:

“Codes of practice for processing personal data”.


We have also debated Clause 34:

“Codes of practice: panels and impact assessments”.


And we have debated Clause 35:

“Codes of practice: Secretary of State’s recommendations”.


The Secretary of State has considerable power in relation to the new commission, and then on top of that Clause 143 and Schedule 15 to the Bill provide significant other powers for the Secretary of State to interfere with the objective and impartial functioning of the information commission by the appointment of non-executive members of the newly formed commission. The guarantee of the independence of the ICO is intended to ensure the effectiveness and reliability of its regulatory function and that the monitoring and enforcement of data protection laws are carried out objectively and free from partisan or extra-legal considerations.

These amendments would limit the Secretary of State’s powers and leeway to interfere with the objective and impartial functioning of the new information commission, in particular by modifying Schedule 15 to the Bill to transfer budget responsibility and the appointment process for the non-executive members of the information commission to the relevant Select Committee. If so amended, the Bill would ensure that the new information commission has sufficient arm’s-length distance from the Government to oversee public and private bodies’ uses of personal data with impartiality and objectivity. DSIT’s delegated powers memorandum to the DPRRC barely mentions any of these powers, so I am not surprised that they have attracted so little attention, but they are very significant.

We have discussed data adequacy before; of course, in his letter to us, the Minister tried to rebut some of the points we made about it. In fact, he quoted somebody who has briefed me extensively on it and has taken a very different view to the one he alleges she took in a rather partial quotation from evidence taken by the European Affairs Committee, which is now conducting an inquiry into data adequacy and its implications for the UK-EU relationship. We were told by Open Rights Group attendees at a recent meeting with the European Commission that it expressed concern to those present about the risk that the Bill poses to the EU adequacy agreement; this was not under Chatham House rules. It expressed this risk in a meeting at which a number of UK groups were present, which is highly significant in itself.

I mentioned the European Affairs Committee’s inquiry. I understand that the European Parliament’s Committee on Civil Liberties, Justice and Home Affairs has also given written evidence setting out its concerns about this Bill, its impact on adequacy and how it could affect the agreement. It put its arguments rather strongly. Has the Minister seen this? Is he aware of the written evidence that it has given to the European Affairs Select Committee? I suggest that he becomes aware of it and takes a view on whether we need to postpone Report until we have seen the European Affairs Select Committee’s report. If the committee concludes that data adequacy is at risk, the Government will have to go back to the drawing board in a number of respects on this Bill, and it would be rather foolish if we had already gone through Report by that time. Far be it from me not to want the Government to have egg on their face, but it would be peculiar if they did not carefully observe the evidence being put to the European Affairs Select Committee and the progress that it is making in its inquiry. I beg to move.

Baroness Jones of Whitchurch (Lab)

My Lords, I thank the noble Lord, Lord Clement-Jones, for introducing his amendments so ably. When I read them, I had a strong sense of déjà vu, as attempts by the Government to control the appointments and functioning of new regulators have been a common theme in other pieces of legislation that we have debated in the House and which we have always resisted. In my experience, this occurred most recently in the Government’s proposals for the Office for Environmental Protection, which deals with EU environmental legislation taken into UK law and is effectively the environment regulator. We were able to get those proposals modified to limit the Secretary of State’s involvement; we should do so again here.

I very much welcome the noble Lord’s amendments, which give us a chance to assess what level of independence would be appropriate in this case. Schedule 15 covers the transition from the Information Commissioner’s Office to the appointment of the chair and non-executive members of the new information commission. We support this development in principle but it is crucial that the new arrangements strengthen rather than weaken the independence of the new commission.

The noble Lord’s amendments would rightly remove the rights of the Secretary of State to decide the number of non-executive members and to appoint them. Instead, his amendments propose that the chair of the relevant parliamentary committee should oversee appointments. Similarly, the amendments would remove the right of the Secretary of State to recommend the appointment and removal of the chair; again, this should be passed to the relevant parliamentary committee. We agree with these proposals, which would build in an additional tier of parliamentary oversight and help remove any suspicion that the Secretary of State is exercising unwarranted political pressure on the new commission.

The noble Lord’s amendments raise the question of what the relevant parliamentary committee might be. Although we are supportive of the wording as it stands, it is regrettable that we have not been able to make more progress on establishing a strong bicameral parliamentary committee to oversee the work of the information commission. However, in the absence of such a committee, we welcome the suggestion made in the noble Lord’s Amendment 256 that the Commons Science, Innovation and Technology Committee could fulfil that role.

Finally, we have tabled Amendment 259, which addresses what is commonly known as the “revolving door” whereby public sector staff switch to jobs in the private sector and end up working for industries that they were supposedly investigating and regulating previously. This leads to accusations of cronyism and corruption; whether or not there is any evidence of this, it brings the reputation of the whole sector into disrepute. Perhaps I should have declared an interest at the outset: I am a member of the Advisory Committee on Business Appointments and therefore have a ringside view of the scale of the revolving door taking place, particularly at the moment. We believe that it is time to put standards in public life back at the heart of public service; setting new standards on switching sides should be part of that. Our amendment would put a two-year ban on members of the information commission accepting employment from a business that was subject to enforcement action or acting for persons who are being investigated by the agency.

I hope that noble Lords will see the sense and importance of these amendments. I look forward to the Minister’s response.

--- Later in debate ---
For these reasons, I hope noble Lords will be content to withdraw their amendments.
Lord Clement-Jones (LD)

My Lords, I thank the Minister for his response, dusty though it may have been. The noble Baroness, Lady Jones, is absolutely right; this Government have form in all areas of regulation. In every area where we have had legislation related to a regulator coming down the track, the Government have taken more power on and diminished parliamentary oversight rather than enhancing it.

It is therefore a little rich to say that accountability to Parliament is the essence of all this. That is not the impression one gets reading the data protection Bill; the impression you get is that the Government are tightening the screw on the regulator. That was the case with Ofcom in the Online Safety Act; it is the case with the CMA; and the noble Baroness, Lady Jones, mentioned her experience as regards the environment. Wherever you look, the Government are tightening their control over the regulators. It is something the Industry and Regulators Committee has been concerned about. We have tried to suggest various formulae. A Joint Committee of both Houses was proposed by the Communications and Digital Committee; it has been endorsed by a number of other committees, such as the Joint Committee on the Draft Online Safety Bill, and I think it has even been commended by the Industry and Regulators Committee in that respect.

We need to crack this one. On the issue of parliamentary accountability for the regulator and oversight, the balance is not currently right. That applies particularly in terms of appointments, in this case of the commissioner and the non-executives. The Minister very conveniently talked about removal but this could be about renewal of term, and it is certainly about appointment. So maybe the Minister was a little bit selective with the example he chose to say where the control was.

We are concerned about the independence of the regulator. The Minister did not give an answer, so I hope that he will write about whether he knows what the European Affairs Select Committee is up to. I made a bit of a case on that. Evidence is coming in, and the relevant committee in the European Parliament is giving evidence. The Minister, the noble Viscount, Lord Camrose, was guilty of this in a way, but the view of the data adequacy aspect from this side of the North Sea seems rather selective. The Government need to try to put themselves in the position of the Commission and the Parliament on the other side of the North Sea and ask, “What do we think are the factors that will endanger our data adequacy as seen from that side?” The Government are being overly complacent in regarding it as “safe” once the Bill goes through.

It was very interesting to hear what the noble Baroness had to say about the revolving door issues. The notable thing about this amendment is how limited it is; it is not blanket. It would be entirely appropriate to have this in legislation, given the sensitivity of the roles that are carried out by senior people at the ICO.

However, I think we want to make more progress tonight, so I beg leave to withdraw my amendment.

Amendment 254 withdrawn.
--- Later in debate ---
As I said at the outset, it was my determined wish that the Government deal with this issue quickly, seamlessly and relatively privately, but they have not. Although I will listen very carefully to the Minister when he replies, I make utterly clear that this is an issue that urgently needs resolving. If we cannot do so in Committee, I intend to draw the importance of the issue to the attention of noble Lords who are not following our proceedings and ask them to support its inclusion in the Bill. I beg to move.
Lord Clement-Jones (LD)

My Lords, as ever, the noble Baroness, Lady Kidron, has nailed this issue. She has campaigned tirelessly in the field of child sexual abuse and has identified a major loophole.

What has been so important is learning from experience and seeing how these new generative AI models, which we have all had to come to terms with over the past 18 months, are so powerful in the hands of ordinary people who want to cause harm and commit sexual abuse. The important thing is that, under existing legislation, there are of course a number of provisions relating to creating deepfake child pornography, the circulation of pornographic deepfakes and so on. However, as the noble Baroness said, what the legislation does not do is go upstream to the AI system—the AI model itself—to make sure that those who develop those models are caught as well. That is what a lot of the discussion around deepfakes is about at the moment—it is, I would say, the most pressing issue—but it is also about trying to nail those AI system owners and users at the very outset, not waiting until something is circulated or, indeed, created in the first place. We need to get right up there at the outset.

I very much support what the noble Baroness said; I will reserve any other remarks for the next group of amendments.

Baroness Jones of Whitchurch (Lab)

My Lords, I am pleased that we were able to sign this amendment. Once again, the noble Baroness, Lady Kidron, has demonstrated her acute ability to dissect and to make a brilliant argument about why an amendment is so important.

As the noble Lord, Lord Clement-Jones, and others have said previously, what is the point of this Bill? Passing this amendment and putting these new offences on the statute book would give the Bill the purpose and clout that it has so far lacked. As the noble Baroness, Lady Kidron, has made clear, although it is currently an offence to possess or distribute child sex abuse material, it is not an offence to create these images artificially using AI techniques. So, quite innocent images of a child—or even an adult—can be manipulated to create child sex abuse imagery, pornography and degrading or violent scenarios. As the noble Baroness pointed out, this could be your child or a neighbour’s child being depicted for sexual gratification by the increasingly sophisticated AI creators of these digital models or files.

Yesterday’s report from the Internet Watch Foundation said that a manual found on the dark web encourages the use of “nudifying” tools to remove clothes from images of children, which can then be used to blackmail those children into sending more graphic content. The IWF reports that the scale of this abuse is increasing year on year, with 275,000 web pages containing child sex abuse found last year; I suspect that this is the tip of the iceberg, as much of this activity is occurring on the dark web, which is very difficult to track. The noble Baroness, Lady Kidron, made a powerful point: there is a danger that access to such materials will also encourage offenders who then want to participate in real-world child sex abuse, so the scale of the horror could be multiplied. There are many reasons why these trends are shocking and abhorrent. It seems that, as ever, the offenders are one step ahead of the legislation needed for police enforcers to close down this trade.

As the noble Baroness, Lady Kidron, made clear, this amendment is “laser focused” on criminalising those who are developing and using AI to create these images. I am pleased to say that Labour is already working on a ban on creating so-called nudification tools. The prevalence of deepfakes and child abuse on the internet is increasing the public’s fear of the overall safety of AI, so we need to win their trust back if we are to harness the undoubted benefits that it can deliver to our public services and economy. Tackling this area is one step towards that.

Action to regulate AI by requiring transparency and safety reports from all those at the forefront of AI development should be a key part of that strategy, but we have a particular task to do here. In the meantime, this amendment is an opportunity for the Government to take a lead on these very specific proposals to help clean up the web and rid us of these vile crimes. I hope the Minister can confirm that this amendment, or a government amendment along the same lines, will be included in the Bill. I look forward to his response.

--- Later in debate ---
Moved by
293: After Clause 149, insert the following new Clause—
“Deepfakes depicting sexual offences or activity without consent
(1) It is an offence for a person to intentionally create, alter, or otherwise generate a deepfake depicting an intimate act.
(2) A person is not guilty of an offence by virtue of subsection (1) if they show that the person or persons, being over the age of 18, depicted in the deepfake provided consent for the creation, alteration or generation of the deepfake.
(3) Offences under this section are punishable either on conviction on indictment or on summary conviction.
(4) A person convicted on indictment of an offence under this section is liable to imprisonment for a term of not more than ten years, or to a fine not exceeding the prescribed sum for the purposes of this Act, or to both.
(5) A person convicted summarily of an offence under this section is liable—
(a) to imprisonment for a term not exceeding six months; or
(b) to a fine not exceeding the prescribed sum for the purposes of this Act.
(6) The Secretary of State must by regulations prescribe the sum for the purposes of subsections (4) and (5).
(7) Regulations made under subsection (6) are subject to the affirmative procedure.”
Member's explanatory statement
This amendment would make it an offence to intentionally generate a deepfake depicting activity without consent.
Lord Clement-Jones (LD)

My Lords, I will speak to all the amendments in this group, other than Amendment 295 from the noble Baroness, Lady Jones. Without stealing her thunder, I very much support it, especially in an election year and in the light of the deepfakes we have already seen in the political arena—those of Sadiq Khan, those used in the Slovakian election and the audio deepfakes of the President of the US and Sir Keir Starmer. This is a real issue and I am delighted that she has put down this amendment, which I have signed.

In another part of the forest, the recent spread of deepfake photos purporting to show Taylor Swift engaged in explicit acts has brought new attention to the use, which has been growing in recent years, of deepfake images, video and audio to harass women and commit fraud. Women constitute 99% of the victims and the most visited deepfake site had 111 million users in October 2023. More recently, children have been found using “declothing” apps, which I think the noble Baroness mentioned, to create explicit deepfakes of other children.

Deepfakes also present a growing threat to elections and democracy, as I have mentioned, and the problems are increasingly rampant. Deepfake fraud rates rose by 3,000% globally in 2023, and it is hardly surprising that, in recent polling, 86% of the UK population supported a ban on deepfakes. I believe that the public are demanding an urgent solution to this problem. The only effective way to stop deepfakes, which is analogous to what the noble Baroness, Lady Kidron, has been so passionately advocating, is for the Government to ban them at every stage, from production to distribution. Legal liability must hold to account those who produce deepfake technology, create and enable deepfake content, and facilitate its spread.

Existing legislation seeks to limit the spread of images on social media, but this is not enough. The recent images of Taylor Swift were removed from X and Telegram, but not before one picture had been viewed more than 47 million times. Digital watermarks are not a solution, as shown by a paper by world-leading AI researchers released in 2023, which concluded that

“strong and robust watermarking is impossible to achieve”.

Without measures across the supply chain to prevent the creation of deepfakes, the law will forever be playing catch-up.

The Government now intend to ban the creation of sexual imagery deepfakes; I welcome this and have their announcement in my hand:

“Government cracks down on ‘deepfakes’ creation”.


This will send a clear message that the creation of these intimate images is not acceptable. However, this appears to cover only sexual image deepfakes. These are the most prevalent form of deepfakes, but other forms are also causing noticeable and rapidly growing harms, most obviously political deepfakes—as the noble Baroness, Lady Jones, will illustrate—and deepfakes used for fraud. It also appears to cover only the endpoint of the creation of deepfakes, not the supply chain leading up to that point. There are whole apps and companies dedicated to the creation of deepfakes, and they should not exist. There are industries which provide legitimate services—generative AI and cloud computing—but which fail to take adequate measures and end up enabling the creation of deepfakes. They should take measures or face legal accountability.

The Government’s new measures are intended to be introduced through an amendment to the Criminal Justice Bill, which is, I believe, currently between Committee and Report in the House of Commons. As I understand it, however, there is no date scheduled yet for Report, as the Bill seems to be caught in a battle over amendments.

The law will, however, be extremely difficult to enforce. Perpetrators are able to hide behind anonymity and are often difficult to identify, even when victims or authorities are aware that deepfakes have been created. The only reliable and effective countermeasure is to hold the whole supply chain responsible for deepfake creation and proliferation. All parties involved in the AI supply chain, from AI model developers and providers to cloud compute providers, must demonstrate that they have taken steps to preclude the creation of deepfakes. This approach is analogous to the way that society combats—or, rather, the way that I hope the Minister will concede to the noble Baroness, Lady Kidron, society will combat—child abuse material and malware.

Lord Leong (Lab)

My Lords, I speak to Amendments 293 and 294 from the noble Lord, Lord Clement-Jones, Amendment 295 proposed by my noble friend Lady Jones and Amendments 295A to 295F, also in the name of the noble Lord, Lord Clement-Jones.

Those noble Lords who are avid followers of my social media feeds will know that I am an advocate of technology. Advanced computing power and artificial intelligence offer enormous opportunities, which are not all that bad. However, the intentions of those who use them can be malign or criminal, and the speed of technological developments is outpacing legislators around the world. We are constantly in danger of creating laws that close the stable door long after the virtual horse has bolted.

The remarkable progress of visual and audio technology has its roots in the entertainment industry. It has been used to complete or reshoot scenes in films in the event of actors being unavailable, or in some cases, when actors died before filming was completed. It has also enabled filmmakers to introduce characters, or younger versions of iconic heroes for sequels or prequels in movie franchises. This enabled us to see a resurrected Sir Alec Guinness and a younger version of Luke Skywalker, or a de-aged Indiana Jones, on our screens.

The technology that can do this is only around 15 years old, and until about five years ago it required extremely powerful computers, expensive resources and advanced technical expertise. The first malicious use of deepfakes occurred when famous actors and celebrities, mainly women, had their faces superimposed on to the bodies of participants in pornographic videos. These were then marketed online as Hollywood stars’ sex tapes or similar, making money for the producers while causing enormous distress to the women targeted. More powerful computer processors inevitably mean that what was once very expensive rapidly becomes much cheaper. An additional factor has turbo-boosted this issue: generative AI. Computers can now learn to create images, sound and video movement almost independently of software specialists. It is no longer just famous women who are the targets of sexually explicit deepfakes; it could be anyone.

Amendment 293 directly addresses this horrendous practice, and I hope that there will be widespread support for it. In an increasingly digital world, we spend more time in front of our screens, getting information and entertainment on our phones, laptops, iPads and smart TVs. What was once an expensive technology, used to titillate, to entertain or for comedic purposes, has developed an altogether darker presence, well beyond the reach of most legislation.

In addition to explicit sexual images, deepfakes are known to have been used to embarrass individuals, misrepresent public figures, enable fraud, manipulate public opinion and influence democratic political elections and referendums. This damages people individually: those whose images or voices are faked, and those who are taken in by the deepfakes. Trusted public figures, celebrities or spokespeople face reputational and financial damage when their voices or images are used to endorse fake products or for harvesting data. Those who are encouraged to click through are at risk of losing money to fraudsters, being targeted for scams, or having their personal and financial data leaked or sold on. There is growing evidence that information used under false pretences can be used for profiling in co-ordinated misinformation campaigns, for darker financial purposes or for political exploitation.

In passing, it is worth remembering that deepfakes are not always images of people. Last year, a crudely generated fake image of an explosion, purportedly at the Pentagon, caused the Dow Jones industrial average to drop 85 points within four minutes of the image being published, and triggered emergency response procedures from local law enforcement before it was debunked 20 minutes later. The power of a single image, carefully placed and virally spreading, shows the enormous and rapid economic damage that deepfakes can create.

Amendment 294 would make it an offence for a person to generate a deepfake for the purpose of committing fraud, and Amendment 295 would make it an offence to create deepfakes of political figures, particularly when they risk undermining electoral integrity. We support all the additional provisions in this group of amendments; Amendments 295A to 295F outline the requirements, duties and definitions necessary to ensure that those creating deepfakes can be prosecuted.

I bring to your Lordships’ attention the wording of Amendment 295, which, as well as making it an offence to create a deepfake, goes a little further. It also makes it an offence to send a communication which has been created by artificial intelligence and which is intended to create the impression that a political figure has said or done something that is not based in fact. This touches on what I believe to be a much more alarming aspect of deepfakes: the manner in which false information is distributed.

--- Later in debate ---
Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I thank the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Jones of Whitchurch, for tabling the amendments in this important group. I very much share the concerns about all the uses of deepfake images that are highlighted by these amendments. I will speak more briefly than I otherwise would with a view to trying to—

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, I would be very happy to get a letter from the Minister.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I would be happy to write one. I will go for the abbreviated version of my speech.

I turn first to the part of the amendment that would seek to criminalise the creation, alteration or other generation of deepfake images depicting a person engaged in an intimate act. The Government recognise that there is significant public concern about the simple creation of sexually explicit deepfake images, and this is why they have announced their intention to table an amendment to the Criminal Justice Bill, currently in the other place, to criminalise the creation of purported sexual images of adults without consent.

The noble Lord’s Amendment 294 would create an offence explicitly targeting the creation or alteration of deepfake content when a person knows or suspects that the deepfake will be or is likely to be used to commit fraud. It is already an offence under Section 7 of the Fraud Act 2006 to generate software or deepfakes known to be designed for or intended to be used in the commission of fraud, and the Online Safety Act lists fraud as a priority offence and as a relevant offence for the duties on major services to remove paid-for fraudulent advertising.

Amendment 295 in the name of the noble Baroness, Lady Jones of Whitchurch, seeks to create an offence of creating or sharing political deepfakes. The Government recognise the threats to democracy that harmful actors pose. At the same time, the UK also wants to ensure that we safeguard the ability for robust debate and protect freedom of expression. It is crucial that we get that balance right.

Let me first reassure noble Lords that the UK already has criminal offences that protect our democratic processes, such as the National Security Act 2023 and the false communications offence introduced in the Online Safety Act 2023. It is also already an election offence to make false statements of fact about the personal character or conduct of a candidate or about the withdrawal of a candidate before or during an election. These offences have appropriate tests to ensure that we protect the integrity of democratic processes while also ensuring that we do not impede the ability for robust political debate.

I assure noble Lords that we continue to work across government to ensure that we are ready to respond to the risks to democracy from deepfakes. The Defending Democracy Taskforce, which seeks to protect the democratic integrity of the UK, is engaging across government and with Parliament, the UK’s intelligence community, the devolved Administrations, local authorities and others on the full range of threats facing our democratic institutions. We also continue to meet regularly with social media companies to ensure that they continue to take action to protect users from election interference.

Turning to Amendments 295A to 295F, I thank the noble Lord, Lord Clement-Jones, for them. Taken together, they would in effect establish a new regulatory regime in relation to the creation and dissemination of deepfakes. The Government recognise the concerns raised around harmful deepfakes and have already taken action against illegal content online. We absolutely recognise the intention behind these amendments but they pose significant risks, including to freedom of expression; I will write to noble Lords about those in order to make my arguments in more detail.

For the reasons I have set out, I am not able to accept these amendments. I hope that the noble Lord will therefore withdraw his amendment.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, I thank the Minister for that rather breathless response and his consideration. I look forward to his letter. We have arguments about regulation in the AI field; this is, if you like, a subset of that—but a rather important subset. My underlying theme is “must try harder”. I thank the noble Lord, Lord Leong, for his support and pay tribute to Control AI, which is vigorously campaigning on this subject in terms of the supply chain for the creation of these deepfakes.

Pending the Minister’s letter, which I look forward to, I beg leave to withdraw my amendment.

Amendment 293 withdrawn.
--- Later in debate ---
Moved by
295G: After Clause 149, insert the following new Clause—
“Data risks from systemic competitors and hostile actors
(1) The Secretary of State, in consultation with the Information Commissioner, must conduct a risk assessment on the data privacy risks associated with genomics and DNA companies that are headquartered in countries they determine to be systemic competitors and hostile actors.
(2) Within 12 months of the passage of this Act, the Secretary of State must present this risk assessment report to Parliament and consult the intelligence and security agencies on the findings, taking into account the need to not make public information critical to national defence or ongoing operations.
(3) This risk assessment must evaluate—
(a) the potential for genomic and DNA data to be exfiltrated outside of the UK,
(b) the degree of access granted to foreign entities, particularly those linked to systemic competitors and hostile actors, to the genomic and DNA data collected within the UK,
(c) the potential misuse of genomic and DNA data for dual-use or other nefarious purposes,
(d) the implications for UK national security and strategic advantage,
(e) the risks to the privacy and rights of UK citizens, and
(f) the potential for such data to be used in a manner that could compromise the privacy or security of UK citizens or the national interest.
(4) The risk assessment must include, but is not limited to—
(a) an analysis of the data handling and storage practices of genomics companies that are based in countries designated as systemic competitors and hostile actors,
(b) an independent audit at any company site that could have access to UK genomics data, and
(c) evidence of clear disclosure statements to consumers of products and services from genomics companies subject to data handling and disclosure requirements in the countries they are headquartered.
(5) This risk assessment must be conducted as frequently as deemed necessary by the Secretary of State or the Information Commissioner to address evolving threats and ensure continued protection of the genomics sector from malign entities controlled, directly or indirectly, by countries designated as systemic competitors and hostile actors.
(6) The Secretary of State has the authority to issue directives or guidelines based on the findings of the risk assessment to ensure compliance by companies or personnel operating within the genomics sector in the UK, safeguarding against identified risks and vulnerabilities to data privacy.”
Member’s explanatory statement
This amendment seeks to ensure sufficient scrutiny of emerging national security and data privacy risks related to advanced technology and areas of strategic interest for systemic competitors and hostile actors. It aims to inform the development of regulations or guidelines necessary to mitigate risks and protect the data privacy of UK citizens’ genomics data and the national interest. It seeks to ensure security experts can scrutinise malign entities and guide researchers, consumers, businesses, and public bodies.
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, what a relief—we are at the final furlong.

The UK is a world leader in genomics, which is becoming an industry of strategic importance for future healthcare and prosperity, but, frankly, it must do more to protect the genomics sector from systemic competitors that wish to dominate this industry for either economic advantage or nefarious purposes. Genomic sequencing—the process of determining the entirety of an organism’s DNA—is playing an increasing role in our NHS, which has committed to being the first national healthcare system to offer whole-genome sequencing as part of routine care. However, like other advanced technologies, our sector is exposed to data privacy and national security risks. Its dual-use potential means that it can also be used to create targeted bioweapons or genetically enhanced military personnel. We must ensure that a suitable data protection environment exists to maintain the UK’s world-leading status.

So, how are we currently mitigating such threats and why is our existing approach so flawed? Although I welcome initiatives such as the Trusted Research campaign and the Research Collaboration Advice Team, these bodies focus specifically on research and academia. We expect foreign companies that hold sensitive genomics and DNA data to follow GDPR. I am not a hawk about relations with other countries, but we need to provide the new Information Commissioner with much greater expertise and powers to tackle complex data security threats in sensitive industries. There must be no trade-off between scientific collaboration and data privacy; that is what this amendment is designed to prevent. I beg to move.

Lord Bassam of Brighton Portrait Lord Bassam of Brighton (Lab)
- Hansard - - - Excerpts

The Committee will be relieved to know that I will be brief. I do not have much to say because, in general terms, this seems an eminently sensible amendment.

We should congratulate the noble Lord, Lord Clement-Jones, on his drafting ingenuity. He has managed to compose an amendment that brings together the need for scrutiny of emerging national security and data privacy risks relating to advanced technology, aims to inform regulatory developments and guidance that might be required to mitigate risks, and would protect the privacy of people’s genomics data. It also picks up along the way the issue of the security services scrutinising malign entities and guiding researchers, businesses, consumers and public bodies. Bringing all those things together at the end of a long and rather messy Bill is quite a feat—congratulations to the noble Lord.

I am rather hoping that the Minister will tell the Committee either that the Government will accept this wisely crafted amendment or that everything it contains is already covered. If the latter is the case, can he point noble Lords to where those things are covered in the Bill? Can he also reassure the Committee that the safety and security issues raised by the noble Lord, Lord Clement-Jones, are covered? Having said all that, we support the general direction of travel that the amendment takes.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I will be very brief as well.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

I would be extremely happy for the Minister to write.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

Nothing makes me happier than the noble Lord’s happiness. I thank him for his amendment and the noble Lord, Lord Bassam, for his points; I will write to them on those, given the Committee’s desire for brevity and the desire to complete this stage tonight.

I wish to say some final words overall. I sincerely thank the Committee for its vigorous—I think that is the right word—scrutiny of this Bill. We have not necessarily agreed on a great deal, but I am in awe of the level of scrutiny and the commitment to making the Bill as good as possible. Let us be absolutely honest—this is not the most entertaining subject, but it is something that we all take extremely seriously and I pay tribute to the Committee for its work. I also extend sincere thanks to the clerks and our Hansard colleagues for agreeing to stay a little later than agreed, although that may not even be necessary. I very much look forward to engaging with noble Lords again before and during Report.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, I thank the Minister, the noble Baroness, Lady Jones, and all the team. I also thank the noble Lord, Lord Harlech, whose first name we now know; these things are always useful to know. This has been quite a marathon. I hope that we will have many conversations between now and Report. I also hope that Report is not too early as there is a lot to sort out. The noble Baroness, Lady Jones, and I will be putting together our priority list imminently but, in the meantime, I beg leave to withdraw my amendment.

Amendment 295G withdrawn.

Data Protection and Digital Information Bill

Lord Clement-Jones Excerpts
Debate on whether Clause 44 should stand part of the Bill.
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, I start today with probably the most innocuous of the amendments, which is that Clause 44 should not stand part. Others are more significant, but its purpose, if one can describe it as such, is as a probing clause stand part, to see whether the Minister can explain the real motive and impact of new Section 164A, which is inserted by Clause 44. As the explanatory statement says, it appears to hinder

“data subjects’ right to lodge complaints, and extends the scope of orders under Section 166 of the Data Protection Act to the appropriateness of the Commissioner’s response to a complaint.”

I am looking to the Minister to see whether he can unpack the reasons for that and what the impact is on data subjects’ rights.

More fundamental is Amendment 153, which relates to Clause 45. This provision inserts new Section 165A into the Data Protection Act, according to which the commissioner would have the discretion to refuse to act on a complaint if the complainant did not try to resolve the infringement of their rights with the relevant organisation and at least 45 days have passed since then. The right to an effective remedy constitutes a core element of data protection—most individuals will not pursue cases before a court, because of the lengthy, time-consuming and costly nature of judicial proceedings—and acts as a deterrent against data protection violations, in so far as victims can obtain meaningful redress. Administrative remedies are particularly useful, because they focus on addressing malpractice and obtaining meaningful changes in how personal data is handled in practice.

However, the ICO indicates that in 2021-22 it did not serve a single GDPR enforcement notice, secured no criminal convictions and issued only four GDPR fines, totalling just £633,000, despite the fact that it received over 40,000 data subject complaints. Moreover, avenues to challenge ICO inaction are extremely limited. Scrutiny by the information tribunal has been restricted to purely procedural, as opposed to substantive, matters. It was narrowed even further by the Administrative Court decision, which found that the ICO was not obliged to investigate each and every complaint.

Amendment 153 would remove Clause 45. The ICO already enjoys a wide margin of discretion and little accountability for how it handles complaints. In light of its poor performance, it does not seem appropriate to expand the discretion of the new information commission even further. It would also extend the scope of orders under Section 166 of the Data Protection Act to the appropriateness of the commissioner’s response to a complaint. This would allow individuals to promote judicial scrutiny over decisions that have a fundamental impact on how laws are enforced in practice and it would increase the overall accountability of the new information commission.

We have signed Amendment 154, in the name of the noble Baroness, Lady Jones, and I look forward to hearing what she says on that. I apologise for the late tabling of Amendments 154A to 154F, which are all related to Amendments 155 and 175. Clause 47 sets out changes in procedure in the courts, in relation to the right of information of a data subject under the 2018 Act, but there are other issues that need resolving around the jurisdiction of the courts and the Upper Tribunal in data protection cases. That is the reason for tabling these amendments.

The High Court’s judgment in the Delo v ICO case held that part of the reasoning in Killock and Veale about the relative jurisdiction of the courts and tribunals was wrong. The Court of Appeal’s decision in the Delo case underlines concerns, but does not properly address the jurisdictional limits in Sections 166 and 167 of the 2018 Act, regarding the distinction between determining procedural failings and the merits of decisions by the ICO. Surely jurisdiction under these sections should be in either the courts or the tribunals, not both. In the view of many, including me, it should be in the tribunals. That is what these amendments seek.

It is clear from these two judgments that there was disagreement on the extent of the jurisdiction of tribunals and courts, notably between Mrs Justice Farbey and Mr Justice Mostyn. The commissioner submitted very different submissions to the Upper Tribunal, the High Court and the Court of Appeal, in relation to the extent and limits of Sections 166 and 167. It is not at all clear what Parliament’s intentions were, when passing the 2018 Act, on the extents and limits of the powers in these sections and whether the appropriate source of redress is a court or tribunal.

This has resulted in jurisdictional confusion. A large number of claims have been brought in either the courts or the tribunals, under either Section 166 or Section 167, and the respective court or tribunal has frequently ruled that the claim should have been made under the other section and it therefore does not have jurisdiction, so that the claim is struck out. The Bill offers a prime opportunity to resolve this issue.

Clause 45(5), which creates new Section 166A, would only blur the lines even more and fortify the reasoning for the claim to be put into the tribunals, rather than the courts. These amendments would give certainty to the courts and tribunals as to their powers and would be much less confusing for litigants in person, most of whom do not have the luxury of paying hundreds of thousands in court fees. This itself is another reason for this to remain in the tribunals, which do not charge fees to issue proceedings.

The proposed new clause inserted by Amendment 287 would require the Secretary of State to exercise powers under Section 190 of the 2018 Act to allow public interest organisations to raise data protection complaints on behalf of individuals generally, without the need to obtain the authorisation of each individual being represented. It would therefore implement Article 80(2) of the GDPR, which provides:

“Member States may provide that any body, organisation or association referred to in paragraph 1 of this Article, independently of a data subject’s mandate, has the right to lodge, in that Member State, a complaint with the supervisory authority which is competent pursuant to Article 77 and to exercise the rights referred to in Articles 78 and 79 if it considers that the rights of a data subject under this Regulation have been infringed as a result of the processing”.


The intention behind Article 80(2) is to allow appropriately constituted organisations to bring proceedings concerning infringements of the data protection regulations in the absence of the data subject. That is to ensure that proceedings may be brought in response to an infringement, rather than on the specific facts of an individual’s case. As a result, data subjects are, in theory, offered greater and more effective protection of their rights. Actions under Article 80(2) could address systemic infringements that arise by design, rather than requiring an individual to evidence the breaches and the specific effects to them.

At present, an affected individual—a data subject—is always required to bring a claim or complaint to a supervisory authority. Whether through direct action or under Section 187 of the 2018 Act, a data subject will have to be named and engaged. In practice, a data subject is not always identifiable or willing to bring action to address even the most egregious conduct.

Article 80(2) would fill a gap that Article 80(1) and Section 187 of the Data Protection Act are not intended to fill. Individuals can be unwilling to seek justice, exercise their rights and lodge data protection complaints on their own, either for fear of retaliation from a powerful organisation or because of the stigma that may be associated with the matter where a data protection violation occurred. Even a motivated data subject may be unwilling to take action due to the risks involved. For instance, it would be reasonable for that data subject not to want to become involved in a lengthy, costly legal process that may be disproportionate to the loss suffered or remedy available. This is particularly pressing where the infringement raises systemic concerns, rather than where an individual has suffered material or non-material damage as a result of the infringement.

Civil society organisations have long helped complainants navigate justice systems in seeking remedies in the data protection area, providing a valuable addition to the enactment of UK data protection laws. My Amendment 287 would allow public interest organisations to lodge representative complaints, even without the mandate of data subjects, to encourage the filing of well-argued, strategically important cases with the potential to improve significantly the data subject landscape as a whole. This Bill is the ideal opportunity for the Government to implement fully Article 80(2) of the GDPR from international law and plug a significant gap in the protection of UK citizens’ privacy.

In effect, this is unfinished business from our debates on the 2018 Act, when we made several attempts to persuade the Government of the merits of introducing the rights under Article 80(2). I hope that the Government will think again. These are extremely important rights and are available in many other countries governed by a similar GDPR. I beg to move.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

My Lords, as a veteran of the 2018 arguments on Article 80(2), I rise in support of Amendment 287, which would see its implementation.

Understanding and exercising personal data rights is not straightforward. Even when the rights are being infringed, it is rare that an individual data subject has the time, knowledge or ability to make a complaint to the ICO. This is particularly true for vulnerable groups, including children and the elderly, disadvantaged groups and other groups of people, such as domestic abuse survivors or members of the LGBTQ community, who may have specific reasons for not identifying themselves in relation to a complaint. It is a principle in law that a right that cannot be activated is not fully given.

A data subject’s ability to claim protection is constrained by a range of factors, none of which relates to the validity of their complaint or the level of harm experienced. Rather, the vast majority are prevented from making a complaint by a lack of expertise, capacity, time and money; by the fact that they are not aware that they have data rights; or by the fact that they understand neither that their rights have been infringed nor how to make a complaint about them.

I have considerable experience of this. I remind the Committee that I am chair of the 5Rights Foundation, which has raised important and systemic issues of non-compliance with the AADC. It has done this primarily by raising concerns with the ICO, which has then undertaken around 40 investigations based on detailed submissions. However, because the information is not part of a formalised process, the ICO has no obligation to respond to the 5Rights Foundation team, the three-month time limit for complaints does not apply and, even though forensic work by the 5Rights Foundation identified the problem, its team is not consulted or updated on progress or the outcome—all of which would be possible had it submitted the information as a formal complaint. I remind the Committee that in these cases we are talking about complaints involving children.

--- Later in debate ---
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, just so that the Minister might get a little note, I will ask a question. He has explained what is possible—what can be done—but not why the Government still resist putting Article 80(2) into effect. What is the reason for not adopting that article?

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

The reason was that an extensive consultation was undertaken in 2021 by the Government, and the Government concluded at that time that there was insufficient evidence to take what would necessarily be a complex step. That was largely on the grounds that class actions of this type can go forward either as long as they have the consent of any named individuals in the class action or on behalf of a group of individuals who are unnamed and not specifically raised by name within the investigation itself.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

Perhaps the Minister could in due course say what evidence would help to persuade the Government to adopt the article.

Lord Bassam of Brighton Portrait Lord Bassam of Brighton (Lab)
- Hansard - - - Excerpts

I want to help the Minister. Perhaps he could give us some more detail on the nature of that consultation and the number of responses and what people said in it. It strikes me as rather important.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

Fair enough. Maybe for the time being, it will satisfy the Committee if I share a copy of that consultation and what evidence was considered, if that would work.

I will turn now to Amendments 154A to 155 and Amendment 175, which propose sweeping modifications to the jurisdiction of the court and tribunal for proceedings under the Data Protection Act 2018. These amendments would have the effect of making the First-tier Tribunal and Upper Tribunal responsible for all data protection cases, transferring both ongoing and future cases out of the court system and to the relevant tribunals.

The Government of course want to ensure that proceedings for enforcement of data protection rules, including redress routes available to data subjects, are appropriate for the nature of the complaint. As the Committee will be well aware, at present there is a mixture of jurisdiction for tribunals and courts under data protection legislation, depending on the precise nature of the proceedings in question. Tribunals are indeed the appropriate venue for some data protection proceedings, and the legislation already recognises that—for example, for application by data subjects for an order requiring the ICO to progress their complaint. However, courts are generally the more appropriate venue for cases involving claims for compensation and successful parties can usually recover their costs. Courts also apply stricter rules of procedure and evidence than tribunals. That is because some cases are appropriate to fall under the jurisdiction of the tribunal, while others are more appropriate for court jurisdiction. For example, claims by individuals against organisations for breaches of legal requirements can result in awards of compensatory damages for the individuals and financial and reputational damage for the organisations. It is appropriate that such cases are handled by a court in accordance with its strict procedural and evidential rules, where the data subject may recover their costs if successful.

As such, the Government are confident that the current system is balanced and proportionate and provides clear and effective administrative and judicial redress routes for data subjects seeking to exercise their rights.

Lord Clement-Jones (LD)

My Lords, is the Minister saying that there is absolutely no confusion between the jurisdiction of the tribunals and the courts? That is, no court has come to a different conclusion about jurisdiction—for example, as to whether procedural matters are for tribunals and merits are for courts or vice versa. Is he saying that everything is hunky-dory and clear and that we do not need to concern ourselves with this crossover of jurisdiction?

Viscount Camrose (Con)

No, as I was about to say, we need to take these issues seriously. The noble Lord raised a number of specific cases. I was unfamiliar with them at the start of the debate—

Viscount Camrose (Con)

I will go away and look at those; I look forward to learning more about them. There are obvious implications in what the noble Lord said as to the most effective ways of distributing cases between courts and other channels.

For these reasons, I hope that the noble Lord will withdraw his amendment.

--- Later in debate ---
Viscount Camrose (Con)

I would be very happy to participate in that discussion, absolutely.

Lord Clement-Jones (LD)

My Lords, I thank the Minister for his response. I have surprised myself: I have taken something positive away from the Bill.

The noble Baroness, Lady Jones, was quite right to be more positive about Clause 44 than I was. The Minister unpacked its relationship with Clause 45 well and satisfactorily. Obviously, we will read Hansard before we jump to too positive a conclusion.

On Article 80(2), I am grateful to the Minister for agreeing both to go back to the consultation and to look at the kinds of evidence that were brought forward, because this is a really important aspect for many civil society organisations. He underestimates the difficulties faced when bringing complaints of this nature. I would very much like this conversation to go forward because this issue has been quite a bone of contention; the noble Baroness, Lady Kidron, remembers that only too well. We may even have had ping-pong on the matter back in 2017. There is an appetite to keep on the case so, the more we can discuss this matter—between Committee and Report in particular—the better, because there is quite a head of steam behind it.

As far as the jurisdiction point is concerned, I think this may be the first time I have heard a Minister talk about the Sorting Hat. I was impressed: I have often compared this place to Hogwarts, but the concept of using the Sorting Hat to decide whether a case goes to a tribunal or a court is a wonderful one. You would probably need artificial intelligence to do that kind of thing nowadays, and that in itself is a bit of an issue. These may be elaborate amendments but, as the noble Lord, Lord Bassam, said, the case being made here is that there may be confusion, and a lack of clarity, about where jurisdiction lies. It is really important that we determine whether the courts and tribunals themselves understand this and, perhaps more appropriately, whether they have differing views about it.

We need to get to grips with this; the more the Minister can dig into it, and into Delo, Killock and so on, the better. We are all in the foothills here but I am certainly not going to try to unpack those two judgments and the differences between Mrs Justice Farbey and Mr Justice Mostyn, which are well beyond my competency. I thank the Minister.

Clause 44 agreed.
--- Later in debate ---
Viscount Camrose (Con)

My Lords, the UK has rightly moved away from the EU concept of supremacy, under which retained EU law would always take precedence over domestic law when they were in conflict. That is clearly unacceptable now that we have left the EU. However, we understand that the effective functioning of our data protection legislation is of critical importance and it is appropriate for us to specify the appropriate relationship between UK and EU-derived pieces of legislation following implementation of the Retained EU Law (Revocation and Reform) Act, or REUL. That is why I am introducing a number of specific government amendments to ensure that the hierarchy of legislation works in the data protection context. These are Amendments 156 to 164 and 297.

Noble Lords may be aware that Clause 49 originally sought to clarify the relationship between the UK’s data protection legislation, specifically the UK GDPR and EU-derived aspects of the Data Protection Act 2018, and future data processing provisions in other legislation, such as powers to share or duties to disclose personal data, as a result of some legal uncertainty created by the European Union (Withdrawal) Act 2018. To resolve this uncertainty, Clause 49 makes it clear that all new data processing provisions in legislation should be read consistently with the key requirements of the UK data protection legislation unless it is expressly indicated otherwise. Since its introduction, the interpretation of pre-EU exit legislation has been altered and there is a risk that this would produce the wrong effect in respect of the interpretation of existing data processing provisions that are silent about their relationship with the data protection legislation.

Amendment 159 will make it clear that the full removal of the principle of EU law supremacy and the creation of a reverse hierarchy in relation to assimilated direct legislation, as provided for in the REUL Act, do not change the relationship between the UK data protection legislation and existing legislation that is in force prior to commencement of Clause 49(2). Amendment 163 makes a technical amendment to the EU withdrawal Act, as amended, to support this amendment.

Amendment 162 is similar to the previous amendment but it concerns the relationship between provisions relating to certain obligations and rights under data protection legislation and on restrictions and prohibitions on the disclosure of information under other existing legislation. Existing Section 186 of the Data Protection Act 2018 governs this relationship. Amendment 162 makes it clear that the relationship between these two types of provision is not affected by the changes to the interpretation of legislation that I have already referred to made by the REUL Act. Additionally, it clarifies that, in relation to pre-commencement legislation, Section 186(1) may be disapplied expressly or impliedly.

Amendment 164 relates to the changes brought about by the REUL Act and sets out that the provisions detailed in earlier Amendments 159, 162 and 163 are to be treated as having come into force on 1 January 2024—in other words, at the same time as commencement of the relevant provisions of the REUL Act.

Amendment 297 provides a limited power to remove provisions that achieve the same effect as new Section 183A from legislation made or passed after this Bill receives Royal Assent, as their presence could cause confusion.

Finally, Amendments 156 and 157 are consequential. Amendments 158, 160 and 161 are minor drafting changes made for consistency, updating and consequential purposes.

Turning to the amendments introduced by the noble Lord, Lord Clement-Jones, I hope that he can see from the government amendments to Clause 49 that we have given a good deal of thought to the impact of the REUL Act 2023 on the UK’s data protection framework and have been prepared to take action on this where necessary. We have also considered whether some of the changes made by the REUL Act could cause confusion about how the UK GDPR and the Data Protection Act 2018 interrelate. Following careful analysis, we have concluded that they would largely continue to be read alongside each other in the intended way, with the rules of the REUL Act unlikely to interfere with this. Any new general rule such as that suggested by the noble Lord could create confusion and uncertainty.

Amendments 168 to 170, 174, 174A and 174B seek to reverse changes introduced by the REUL Act at the end of 2023, specifically the removal of EU general principles from the statute book. EU general principles and certain EU-derived rights had originally been retained by the European Union (Withdrawal) Act to ensure legal continuity at the end of the transition period, but this was constitutionally novel and inappropriate for the long term.

The Government’s position is that EU law concepts should not be used to interpret domestic legislation in perpetuity. The REUL Act provided a solution to this by repealing EU general principles from UK law and clarifying the approach to be taken domestically. The amendments tabled by the noble Lord, Lord Clement-Jones, would undo this important work by reintroducing to the statute book references to rights and principles which have not been clearly defined and are inappropriate now that we have left the EU.

The protection of personal data already forms part of the protection offered by the European Convention on Human Rights, under the Article 8 right to respect for private and family life, and is further protected by our data protection legislation. The UK GDPR and the Data Protection Act 2018 provide a comprehensive set of rules for organisations to follow and rights for people in relation to the use of their data. Seeking to apply an additional EU right to data protection in UK law would not significantly affect the way the data protection framework functions or enhance the protections it affords to individuals. Indeed, doing so may well add unnecessary uncertainty and complexity.

Amendments 171 to 173 pertain to exemptions to specified data subject rights and obligations on data controllers set out in Schedules 2 to 4 to the DPA 2018. The 36 exemptions apply only in specified circumstances and are subject to various safeguards. Before addressing the amendments the noble Lord has tabled, it is perhaps helpful to set out how these exemptions are used. Personal data must be processed according to the requirements set out in the UK GDPR and the DPA 2018. This includes the key principles of lawfulness, fairness and transparency, data minimisation and purpose limitation, among others. The decision to restrict data subjects’ rights, such as the right to be notified that their personal data is being processed, or limit obligations on the data controller, comes into effect only if and when the decision to apply an exemption is taken. In all cases, the use of the exemption must be both necessary and proportionate.

One of these exemptions, the immigration exemption, was recently amended in line with a court ruling that found it was incompatible with the requirements set out in Article 23. This exemption is used by the Home Office. The purpose of Amendments 171 to 173 is to extend the protections applied to the immigration exemption across the other exemptions subject to Article 23, apart from in Schedule 4, where the requirement to consider whether its application prejudices the relevant purposes is not considered relevant.

The other exemptions are each used in very different circumstances, by different data controllers—from government departments to SMEs—and work by applying different tests that function in a wholly different manner from the immigration exemption. This is important to bear in mind when considering these broad-brush amendments. A one-size-fits-all approach would not work across the exemption regime.

It is the Government’s position that any changes to these important exemptions should be made only after due consideration of the circumstances of that particular exemption. In many cases, these amendments seek to make changes that run counter to how the exemption functions. Making changes across the exemptions via this Bill, as the noble Lord’s amendments propose, has the potential to have significant negative impacts on the functioning of the exemptions regime. Any potential amendments to the other exemptions would require careful consideration. The Government note that there is a power to make changes to the exemptions in the DPA 2018, if deemed necessary.

For the reasons I have given, I look forward to hearing more from the noble Lord on his amendments, but I hope that he will not press them. I beg to move.

Lord Clement-Jones (LD)

My Lords, I thank the Minister for that very careful exposition. I feel that we are heavily into wet towel, if not painkiller, territory here, because this is a tricky area. As the Minister might imagine, I will not respond to his exposition in detail, at this point; I need to run away and get some external advice on the impact of what he said. He is really suggesting that the Government prefer a pick ‘n’ mix approach to what he regards as a one size fits all. I can boil it down to that. He is saying that you cannot just apply the rules, in the sense that we are trying to reverse some of the impacts of the previous legislation. I will set out my stall; no doubt the Minister and I, the Box and others, will read Hansard and draw our own conclusions at the end, because this is a complicated area.

Until the end of 2023, the Data Protection Act 2018 had to be read compatibly with the UK GDPR. In a conflict between the two instruments, the provisions of the UK GDPR would prevail. The reversing of the relationship between the 2018 Act and the UK GDPR, through the operation of the Retained EU Law (Revocation and Reform) Act—REUL, as the Minister described it—has had the effect of lowering data protection rights in the UK. The case of the Open Rights Group and the3million v the Secretary of State for the Home Office and the Secretary of State for Digital, Culture, Media and Sport was decided after the UK had left the EU, but before the end of 2023. The Court of Appeal held that exemptions from data subject rights in an immigration context, as set out in the Data Protection Act, were overly broad, contained insufficient safeguards and were incompatible with the UK GDPR. The court disapplied the exemptions and ordered the Home Office to redraft them to include the required safeguards. We debated the regulations the other day, and many noble Lords welcomed them on the basis that they had been revised for the second time.

This sort of challenge is now not possible, because the relationship between the DPA and the UK GDPR has been turned on its head. If the case were brought now, the overly broad exemptions in the DPA would take precedence over the requirement for safeguards set out in the UK GDPR. These points were raised by me in the debate of 12 December, when the Data Protection (Fundamental Rights and Freedoms) (Amendment) Regulations 2023 were under consideration. In that debate, the noble Baroness, Lady Swinburne, stated that

“we acknowledge the importance of making sure that data processing provisions in wider legislation continue to be read consistently with the data protection principles in the UK GDPR … Replication of the effect of UK GDPR supremacy is a significant decision, and we consider that the use of primary legislation is the more appropriate way to achieve these effects, such as under Clause 49 where the Government consider it appropriate”.—[Official Report, 12/12/23; col. GC 203.]

This debate on Clause 49 therefore offers an opportunity to reinstate the previous relationship between the UK GDPR and the Data Protection Act. The amendment restores the hierarchy, so that it guarantees the same rights to individuals as existed before the end of 2023, and avoids unforeseen consequences by resetting the relationship between the UK GDPR and the DPA 2018 to what the parliamentary draftsmen intended when the Act was written. The provisions in Clause 49, as currently drafted, address the relationship between domestic law and data protection legislation as a whole, but the relationship between the UK GDPR and the DPA is left in its “reversed” state. This is confirmed in the Explanatory Notes to the Bill at paragraph 503.

The purpose of these amendments is to restore data protection rights in the UK to what they were before the end of 2023, prior to the coming into force of REUL. The amendments would restore the fundamental right to the protection of personal data in UK law; ensure that the UK GDPR and the DPA continue to be interpreted in accordance with the fundamental right to the protection of personal data; ensure that there is certainty that assimilated case law that references the fundamental right to the protection of personal data still applies; and apply the protections required in Article 23 of the UK GDPR to all the relevant exemptions in Schedule 2 to the Data Protection Act. This is crucial in avoiding diminishing trust in our data protection frameworks. If people do not trust that their data is protected, they will refuse to share it. Without this data, new technologies cannot be developed, because these technologies rely on personal data. By creating uncertainty and diminishing standards, the Government are undermining the very growth in new technologies that they want.

--- Later in debate ---
Lord Vaux of Harrowden (CB)

My Lords, I have added my name to Amendment 195ZA—I will get to understand where these numbers come from, at some point—in the name of the noble Lord, Lord Kamall, who introduced it so eloquently. I will try to be brief in my support.

For many people, probably most, the use of online digital verification will be a real benefit. The Bill puts in place a framework to strengthen digital verification so, on the whole, I am supportive of what the Government are trying to do, although I think that the Minister should seriously consider the various amendments that the noble Baroness, Lady Jones of Whitchurch, has proposed to strengthen parliamentary scrutiny in this area.

However, not everyone will wish to use digital verification in all cases, perhaps because they are not sufficiently confident with technology or perhaps because they simply do not trust it. We have already heard the debates around the advances of AI and computer-based decision-making; digital identity verification could be seen as another extension of this. There is a concern that Part 2 of the Bill appears to push people ever further towards decisions being taken by a computer.

I suspect that many of us will have done battle with some of the existing identity verification systems. In my own case, I can think of one bank where I gave up in deep frustration as it insisted on telling me that I was not the same person as my driving licence showed. I have also come up against systems used by estate agents when trying to provide a guarantee for my student son that was so intrusive that I, again, refused to use it.

Therefore, improving verification services is to be encouraged, but there must be some element of choice: if someone does not have the know-how, confidence or trust to use the systems, they should be able to verify their identity through a non-digital alternative. They should not be barred from important services such as, in my examples, banking and renting a property because they cannot, or would prefer not to, use a digital verification service.

At the very least, even if the Minister is not minded to accept that amendment, I hope that he can make clear that the Government have no intention of making digital ID verification mandatory, as some have suggested Part 2 may be driving towards.

Lord Clement-Jones (LD)

My Lords, this is quite a disparate group of amendments. I support Amendment 195ZA, which I have signed. The noble Baroness, Lady Jones, and the noble Lords, Lord Kamall and Lord Vaux, have made clear the importance of having a provision such as this on the statute book. It is important that an individual can choose whether to use digital or non-digital means of verifying their identity, both for the liberty and equality of individuals and to cultivate trust in what are essentially growing digital identity systems. The use of the word "empower" in these circumstances is important: we need to empower people rather than push them into digital systems that they may not be able to access. A move towards digitalisation is therefore not a justification for compelling individuals to use systems that could compromise their privacy or rights more broadly. I very much support that amendment on that basis.

I also very much support the amendments of the noble Baroness, Lady Jones, which I have signed. The Delegated Powers and Regulatory Reform Committee could not have made its recommendations clearer. The Government are serial offenders in terms of skeleton Bills. We have known that from remarks made by the noble Lord, Lord Hodgson, on the Government Benches over a long period. I am going to be extremely interested in what the Government have to say. Quite often, to give them some credit, they listen to what the DPRRC has to say and I hope that on this occasion the Minister is going to give us some good news.

This is an extremely important new system being set up by the Government. We have been waiting for the enabling legislation for quite some time. It is pretty disappointing, after all the consultations that have taken place, just how skeletal it is. No underlying principles have been set out. There is a perfectly good set of principles set out by the independent Privacy and Consumer Advisory Group that advises the Government on how to provide a simple, trusted and secure means of accessing public services. But what assurance do we have that we are going to see those principles embedded in this new system?

Throughout, it is vital that the Secretary of State is obliged to uphold the kinds of concerns being raised in the development of this DVS trust framework to ensure that those services protect the people who use them. We need that kind of parliamentary debate and it has been made quite clear that we need nothing less than that. I therefore very much support what the noble Baroness, Lady Jones, had to say on that subject.

--- Later in debate ---
Viscount Camrose (Con)

I thank the noble Lord, Lord Clement-Jones, the noble Baroness, Lady Jones, and my noble friend Lord Kamall for their amendments. To address the elephant in the room first, I can reassure noble Lords that the use of digital identity will not be mandatory, and privacy will remain one of the guiding principles of the Government’s approach to digital identity. There are no plans to introduce a centralised, compulsory digital ID system for public services, and the Government’s position on physical ID cards remains unchanged. The Government are committed to realising the benefits of digital identity technologies without creating ID cards.

I shall speak now to Amendment 177, which would require the rules of the DVS trust framework to be set out in regulations subject to the affirmative resolution procedure. I recognise that this amendment, and others in this group, reflect recommendations from the DPRRC. Obviously, we take that committee very seriously, and we will respond to that report in due course, but ahead of Report.

Part 2 of the Bill will underpin the DVS trust framework, a document of auditable rules, which include technical standards. The trust framework refers to data protection legislation and ICO guidance. It has undergone four years of development, consultation and testing within the digital identity market. Organisations can choose to have their services certified against the trust framework to prove that they provide secure and trustworthy digital verification services. Certification is provided by independent conformity assessment bodies that have been accredited by the UK Accreditation Service. Annual reviews of the trust framework are subject to consultation with the ICO and other appropriate persons.

Requiring the trust framework to be set out in regulations would make it hard to introduce reactive changes. For example, if a new cybersecurity threat emerged which required the rapid deployment of a fix across the industry, the trust framework would need to be updated very quickly. Developments in this fast-growing industry require an agile approach to standards and rule-making. We cannot risk the document becoming outdated and losing credibility with industry. For these reasons, the Government feel that it is more appropriate for the Secretary of State to have the power to set the rules of the trust framework with appropriate consultation, rather than for the power to be exercised by regulations.

I turn to Amendments 178 to 195, which would require the fees that may be charged under this part of the Bill to be set out in regulations subject to the negative resolution procedure. The Government have committed to growing a market of secure and inclusive digital identities as an alternative to physical proofs of identity, for those who choose to use them. Fees will be introduced only once we are confident that doing so will not restrict the growth of this market, but the fee structure, when introduced, is likely to be complex and will need to flex to support growth in an evolving market.

There are built-in safeguards to this fee-charging power. First, there is a strong incentive for the Secretary of State to set fees that are competitive, fair and reasonable, because failing to do so would prevent the Government realising their commitment to grow this market. Secondly, these fee-raising powers have a well-defined purpose and limited scope. Thirdly, the Secretary of State will explain in advance what fees she intends to charge and when she intends to charge them, which will ensure the appropriate level of transparency.

The noble Baroness, Lady Jones, asked about the arrangements for the office for digital identities and attributes. It will not initially be independent, as it will be located within the Department for Science, Innovation and Technology. As we announced in the government response to our 2021 consultation, we intend for this to be an interim arrangement until a suitable long-term home for the governing body can be identified. Delegating the role of Ofdia—as I suppose we will call it—to a third party in the future is subject to parliamentary scrutiny, as provided for by the clauses in the Bill. Initially placing Ofdia inside government will ensure that its oversight role can mature in the most effective way and that it supports the digital identity market in meeting the needs of individual users, relying parties and industry.

Digital verification services are independently certified against the trust framework rules by conformity assessment bodies. Conformity assessment bodies are themselves independently accredited by the UK Accreditation Service to ensure that they have the competence and impartiality to perform certification. The trust framework certification scheme will be accredited by the UK Accreditation Service to give confidence that the scheme can be efficiently and competently used to certify products, processes and services. All schemes will need to meet internationally agreed standards set out by the UK Accreditation Service. Ofdia, as the owner of the main code, will work with UKAS to ensure that schemes are robust, capable of certification and operated in line with the trust framework.

Amendment 184A proposes to exclude certified public bodies from registering to provide digital verification services. The term “public bodies” could include a wide range of public sector entities, including institutions such as universities, that receive any public funding. The Government take the view that this exclusion would be unnecessarily restrictive in the UK’s nascent digital identity market.

Amendment 195ZA seeks to mandate organisations to implement a non-digital form of verification in every instance where a digital method is required. The Bill enables the use of secure and inclusive digital identities across the economy. It does not force businesses or individuals to use them, nor does it insist that businesses which currently accept non-digital methods of verification must transition to digital methods. As Clause 52 makes clear, digital verification services are services that are provided at the request of the individual. The purpose of the Bill is to ensure that, when people want to use a digital verification service, they know which of the available products and services they can trust.

Some organisations operate only in the digital sphere, such as online-only banks and energy companies. To oblige such organisations to offer manual document checking would place obligations on them that would go beyond the Government’s commitment to do only what is necessary to enable the digital identity market to grow. In so far as this amendment would apply to public authorities, the Equality Act requires those organisations to consider how their services will affect people with protected characteristics, including those who, for various reasons, might not be able or might choose not to use a digital identity product.

Lord Clement-Jones (LD)

Is the Minister saying that, as a result of the Equality Act, there is an absolute right to that analogue—if you like—form of identification if, for instance, someone does not have access to digital services?

Lord Kamall (Con)

I understand that some services are purely digital, but some of those may well not have digital ID. We do not know what future services there might be, so they might want to show an analogue ID. Is my noble friend saying that that will not be possible because it will impose too much of a burden on those innovative digital companies? Could he clarify what he said?

Viscount Camrose (Con)

On this point, the argument that the Government are making is that, where consumers want to use a digital verification service, all the Bill does is provide a mechanism for those DVSs to be certified and assured to be safe. It does not seek to require anything beyond that, other than creating a list of safe DVSs.

The Equality Act applies to the public sector space, where it needs to be followed to ensure that there is an absolute right to inclusive access to digital technologies.

Lord Clement-Jones (LD)

My Lords, in essence, the Minister is admitting that there is a gap when somebody who does not have access to digital services needs an identity to deal with the private sector. Is that right?

Lord Vaux of Harrowden (CB)

In the example I gave, I was not willing to use a digital system to provide a guarantee for my son’s accommodation in the private sector. I understand that that would not be protected and that, therefore, someone might not be able to rent a flat, for example, because they cannot provide physical ID.

Viscount Camrose (Con)

The Bill does not change the requirements in this sense. If any organisation chooses to provide its services on a digital basis only, that is up to that organisation, and it is up to consumers whether they choose to use it. It makes no changes to the requirements in that space.

I will now speak to the amendment that seeks to remove Clause 80. Clause 80 enables the Secretary of State to ask accredited conformity assessment bodies and registered DVS providers to provide information which is reasonably required to carry out her functions under Part 2 of the Bill. The Bill sets out a clear process that the Secretary of State must follow when requesting this information, as well as explicit safeguards for her use of the power. These safeguards will ensure that DVS providers and conformity assessment bodies have to provide only information necessary for the functioning of this part of the Bill.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, the clause stand part amendment was clearly probing. Does the Minister have anything to say about the relationship with OneLogin? Is he saying that it is only information about systems, not individuals, and that it does not feed into the OneLogin identity system that the Government are setting up?

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

It is very important that the OneLogin system is entirely separate and not considered a DVS. We considered whether it should be, but the view was that that comes close to mandating a digital identity system, which we absolutely want to avoid. Hence the two are treated entirely differently.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

That is a good reassurance, but if the Minister wants to unpack that further by correspondence, I would be very happy to have that.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I am very happy to do so.

I turn finally to Amendments 289 and 300, which aim to introduce a criminal offence of digital identity theft. The Government are committed to tackling fraud and are confident that criminal offences already exist to cover the behaviour targeted by these amendments. Under the Fraud Act 2006, it is a criminal offence to make a gain from the use of another person’s identity or to cause or risk a loss by such use. Where accounts or databases are hacked into, the Computer Misuse Act 1990 criminalises unauthorised access to a computer program or to data held on a computer.

Furthermore, the trust framework contains rules, standards and good practice requirements for fraud monitoring and responding to fraud. These rules will further defend systems and reduce opportunities for digital identity theft.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, I am sorry, but this is a broad-ranging set of amendments, so I need to intervene on this one as well. When the Minister writes his promised letter in response to today’s proceedings, could he tell us what guidance there is to the police on this? Because when the individual, Mr Arron, approached the police, they said, “Oh, sorry, there’s nothing we can do; identity theft is not a criminal offence”. The Minister seems to be saying, “No, it is fine; it is all encompassed within these provisions”. While he may be saying that, and I am sure he will be shouting it from the rooftops in the future, the question is whether the police have guidance; does the College of Policing have guidance, and does the Home Office have guidance? The ordinary individual needs to know that it is exactly as the Minister says, and that identity theft is covered by these other criminal offences. There is no point in having those offences if nobody knows about them.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

That is absolutely fair enough: I will of course write. Sadly, we are not joined today by ministerial colleagues from the Home Office, who have some other Bill going on.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

We always enjoy having input from the Home Office.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I have no doubt that its contribution to the letter will be equally enjoyable. However, for all the reasons I set out above, I am not able to accept these amendments and respectfully encourage the noble Baroness and noble Lords not to press them.

--- Later in debate ---
Lord Bassam of Brighton Portrait Lord Bassam of Brighton (Lab)
- Hansard - - - Excerpts

My Lords, this group has three amendments within it and, as the noble Lord, Lord Vaux, said, it is a disparate group. The first two seem wholly benign and entirely laudable, in that they seek to ensure that concerns about the environmental impacts related to data connected to business are shared and provided. The noble Baroness, Lady Bennett, said hers was a small and modest amendment: I agree entirely with that, but it is valuable nevertheless.

If I had to choose which amendment I prefer, it would be the second, in the name of my noble friend Lady Young, simply because it is more comprehensive and seems to be of practical value in pursuing policy objectives related to climate change mitigation. I cannot see why the disclosure of an impact analysis of current and future announcements, including legislation, changes in targets and large contracts, on UK climate change mitigation targets would be a problem. I thought my noble friend was very persuasive and her arguments about impact assessment were sound. The example of offshore petroleum legislation effectively not having an environmental impact assessment when its impacts are pretty clear was a very good one indeed. I am one of those who believes that environmental good practice should run all the way through legislation, a bit like the lettering in a stick of Brighton rock. It is important that we take on board that climate change is the most pressing issue that we face for the future.

The third amendment, in the name of my noble friend Lady Jones, is of a rather different nature, but is no less important, as it relates to the UK’s data adequacy and the EU’s decisions on it. We are grateful to the noble Lords, Lord Vaux of Harrowden and Lord Clement-Jones, for their support. Put simply, it would oblige the Secretary of State to complete an assessment, within six months of the Bill’s passing,

“of the likely impact of the Act on the EU’s data adequacy decisions relating to the UK”.

It would oblige the Secretary of State to lay a report on the assessment’s findings, and the report must cover data risk assessments and the impact on SMEs. It must also include an estimate of the legislation’s financial impact. The noble Lord, Lord Vaux, usefully underlined the importance of this, with its critical 2025 date. The amendment also probes

“whether the Government anticipate the provisions of the Bill conflicting with the requirements that need to be made by the UK to maintain a data adequacy decision by the EU”.

There is widespread and considerable concern about data adequacy and whether the UK legislative framework diverges too far from the standards that apply under the EU GDPR. The risk that the UK runs in attempting to reduce compliance costs for the free flow of personal data is that safeguards are removed to the point where businesses and trade become excessively concerned. In summary, many sectors including manufacturing, retail, health, information technology and particularly financial services are concerned that the free flow of data between us and the EU, with minimal disruption, will simply not be able to continue.

As the noble Lord, Lord Vaux, underlined, it is important that we in the UK have a relationship of trust with the European Commission on this, although ultimately data adequacy could be tested in the Court of Justice of the European Union. Data subjects in the EU can rely on the general principle of the protection of personal data to invalidate EU secondary and domestic law conflicting with that principle. Data subjects can also rely on the Charter of Fundamental Rights to bring challenges. Both these routes were closed off when the UK left the EU and the provisions were not saved in UK law, so it can be argued that data protection rights are already at a lower standard than across the European Union.

It is worth acknowledging that adequacy does not necessarily require equivalence. We can have different, and potentially lower, standards than the EU but, as long as those protections are deemed to meet whatever criteria the Commission chooses to apply, it is all to the good.

However, while divergence is possible, the concern that we and others have is that the Bill continues chipping away at standards in too many different ways. This chipping away is also taking place in statutory instruments, changes to guidance and so on. If His Majesty’s Government are satisfied that the overall picture remains that UK regulation is adequate, that is welcome, but it would be useful to know what mechanism DSIT and the Government generally intend to use to measure where the tipping point might be reached and how close these reforms take us to it.

The Committee will need considerable reassurance on the question of data adequacy, not least because of its impact on businesses and financial services in the longer term. At various times, the Minister has made the argument that a Brexit benefit is contained within this legislation. If he is ultimately confident of that case, what would be the impact on UK businesses if that assessment is wrong in relation to data adequacy decisions taken within the EU?

We are going to need more than warm words and a recitation that “We think it’s right and that we’re in the right place on data adequacy”. We are going to need some convincing. Whatever the Minister says today, we will have to return to this issue on Report. It is that important for businesses in this country and for the protection of data subjects.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, these amendments have been spoken to so well that I do not need to spend a huge amount of time repeating those great arguments. Both Amendment 195A, put forward by the noble Baroness, Lady Bennett, and Amendment 218 have considerable merit. I do not think that they conflict; they are complementary, in many respects.

Awareness raising is important to this, especially in relation to Amendment 218. For instance, if regulators are going to have a growth duty, which looks like it is going to happen, why not have countervailing duties relating to climate change, as the noble Baroness, Lady Young, put forward so cogently as part of Amendment 218? Amendment 195A also has considerable merit in raising awareness in the private sector, in traders and so on. Both have considerable merit.

--- Later in debate ---
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

Before the Minister stands up, let me just say that I absolutely agree with what the noble Lord, Lord Bassam, said. Have the Government taken any independent advice? It is easy to get wrapped up in your own bubble. The Government seem incredibly blithe about this Bill. You only have to have gone through our days in this Committee to see the fundamental changes that are being made to data protection law, yet the Government, in this bubble, seem to think that everything is fine despite the warnings coming from Brussels. Are they taking expert advice from outside? Do they have any groups of academics, for instance, who know about this kind of thing? It is pretty worrying. The great benefit of this kind of amendment, put forward by the noble Baroness, Lady Jones, is that nothing would happen until we were sure that we were going to be data adequate. That seems a fantastic safeguard to me. If the Government are just flying blind on this, we are all in trouble, are we not?

Lord Vaux of Harrowden Portrait Lord Vaux of Harrowden (CB)
- Hansard - - - Excerpts

My Lords, may I point out that, as regards the interests of the EU, it does not go just one way? There is a question around investment as well. For example, any large bank that is currently running a data-processing facility in this country that covers the whole of Europe may decide, if we lose data adequacy, to move it to Europe. Anyone considering setting up such a thing would probably go for Europe rather than here. There is therefore an investment draw for the EU here.

--- Later in debate ---
Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

Yes. I would be happy to provide a list of the people we have spoken to about adequacy; it may be a long one. That concludes the remarks I wanted to make, I think.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

Perhaps the Minister could just tweak that a bit by listing not just the people who have made positive noises but those who have their doubts.

--- Later in debate ---
Viscount Stansgate Portrait Viscount Stansgate (Lab)
- Hansard - - - Excerpts

I do not have the benefit of seeing a Hansard update to know after which word I was interrupted and we had to leave to vote, so I will just repeat, I hope not unduly, the main point I was making at the time of the Division. This was that the central conclusion of the CRISP report is that the Government’s policy

“generates significant gaps in the formal oversight of biometrics and surveillance practices in addition to erasing many positive developments aimed at raising standards and constructive engagement with technology developers, surveillance users and the public”.

The reason I am very glad to support the noble Lord, Lord Holmes, in these amendments is that the complexities of the current regulatory landscape, and the protections offered by the BSCC in an era of increasingly intensive, advanced and intrusive surveillance, mean that the abolition of the BSCC leaves these oversight gaps while creating additional regulatory complexity. I will be interested to see how the Minister defends the fact that this abolition is supposed to improve the situation.

I do not want to detain the Committee for very long, but I shall just read this one passage from the report into the record, because it is relevant to the debate we are having. We should not remove

“a mechanism for assuring Parliament and the public of appropriate surveillance use, affecting public trust and legitimacy at a critical moment concerning public trust in institutions, particularly law enforcement. As drafted, the Bill reduces public visibility and accountability of related police activities. The lack of independent oversight becomes amplified by other sections of the Bill that reduce the independence of the current Information Commissioner role”.

In short, I think it would be a mistake to abolish the biometrics commissioner, and on that basis, I support these amendments.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, it has been a pleasure to listen to noble Lords’ speeches in this debate. We are all very much on the same page and have very much the same considerations in mind. Both the protection of biometric data itself and the means by which we regulate and oversee its use have been mentioned by everyone. We may have slightly different paths to making sure we have that protection and oversight, but we all have the same intentions.

The noble Lord, Lord Holmes, pointed to the considerable attractions of, in a sense, starting afresh, but I have chosen a rather different path. I think it was the noble Lord, Lord Vaux, who mentioned Fraser Sampson, the former Biometrics and Surveillance Camera Commissioner. I must admit that I have very high regard for the work he did, and also for the work of such people as Professor Peter Fussey of Essex University. Of course, a number of noble Lords have mentioned the work of CRISP in all this, which kept us very well briefed on the consequence of these clauses.

No one has yet spoken to the stand part notices on Clauses 130 to 132; I will come on to those on Clauses 147 to 149 shortly. The Bill would drastically change the way UK law enforcement agencies can handle biometric personal data. Clauses 130 to 132 would allow for data received from overseas law enforcement agencies to be stored in a pseudonymised, traceable format indefinitely.

For instance, Clause 130 would allow UK law enforcement agencies to hold biometric data received from overseas law enforcement agencies in a pseudonymised format. In cases where the authority ceases to hold the material pseudonymously and the individual has no previous convictions or only one exempt conviction, the data may be retained in a non-pseudonymous format for up to three years. Therefore, the general rule is indefinite retention with continuous pseudonymisation, except for a specific circumstance where non-pseudonymised retention is permitted for a fixed period. I forgive noble Lords if they have to read Hansard to make total sense of that.

This is a major change in the way personal data can be handled. Permitting storage of pseudonymised or non-pseudonymised data will facilitate a vast biometric database that can be traced back to individuals. Although this does not apply to data linked to offences committed in the UK, it sets a concerning precedent for reshaping how law enforcement agencies hold data in a traceable and identifiable way. It seems that there is nothing to stop a law enforcement agency pseudonymising data just to reattach the identifying information, which they would be permitted to hold for three years.

The clauses do not explicitly define the steps that must be taken to achieve pseudonymisation. This leaves a broad scope for interpretation and variation in practice. The only requirement is that the data be pseudonymised

“as soon as reasonably practicable”,

which is a totally subjective threshold. The collective impact of these clauses, which were a late addition to the Bill on Report in the Commons, is deeply concerning. We believe that these powers should be withdrawn to prevent a dangerous precedent being set for police retention of vast amounts of traceable biometric data.

The stand part notices on Clauses 147 to 149 have been spoken to extremely cogently by the noble Lord, Lord Vaux, the noble Viscount, Lord Stansgate, and the noble Baroness, Lady Harding. I will not repeat a great deal of what they said but what the noble Baroness, Lady Harding, said about the Human Fertilisation and Embryology Authority really struck a chord with me. When we had our Select Committee on Artificial Intelligence, we looked at models for regulation and how to gain public trust for new technologies and concepts. The report that Baroness Warnock did into fertilisation and embryology was an absolute classic and an example of how to gain public trust. As the noble Baroness, Lady Harding, said, it has stood the test of time. As far as I am concerned, gaining that kind of trust is the goal for all of us.

What we are doing here risks precisely the reverse by abolishing the office of the Biometrics and Surveillance Camera Commissioner. This was set up under the Protection of Freedoms Act 2012, which required a surveillance camera commissioner to be appointed and a surveillance camera code of practice to be published. Other functions of the Biometrics and Surveillance Camera Commissioner are in essence both judicial and non-judicial. They include developing and encouraging compliance with the surveillance camera code of practice; raising standards for surveillance camera developers, suppliers and users; public engagement; building legitimacy; reporting annually to Parliament via the Home Secretary; convening expertise to support these functions; and reviewing all national security determinations and other powers by which the police can retain biometric data. The Bill proposes to erase all but one—I stress that—of these activities.

The noble Lord, Lord Vaux, quoted CRISP. I will not repeat the quotes he gave but its report, which the noble Viscount, Lord Stansgate, also cited, warns that

“plans to abolish and not replace existing safeguards in this crucial area will leave the UK without proper oversight just when advances in artificial intelligence (AI) and other technologies mean they are needed more than ever”.

The Bill’s reduction of surveillance-related considerations to data protection compares unfavourably to regulatory approaches in other jurisdictions. Many have started from data protection and extended it to cover the wider rights-based implications of surveillance. Here, the Bill proposes a move in precisely the opposite direction. I am afraid this is yet another example of the Bill going entirely in the wrong direction.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

My Lords, I thank all noble Lords who have contributed to what has been an excellent debate on this issue. We have all been united in raising our concerns about whether the offices of the biometrics commissioner and the surveillance camera commissioner should be abolished. We all feel the need for more independent oversight, not less, as is being proposed here.

As we know, the original plan was for the work of the biometrics commissioner to be transferred to the Information Commissioner, but when he raised concerns that this would result in the work receiving less attention, it was decided to transfer it to the Investigatory Powers Commissioner instead. Meanwhile, the office of the surveillance camera commissioner is abolished on the basis that these responsibilities are already covered elsewhere. However, like other noble Lords, we remain concerned that the transfer of this increasingly important work from both commissioners will mean that it does not retain the same level of expertise and resources as it enjoys under the current regime.

These changes have caused some alarm among civil society groups such as the Ada Lovelace Institute and the Centre for Research into Information Surveillance and Privacy, to which noble Lords have referred. They argue that we are experiencing a huge expansion in the reach of surveillance and biometric technology. The data being captured, whether faces, fingerprints, walking style, voice or the shape of the human body, are uniquely personal and part of our individual identity. The data being captured can enhance public safety but can also raise critical ethical concerns around privacy, free expression, bias and discrimination. As the noble Lord, Lord Vaux, said, we need a careful balance of those issues between protection and privacy.

The noble Baroness, Lady Harding, quite rightly said that there is increasing public mistrust in the use of these techniques, and that is why there is an urgent need to take people on the journey. The example the noble Baroness gave was vivid. We need a robust legal framework to underpin the use of these techniques, whether it is by the police, the wider public sector or private institutions. As it stands, the changes in the Bill do not achieve that reassurance, and we have a lot of lessons to learn.

Rather than strengthening the current powers to respond to the huge growth and reach of surveillance techniques, the Bill essentially waters down the protections. Transferring the powers from the BSCC to the new Information Commissioner brings the issue down to data protection when the issues of intrusion and the misuse of biometrics and surveillance are much wider than that. Meanwhile, the impact of AI will herald a growth of new techniques such as facial emotional appraisal and video manipulation, leading to such things as deepfakes. All these techniques threaten to undermine our sense of self and our control of our own personal privacy.

The amendment in the name of the noble Lord, Lord Holmes, takes up the suggestion, also made by the Ada Lovelace Institute, to establish a biometrics office within the ICO, overseen by three experienced commissioners. The functions would provide general oversight of biometric techniques, keep a register of biometric users and set up a process for considering complaints. Importantly, it would require all entities processing biometric data to register with the ICO prior to any use.

We believe that these amendments are a really helpful contribution to the discussion. They would place the oversight of biometric techniques in a more effective setting where the full impacts of these techniques can be properly monitored, measured and reported on. We would need more details of the types of work to be undertaken by these commissioners, and the cost implications but, in principle, we support these amendments because they seem to be an answer to our concerns. We thank the noble Lord for tabling them and very much hope the Minister will give the proposals serious consideration.

--- Later in debate ---
Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I apologise if I have misunderstood. It sounds like it would be a unit within the ICO responsible for that matter. Let me take that away if I have misunderstood—I understood it to be a separate organisation altogether.

The Government deem Amendment 238 unnecessary, as using biometric data to categorise or make inferences about people, whether using algorithms or otherwise, is already subject to the general data protection principles and the high data protection standards of the UK’s data protection framework as personal data. In line with ICO guidance, where the processing of biometric data is intended to make an inference linked to one of the special categories of data—for example, race or ethnic origin—or the biometric data is processed for the intention of treating someone differently on the basis of inferred information linked to one of the special categories of data, organisations should treat this as special category data. These protections ensure that this data, which is not used for identification purposes, is sufficiently protected.

Similarly, Amendment 286 intends to widen the scope of the Forensic Information Databases Service—FINDS—strategy board beyond oversight of biometrics databases for the purpose of identification to include “classification” purposes as well. The FINDS strategy board currently provides oversight of the national DNA database and the national fingerprint database. The Bill puts oversight of the fingerprint database on the same statutory footing as that of the DNA database and provides the flexibility to add oversight of new biometric databases, where appropriate, to provide more consistent oversight in future. The delegated power could be used in the medium term to expand the scope of the board to include a national custody image database, but no decisions have yet been taken. Of course, this will be kept under review, and other biometric databases could be added to the board’s remit in future should these be created and should this be appropriate. For the reasons I have set out, I hope that the noble Baroness, Lady Jones of Whitchurch, will therefore agree not to move Amendments 238 and 286.

Responses to the data reform public consultation in 2021 supported the simplification of the complex oversight framework for police use of biometrics and surveillance cameras. Clauses 147 and 148 of the Bill reflect that by abolishing the Biometrics and Surveillance Camera Commissioner’s roles while transferring the commissioner’s casework functions to the Investigatory Powers Commissioner’s Office.

Noble Lords referred to the CRISP report, which was commissioned by Fraser Sampson—the previous commissioner—and directly contradicts the outcome of the public consultation on data reform in 2021, including on the simplification of the oversight of biometrics and surveillance cameras. The Government took account of all the responses, including from the former commissioner, in developing the policies set out in the DPDI Bill.

There will not be a gap in the oversight of surveillance as it will remain within the statutory regulatory remit of other organisations, such as the Information Commissioner’s Office, the Equality and Human Rights Commission, the Forensic Science Regulator and the Forensic Information Databases Service strategy board.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

One of the crucial aspects has been the reporting of the Biometrics and Surveillance Camera Commissioner. Where is there going to be a comprehensive report relating to the use of surveillance cameras and the biometric data contained within them, and who is going to produce it? Why have the Government decided that they are going to separate out the oversight of biometrics from, in essence, the surveillance aspects? Are not the two irretrievably brought together by things such as live facial recognition?

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

Yes. There are indeed a number of different elements of surveillance camera oversight; those are reflected in the range of different bodies doing that work. As to the mechanics of the production of the report, I am afraid that I do not know the answer.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

Does the Minister accept that the police are one of the key agencies that will be using surveillance cameras? He now seems to be saying, “No, it’s fine. We don’t have one single oversight body; we had four at the last count”. He probably has more to say on this subject but is that not highly confusing for the police when they have so many different bodies that they need to look at in terms of oversight? Is it any wonder that people think the Bill is watering down the oversight of surveillance camera use?

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

No. I was saying that there was extensive consultation, including with the police, and that that has resulted in these new arrangements. As to the actual mechanics of the production of an overall report, I am afraid that I do not know but I will find out and advise noble Lords.

His Majesty’s Inspectorate of Constabulary and Fire & Rescue Services also inspects, monitors and reports on the efficiency and effectiveness of the police, including their use of surveillance cameras. All of these bodies have statutory powers to take the necessary action when required. The ICO will continue to regulate all organisations’ use of these technologies, including being able to take action against those not complying with data protection law, and a wide range of other bodies will continue to operate in this space.

On the first point made by the noble Lord, Lord Vaux, where any of the privacy concerns he raises concern information that relates to an identified or identifiable living individual, I can assure him that this information is covered by the UK’s data protection regime. This also includes another issue raised by the noble Lord—where the ANPR captures a number-plate that can be linked to an identifiable living individual—as this would be the processing of personal data and thus governed by the UK’s data protection regime and regulated by the ICO.

For the reasons I have set out, I maintain that these clauses should stand part of the Bill. I therefore hope that the noble Lord, Lord Clement-Jones, will withdraw his stand part notices on Clauses 147 and 148.

Clause 149 does not affect the office of the Biometrics and Surveillance Camera Commissioner, which the noble Lord seeks to maintain through his amendment. The clause’s purpose is to update the name of the national DNA database board and update its scope to include the national fingerprint database within its remit. It will allow the board to produce codes of practice and introduce a new delegated power to add or remove biometric databases from its remit in future via the affirmative procedure. I therefore maintain that this clause should stand part of the Bill and hope that the noble Lord will withdraw his stand part notice.

Clauses 147 and 148 will improve consistency in the guidance and oversight of biometrics and surveillance cameras by simplifying the framework. This follows public consultation, makes the most of the available expertise, improves organisational resilience, and ends confusing and inefficient duplication. The Government feel that a review, as proposed, so quickly after the Bill is enacted is unnecessary. It is for these reasons that I cannot accept Amendment 292 in the name of the noble Lord, Lord Clement-Jones.

I turn now to the amendments tabled by the noble Lord, Lord Clement-Jones, which seek to remove Clauses 130 to 132. These clauses make changes to the Counter-Terrorism Act 2008, which provides the retention regime for biometric data held on national security grounds. The changes have been made only following a formal request from Counter Terrorism Policing to the Home Office. The exploitation of biometric material, including from international partners, is a valuable tool in maintaining the UK’s national security, particularly for ensuring that there is effective tripwire coverage at the UK border. For example, where a foreign national applies for a visa to enter the UK, or enters the UK via a small boat, their biometrics can be checked against Counter Terrorism Policing’s holdings and appropriate action to mitigate risk can be taken, if needed.

Moved by
111: Schedule 5, page 206, leave out line 26 to end of line 2 on page 207 and insert—
“(a) the rule of law, respect for human rights and fundamental freedoms, relevant legislation, both general and sectoral, including concerning public security, defence, national security and criminal law and the access of public authorities to personal data, as well as the implementation of such legislation, data protection rules, professional rules and security measures, including rules for the onward transfer of personal data to another third country or international organisation which are complied with in that country or international organisation, case-law, as well as effective and enforceable data subject rights and effective administrative and judicial redress for the data subjects whose personal data are being transferred;
(b) the existence and effective functioning of one or more independent supervisory authorities in the third country or to which an international organisation is subject, with responsibility for ensuring and enforcing compliance with the data protection rules, including adequate enforcement powers, for assisting and advising the data subjects in exercising their rights and for cooperation with the Commissioner; and
(c) the international commitments the third country or international organisation concerned has entered into, or other obligations arising from legally binding conventions or instruments as well as from its participation in multilateral or regional systems, in particular in relation to the protection of personal data.”
Member’s explanatory statement
This amendment changes the list of things that the Secretary of State must consider when deciding whether a third country provides an adequate level of protection for data subjects.
--- Later in debate ---
Lord Clement-Jones (LD)

Once more unto the breach, my Lords—as opposed to “my friends”.

I will also speak to Amendments 112 to 114, 116 and 130. New Article 45B(2) lists conditions that the Secretary of State must consider when deciding whether a third country provides an adequate level of protection for data subjects. It replaces the existing conditions in Article 45(2)(a) to (c) of the UK GDPR, removing important considerations such as the impact of a third country’s laws and practices in relation to national security, defence, public security, criminal law and public authority access to personal data on the level of protection provided to UK data subjects.

Despite this shorter list of conditions to consider, the Secretary of State is none the less required to be satisfied that a third country provides a level of protection that is not materially lower than the UK’s. It is plain that such an assessment cannot be made without considering the impact of these factors on the level of protection for UK data in a third country. It is therefore unclear why the amendment that the Government have made to Article 45 is necessary, beyond a desire for the Government to draw attention away from such contentious and complicated issues.

It may be that, in rewriting Article 45 of the UK GDPR, the Government intend that assimilated case law on international data transfers should no longer be relevant. If that is the case, that would be a substantial risk for UK data adequacy. Importantly, new Article 45B(2) removes the reference to the need for an independent data protection regulator in the relevant jurisdiction. This, sadly, is consistent with the theme of diminishing the independence of the ICO, which is one of the major concerns in relation to the Bill, and it is also an area where the European Commission has expressed concern. The independence of the regulator is a key part of the EU data adequacy regime and is explicitly referenced in Article 8 of the Charter of Fundamental Rights, which guarantees the right to protection of personal data. Amendment 111 restores the original considerations that the Secretary of State must take into account.

Amendments 112 and 113 would remove the Secretary of State’s proposed powers in Schedules 5 and 6 to assess other countries’ suitability for international transfers of data, and place these on the new information commission instead. In the specific context of HIV—the provenance of these amendments is in the National AIDS Trust’s suggestions—it is unlikely that the Secretary of State or their departmental officials will have the specialist knowledge to assess whether there is a risk of harm to an individual by transferring data related to their HIV status to a third country. Given that the activities of government departments are political by their nature, the Secretary of State making these decisions related to the suitability of transfer to third countries may not be viewed as objective by individuals whose personal data is transferred. Many people living with HIV feel comfortable reporting breaches of data protection law in relation to their HIV status to the Information Commissioner’s Office due to its position as an independent regulator, so the National AIDS Trust and others recommend that the Bill places these regulatory powers on the new information commission created by the Bill instead, as this may inspire greater public confidence.

As regards Amendment 114, paragraph 5 of Schedule 5 should contain additional provisions to mandate annual review of the data protection test for each third country to which data is transferred internationally to ensure that the data protection regime in that third country is secure and that people’s personal data, such as their HIV status, will not be shared inappropriately. HIV is criminalised in many countries around the world, and the transfer to these countries of personal data such as an individual’s HIV status could put an individual living with HIV, their partner or their family members at real risk of harm. This is because HIV stigma is incredibly pronounced in many countries, which fosters a real risk of HIV-related violence. Amendment 114 would mandate this annual review.

As regards Amendment 116, new Article 47A(4) to (7) gives the Secretary of State a broad regulation-making power to designate new transfer mechanisms for personal data being sent to a third country in the absence of adequacy regulations. Controllers would be able to rely on these new mechanisms, alongside the existing mechanisms in Article 46 of the UK GDPR, to transfer data abroad. In order to designate new mechanisms, which could be based on mechanisms used in other jurisdictions, the Secretary of State must be satisfied that these are

“capable of securing that the data protection test set out in Article 46 is met”.

The Secretary of State must be satisfied that the transfer mechanism is capable of providing a level of protection for data subjects that is not materially lower than under the UK GDPR and the Data Protection Act. The Government have described this new regulation-making power as a way to future-proof the UK’s GDPR international transfers regime, but they have not been able to point to any transfer mechanisms in other countries that might be suitable to be recognised in UK law, and nor have they set out examples of how new transfer mechanisms might be created.

In addition to the absence of a clear rationale for taking the power, it is not clear how the Secretary of State could be satisfied that a new mechanism is capable of providing the appropriate level of protection for data subjects. This test is meant to be a lower standard than the test for controllers seeking to rely on a transfer mechanism to transfer overseas, which requires them to consider that the mechanism provides the appropriate level of protection. It is not clear to us how the Secretary of State could be satisfied of a mechanism’s capability without having a clear sense of how it would be used by controllers in reality. That is the reason for Amendment 116.

As regards Amendment 130, Ministers have continued all the adequacy decisions that the EU had made in respect of third countries when the UK stopped being subject to EU treaties. The UK also conferred data adequacy on the EEA, but all this was done on a transitional basis. The Bill now seeks to continue those adequacy decisions, but no analysis appears to have been carried out as to whether these jurisdictions confer an adequate level of protection of personal data. This is not consistent with Section 17B(1) of the DPA 2018, which states that the Secretary of State must carry out a review of whether the relevant country that has been granted data adequacy continues to ensure an adequate level of protection, and that these reviews must be carried out at intervals of not more than four years.

In the EU, litigants have twice brought successful challenges against adequacy decisions. Those decisions were deemed unlawful and quashed by the European Court of Justice. It appears that this sort of challenge would not be possible in the UK because the adequacy decisions are being continued by the Bill and therefore through primary legislation. Any challenge to these adequacy decisions could result only in a declaration of incompatibility under the Human Rights Act; it could not be quashed by the UK courts. This is another example of how leaving the EU has diminished the rights of UK citizens compared with their EU counterparts.

As well as tabling those amendments, I support and have signed Amendment 115 in the names of the noble Lords, Lord Bethell and Lord Kirkhope, and I look forward to hearing their arguments in relation to it. In the meantime, I beg to move.

Lord Kirkhope of Harrogate (Con)

My Lords, I rise with some temerity. This is my first visit to this Committee to speak. I have popped in before and have been following it very carefully. The work going on here is enormously important.

I am speaking to Amendment 115, thanks to the indulgence of my noble friend Lord Bethell, who is the lead name on that amendment but has kindly suggested that I start the discussions. I also thank the noble Lord, Lord Clement-Jones, for his support. Amendment 115 has one clear objective and that is to prevent transfer of UK user data to jurisdictions where data rights cannot be enforced and there is no credible right of redress. The word “credible” is important in this amendment.

I thank my noble friend the Minister for his letter of 11 April, which he sent to us to try to mop up a number of issues. In particular, in one paragraph he referred to the question of adequacy, which may also touch on what the noble Lord, Lord Clement-Jones, has just said. The Secretary of State’s powers are also referred to, but I must ask: how, in a fast-moving or unique situation, can all the factors referred to in this long and comprehensive paragraph be considered?

The mechanisms of government and government departments must be thorough and in place to satisfactorily discharge what are, I think, somewhat grand intentions. I say that from a personal point of view, because I was one of those who drafted the European GDPR—another reason I am interested in discussing these matters today—and I was responsible for the adequacy decisions with third countries. The word “adequacy” matters very much in this group, in the same way that we were unable to use “adequacy” when we dealt with the United States and had to look at “equivalence”. Adequacy can work only if one is working to similar parameters. If one is constitutionally working to different parameters, as is the case in the United States, then the word “equivalence” becomes much more relevant: although administration or regulation cannot be carried out in quite the same way, an equivalence arrangement can be acceptable and can lead to the understanding of adequacy that we are looking for from the other parties involved.

I have a marvellous note here, which I am sure noble Lords have already talked about. It says that every day we generate 181 zettabytes of personal data. I am sure noble Lords are all aware of zettabytes, but I will clarify. One zettabyte is 1,000 exabytes—which perhaps makes it simpler to understand—or, if you like, 1 billion trillion bytes. One’s mind just has to get around this, but this is data on our movements, finances, health and families, from our cameras, phones, doorbells and, I am afraid, even from our refrigerators—though Lady Kirkhope refuses point blank to have any kind of detector on her fridge door that will tell anybody anything about us or what we eat. Increasingly, it is also data from our cars. Our every moment is recorded—information relating to everything from shopping preferences to personal fitness to our anxieties, even, as they are displayed or discussed. It is stored by companies that we entrust with that data and we have a right to expect that such sensitive and private data will be protected. Indeed, one of the core principles of data protection, as we all know, is accountability.

Article 79 of the UK GDPR and Section 167 of our Data Protection Act 2018 provide that UK users must have the right to effective judicial remedy in the event of a data protection breach. Article 79 says that

“each data subject shall have the right to an effective judicial remedy where he or she considers that his or her rights under this Regulation have been infringed as a result of the processing of his or her personal data in non-compliance with this Regulation”.

--- Later in debate ---
The Parliamentary Under-Secretary of State, Department for Science, Innovation and Technology (Viscount Camrose) (Con)

I welcome the Committee back after what I hope was a good Easter break for everybody. I thank all those noble Lords who, as ever, have spoken so powerfully in this debate.

I turn to Amendments 111 to 116 and 130. I thank noble Lords for their proposed amendments relating both to Schedule 5, which reforms the UK’s general processing regime for transferring personal data internationally and consolidates the relevant provisions in Chapter 5 of the UK GDPR, and to Schedule 7, which introduces consequential and transitional provisions associated with the reforms.

Amendment 111 seeks to revert to the current list of factors under the UK GDPR that the Secretary of State must consider when making data bridges. With respect, this more detailed list is not necessary as the Secretary of State must be satisfied that the standard of protection in the other country, viewed as a whole, is not materially lower than the standard of protection in the UK. Our new list of key factors is non-exhaustive. The UK courts will continue to be entitled to have regard to CJEU judgments if they choose to do so; ultimately, it will be for them to decide how much regard to have to any CJEU judgment on a similar matter.

I completely understand the strength of noble Lords’ concerns about ensuring that our EU adequacy decisions are maintained. This is also a priority for the UK Government, as I and my fellow Ministers have repeatedly made clear in public and on the Floor of the House. The UK is firmly committed to maintaining high data protection standards, now and in future. Protecting the privacy of individuals will continue to be a national priority. We will continue to operate a high-quality regime that promotes growth and innovation and underpins the trustworthy use of data.

Our reforms are underpinned by this commitment. We believe they are compatible with maintaining our data adequacy decisions from the EU. We have maintained a positive, ongoing dialogue with the EU to make sure that our reforms are understood. We will continue to engage with the European Commission at official and ministerial levels with a view to ensuring that our respective arrangements for the free flow of personal data can remain in place, which is in the best interests of both the UK and the EU.

We understand that Amendments 112 to 114 relate to representations made by the National AIDS Trust concerning the level of protection for special category data such as health data. We agree that the protection of people’s HIV status is vital. It is right that this is subject to extra protection, as is the case for all health data and special category data. As I have said before this Committee previously, we have met the National AIDS Trust to discuss the best solutions to the problems it has raised. As such, I hope that the noble Lord, Lord Clement-Jones, will agree not to press these amendments.

Lord Clement-Jones (LD)

Can the Minister just recap? He said that he met the trust and then swiftly moved on, without saying what solution he is proposing. Would he like to repeat that, or at least lift the veil slightly?

Viscount Camrose (Con)

The point I was making was only that we have met with it and will continue to do so in order to identify the best possible way to keep that critical data safe.

--- Later in debate ---
Lord Clement-Jones (LD)

The Minister is not suggesting a solution at the moment. Is it in the “too difficult” box?

Viscount Camrose (Con)

I doubt that it will be too difficult, but identifying and implementing the correct solution is the goal that we are pursuing, alongside our colleagues at the National AIDS Trust.

Lord Clement-Jones (LD)

I am sorry to keep interrogating the Minister, but that is quite an admission. The Minister says that there is a real problem, which is under discussion with the National AIDS Trust. At the moment the Government are proposing a significant amendment to both the GDPR and the DPA, and in this Committee they are not able to say that they have any kind of solution to the problem that has been identified. That is quite something.

Viscount Camrose (Con)

I am not sure I accept that it is “quite something”, in the noble Lord’s words. As and when the appropriate solution emerges, we will bring it forward—no doubt between Committee and Report.

On Amendment 115, we share the noble Lords’ feelings on the importance of redress for data subjects. That is why the Secretary of State must already consider the arrangements for redress for data subjects when making a data bridge. There is already an obligation for the Secretary of State to consult the ICO on these regulations. Similarly, when considering whether the data protection test is met before making a transfer subject to appropriate safeguards using Article 46, the Government expect that data exporters will also give consideration to relevant enforceable data subject rights and effective legal remedies for data subjects.

Our rules mean that companies that transfer UK personal data must uphold the high data protection standards we expect in this country. Otherwise, they face action from the ICO, which has powers to conduct investigations, issue fines and compel companies to take corrective action if they fail to comply. We will continue to monitor and mitigate a wide range of data security risks, regardless of provenance. If there is evidence of threats to our data, we will not hesitate to take the necessary action to protect our national security.

Baroness Jones of Whitchurch (Lab)

My Lords, we heard from the two noble Lords some concrete examples of where those data breaches are already occurring, and it does not appear to me that appropriate action has been taken. There seems to be a mismatch between what the Minister is saying about the processes and the day-to-day reality of what is happening now. That is our concern, and it is not clear how the Government are going to address it.

Lord Clement-Jones (LD)

My Lords, in a way the Minister is acknowledging that there is a watering down taking place, yet the Government seem fairly relaxed about these issues. If something happens, the Government will do something or other, or the commissioner will. But the Government are proposing to water down Article 45, and that is the essence of what we are all talking about here. We are not satisfied with the current position, and watering down Article 45 will make it even worse; there will be more Yandexes.

--- Later in debate ---
Viscount Camrose (Con)

As I said, a number of important points were raised there. First, I would not categorise the changes to Article 45 as watering down—they are intended to better focus the work of the ICO. Secondly, the important points raised with respect to Amendment 115 are points primarily relating to enforcement, and I will write to noble Lords setting out examples of where that enforcement has happened. I stress that the ICO is, as noble Lords have mentioned, an independent regulator that conducts the enforcement of this itself. What was described—I cannot judge for sure—certainly sounded like completely illegal infringements on the data privacy of those subjects. I am happy to look further into that and to write to noble Lords.

Amendment 116 seeks to remove a power allowing the Secretary of State to make regulations recognising additional transfer mechanisms. This power is necessary for the Government to react quickly to global trends and to ensure that UK businesses trading internationally are not held back. Furthermore, before using this power, the Secretary of State must be satisfied that the transfer mechanism is capable of meeting the new Article 46 data protection test. They are also required to consult the Information Commissioner and such other persons as they consider appropriate. The affirmative resolution procedure will also ensure appropriate parliamentary scrutiny.

I reiterate that the UK Government’s assessment of the reforms in the Bill is that they are compatible with maintaining adequacy. We have been proactively engaging with the European Commission since the start of the Bill’s consultation process to ensure that it understands our reforms and that we have a positive, constructive relationship. Noble Lords will appreciate that it is important that officials have the ability to conduct candid discussions during the policy-making process. However, I would like to reassure noble Lords once again that the UK Government take the matter of retaining our adequacy decisions very seriously.

Finally, Amendment 130 pertains to EU exit transitional provisions in Schedule 21 to the Data Protection Act 2018, which provide that certain countries are currently deemed as adequate. These countries include the EU and EEA member states and those countries that the EU had found adequate at the time of the UK’s exit from the EU. Such countries are, and will continue to be, subject to ongoing monitoring. As is the case now, if the Secretary of State becomes aware of developments such as changes to legislation or specific practices that negatively impact data protection standards, the UK Government will engage with the relevant authorities and, where necessary, amend or revoke data bridge arrangements.

For these reasons, I hope noble Lords will not press their amendments.

Lord Clement-Jones (LD)

My Lords, I thank the Minister for his response, but I am still absolutely baffled as to why the Government are doing what they are doing on Article 45. The Minister has not given any particular rationale. He has given a bit of a rationale for resisting the amendments, many of which try to make sure that Article 45 is fully effective, that these international transfers are properly scrutinised and that we remain data adequate.

By the way, I thought the noble Lord, Lord Kirkhope, made a splendid entry into our debate, so I hope that he stays on for a number of further amendments—what a début.

The only point on which I disagreed with the noble Lord, Lord Bethell—as the noble Baroness, Lady Jones, said—was when he said that this is a terrific Bill. It is a terrifying Bill, not a terrific one, as we have debated. There are so many worrying aspects—for example, that there is no solution yet for sensitive special category data and the whole issue of these contractual clauses. The Government seem almost to be saying that it is up to the companies to assess all this and whether a country in which they are doing business is data adequate. That cannot be right. They seem to be abrogating their responsibility for no good reason. What is the motive? Is it because they are so enthusiastic about transfer of data to other countries for business purposes that they are ignoring the rights of data subjects?

The Minister resisted describing this as watering down. Why get rid of the list of considerations that the Secretary of State needs to have so that they are just in the mix as something that may or may not be taken into consideration? In the existing article they are specified. It is quite a long list and the Government have chopped it back. What is the motive for that? It looks like data subjects’ rights are being curtailed. We were baffled by previous elements that the Government have introduced into the Bill, but this is probably the most baffling of all because of the real importance of this—its national security implications and the existing examples, such as Yandex, that we heard about from the noble Lord, Lord Kirkhope.

Of course we understand that there are nuances and that there is a difference between adequacy and equivalence. We have to be pragmatic sometimes, but the question of whether these countries having data transferred to them are adequate must be based on principle. This seems to me a prime candidate for Report. I am sure we will come back to it, but in the meantime I beg leave to withdraw.

Amendment 111 withdrawn.
--- Later in debate ---
Baroness Kidron (CB)

My Lords, I support Amendment 135 in the name of the noble Lord, Lord Bethell, to which I have added my name. He set out our struggle during the passage of the Online Safety Bill, when we made several attempts to get something along these lines into the Bill. It is worth actually quoting the Minister, Paul Scully, who said at the Dispatch Box in the other place:

“we have made a commitment to explore this … further and report back to the House in due course on whether further measures to support researcher access to data are required and, if so, whether they could also be implemented through other legislation such as the Data Protection and Digital Information Bill”.—[Official Report, Commons, 12/9/23; col. 806.]

When the Minister responds, perhaps he could update the House on that commitment and explain why the Government decided not to address it in the Bill. Although the Bill proposes a lessening of the protections on the use of personal data for research done by commercial companies, including the development of products and marketing, it does nothing to enable public interest research.

I would like to add to the list that the noble Lord, Lord Bethell, started, because, as well as Melanie Dawes, the CEO of Ofcom, the United States National Academy of Sciences, the Lancet Commission, the UN advisory body on AI, the US Surgeon General, the Broadband Commission and the Australian eSafety Commissioner have all in the last few months called for greater access for independent research.

I ask the noble Viscount to explain the Government’s thinking in detail, and I really do hope that we do not get more “wait and see”, because it does not meet the need. We have already passed online safety legislation that requires evidence, and by denying access to independent researchers, we have a perverse situation in which the regulator has to turn to the companies it is regulating for the evidence to create their codes, which, as the noble Viscount will appreciate, is a formula for the tech companies to control the flow of evidence and unduly temper the intent of the legislation. I wish to make most of my remarks on that subject.

In Ofcom’s consultation on its illegal harms code, the disparity between the harms identified and Ofcom’s proposed code caused deep concern. Volume 4 states the following at paragraph 14.12 in relation to content moderation:

“We are not proposing to recommend some measures which may be effective in reducing risks of harm. This is principally due to currently limited evidence”.

Further reading of volume 4 confirms that the lack of evidence is the given reason for failing to recommend measures across a number of harms. Ofcom has identified harms for which it does not require mitigation. This is not what Parliament intended and spectacularly fails to deliver on the promises made by Ministers. Ofcom can use its information-gathering powers to build evidence on the efficacy required to take a bolder approach to measures but, although that is welcome, it is unsatisfactory for many reasons.

First, given the interconnectedness between privacy, safety, security and competition, regulatory standards cannot be developed in silo. We have a thriving academic community that can work across different risks and identify solutions across different parts of the tech ecosystem.

Secondly, a regulatory framework in which standards are determined exclusively through private dialogue between the regulator and the regulated does not have the necessary transparency and accountability to win public trust.

Thirdly, regulators are overstretched and under-resourced. Our academics stand ready and willing to work in the public interest and in accordance with the highest ethical standards in order to scrutinise and understand the data held so very closely by tech companies, but they need a legal basis to demand access.

Fourthly, if we are to maintain our academic institutions in a post-Brexit world, we need to offer UK academics the same support as those in Europe. Article 40(4) of the European Union’s Digital Services Act requires platforms to

“provide access to data to vetted researchers”

seeking to carry out

“research that contributes to the detection, identification and understanding of systemic risks in the Union, as set out pursuant to Article 34(1), and to the assessment of the adequacy, efficiency and impacts of the risk mitigation measures pursuant to Article 35”.

It will be a considerable loss to the UK academic sector if its European colleagues have access to data that it does not.

Fifthly, by insisting on evidence but not creating a critical pathway to secure it, the Government have created a situation in which the lack of evidence could mean that Ofcom’s codes are fixed at what the tech companies tell it is possible in spring 2024, and will always be backward-looking. There is considerable whistleblower evidence revealing measures that the companies could have taken but chose not to.

I have considerable personal experience of this. For example, it was nearly a decade ago that I told Facebook that direct messaging on children’s accounts was dangerous, yet only now are we beginning to see regulation reflecting that blindingly obvious fact. That is nearly a decade in which something could have been done by the company but was not, and of which the regulator will have no evidence.

Finally, as we discussed on day one in Committee, the Government have made it easier for commercial companies to use personal data for research by lowering the bar for the collection of data and expanding the concept of research, further building the asymmetry that has been mentioned in every group of amendments we have debated thus far. It may not be very parliamentary language, but it is crazy to pass legislation and then obstruct its implementation by insisting on evidence that you have made it impossible to gather.

I would be grateful if the Minister could answer the following questions when he responds. Is it the Government’s intention that Ofcom codes be based entirely on the current practice of tech companies and that the regulator can demand only mitigations that exist currently, as evidenced by those companies? Do the Government agree that whistleblowers, NGO experts and evidence from user experience can be taken by regulators as evidence of what could or should be done? What route do the Government advise Ofcom to take to mitigate identified risks for which there are no current measures in place? For example, should Ofcom describe the required outcome and leave it to the companies to determine how they mitigate the risk, should it suggest mitigations that have been developed but not tried—or is the real outcome of the OSA to identify risk and leave that risk in place?

Do the Government accept that EU research done under the auspices of the DSA should be automatically considered as an adequate basis for UK regulators where the concerns overlap with UK law? Will the new measures announced for testing and sandboxing of AI models allow for independent research, in which academics, independent of government or tech, will have access to data? Finally, what measures will the Government take to mitigate the impact on universities of a brain drain of academics to Europe, if we do not provide equivalent legislative support to enable them to access the data required to study online safety and privacy? If the Minister is unable to answer me from the Dispatch Box, perhaps he will agree to write to me and place his letter in the Library for other noble Lords to read.

Lord Clement-Jones (LD)

My Lords, there is little for me to say. The noble Lord, Lord Bethell, and the noble Baroness, Lady Kidron, have left no stone unturned in this debate. They introduced this amendment superbly, and I pay tribute to them and to Reset, which was with us all the way through the discussions on online harms at the Joint Committee on the draft Online Safety Bill, advocating for these important provisions.

As the noble Lord, Lord Bethell, said, there is a strong body of opinion out there. Insight from what might be called approved independent researchers would enable policy-making and regulatory innovation to keep pace with emerging trends and threats, which can span individual harms, matters of public safety and even national security. We have seen the kinds of harms taking place in social media, and it is absolutely vital that we understand what is happening under the bonnet of social media. It is crucial in detecting, identifying and understanding the systemic risks of online harms and non-compliance with law.

When we discussed the Online Safety Bill, it was a question of not just content but functionality. That was one of the key things. An awful lot of this research relates to that: how algorithms operate in amplifying content and some of the harms taking place on social media. The noble Lord, Lord Bethell, referred to X closing its API for researchers and Meta’s move to shut CrowdTangle. We are going into reverse, whereas we should be moving forward in a much more positive way. When the Online Safety Bill was discussed, we got the review from Ofcom, but we did not get the backup—the legislative power for Ofcom or the ICO to be able to authorise and accredit researchers to carry out the necessary research.

The Government’s response to date has been extremely disappointing, given the history behind this and the pressure and importance of this issue. This dates from discussions some way back, even before the Joint Committee met and heard the case for this kind of researcher access. This Bill is now the best vehicle by which to introduce a proper regime on access for researchers. As the noble Baroness, Lady Kidron, asked, why, having had ministerial assurances, are we not seeing further progress? Are we just going to wait until Ofcom produces its review, which will be at the tail end of a huge programme of work which it has to carry out in order to implement the Online Safety Act?

--- Later in debate ---
Moved by
135A: Clause 28, page 48, line 35, leave out “required” and insert “necessary and proportionate”
Member’s explanatory statement
This amendment would ensure that “proportionality” continues to be considered by competent authorities when they are deciding whether national security exemptions apply to their processing for the purposes of law enforcement.
Lord Clement-Jones (LD)

That was a very good conclusion to the response from the noble Lord, Lord Bethell—urging a Minister to lean in. I have not heard that expression used in the House before, but it is excellent because, faced with a Home Office Minister, I am sure that is the kind of behaviour that we can expect imminently.

Last time we debated issues relating to national security and data protection, the noble Lord, Lord Ashton, was the responsible Minister and I had the support of the noble Lord, Lord Paddick. Now I have the Minister all to myself on Amendments 135A to 135E and the stand part notices on Clauses 28 to 30. These Benches believe that, as drafted, these clauses fall foul of the UK’s obligations under the ECHR, because they give the Home Secretary too broad a discretion and do not create sufficient safeguards to prevent their misuse.

Under the case law of the European Court of Human Rights, laws that give unfettered or overly broad discretion to the Government to interfere with privacy will violate the convention, because the laws must be sufficiently specific to prevent abuses of power. This means they must make sure that, any time they interfere with the privacy of people in the UK, they obey the law, have a goal that is legitimate in a democratic society and do only what is truly necessary to achieve that goal. The court has repeatedly stressed that this is what the rule of law means; it is an essential principle of democracy.

Despite multiple requests from MPs, and from Rights and Security International in particular, the Government have also failed to explain why they believe that these clauses are necessary to safeguard national security. So far, they have explained only why these new powers would be “helpful” or would ensure “greater efficiency”. Those justifications do not meet the standard that the ECHR requires when the Government want to interfere with our privacy. They are not entitled to do just anything that they find helpful.

Under Clause 28(7), the Home Secretary would be able to issue a national security certificate to tell the police that they do not need to comply with many important data protection laws and rules that they would otherwise have to obey. For instance, a national security certificate would give the police immunity when they commit crimes by using personal data illegally. It would also exempt them from certain provisions of the Freedom of Information Act 2000. The Bill would expand what counts as an intelligence service for the purposes of data protection law—again, at the Home Secretary’s wish. Clause 29 would allow the Home Secretary to issue a designation notice, allowing law enforcement bodies to take advantage of the more relaxed rules in the Data Protection Act 2018, otherwise designed for the intelligence agencies whenever they collaborate with the security services.

Both the amended approach to national security certificates and the new designation notice regime would be unaccountable. The courts would not be able to review what the Government are doing and Parliament might therefore never find out. National security certificates are unchallengeable before the courts, meaning that the police and the Home Secretary would be unaccountable if they abused those powers. If the Home Secretary says that the police need to use these increased—and, in our view, unnecessary—powers in relation to national security, his word will be final. This includes the power to commit crimes.

As regards designation notices, the Home Secretary is responsible for approving and reviewing their use. Only a person who is directly affected by a designation notice will be able to challenge it, yet the Home Secretary would have the power to keep the notice secret, in which case how could anybody know that the police had been snooping on their lives under this law?

Clauses 28 to 30 could, in our view, further violate the UK’s obligations under the Human Rights Act 1998 and the European Convention on Human Rights because they remove the courts’ role in reviewing how the Government use their surveillance power. The European Court of Human Rights has ruled in the past that large aspects of the law previously governing the UK’s surveillance powers were unlawful because they gave the Government too much discretion and lacked important safeguards to prevent misuse. Clauses 28 to 30 could be challenged on similar grounds, and the court has shown that it is willing to rule on these issues. These weaknesses in the law could also harm important relationships that the UK has with the EU as regards data adequacy, a subject that we will no doubt discuss in further depth later this week.

The Government argue that the clauses create a simplified legal framework that would improve the efficiency of police operations when working with the intelligence services. This is far from meeting the necessity standard under the ECHR.

The Government have frequently used the Fishmongers’ Hall and Manchester Arena attacks to support the idea that Clauses 28 to 30 are desirable. However, a difference in data protection regimes was not the issue in either case; instead, the problem centred around failures in offender management, along with a lack of communication between the intelligence services and local police. The Government have not explained how Clauses 28 to 30 would have prevented either incident or why they think these clauses are necessary to prevent whatever forms of violence the Government regard as most likely to occur in the future. The Government have had sufficient opportunity to date to explain the rationale for these clauses, yet they have so far failed to do so. For these reasons, we are of the view that Clauses 28 to 30 should not stand part of the Bill.

However, it is also worth putting down amendments to try to tease out additional aspects of these clauses, so Amendments 135A and 135D would put proportionality back in. It is not clear why the word “proportionality” has been taken out of the existing legislation. Similarly, Amendment 135B attempts to put back in the principles that should underpin decisions. Those are the most troubling changes, since they seem to allow for departure from basic data protection principles. These were the principles that the Government, during the passage of the Data Protection Act 2018, assured Parliament would always be secure. The noble Lord, Lord Ashton of Hyde, said:

“People will always have the right to ensure that the data held about them is fair and accurate, and consistent with the data protection principles”.—[Official Report, 10/10/17; col. 126.]

Thirdly, on the introduction of oversight by a judicial commissioner for Clause 28 certificates, now seems a good time to do that. During the passage of the Data Protection Act through Parliament, there was much debate over the Part 2 national security exemption for general processing in Section 26 and the national security certificates in Section 27. We expressed concern then but, sadly, the judicial commissioner role was not included. This is a timely moment to suggest that again.

Finally, on increasing the oversight of the Information Commissioner under Amendment 135E, I hope that this will be an opportunity for the Minister, despite the fact that I would prefer to see Clauses 28 to 30 not form part of the Bill, to explain in greater detail why they are constructed in the way they are and why the Home Office believes that it needs to amend the legislation in the way it proposes. I beg to move.

Lord Anderson of Ipswich (CB)

My Lords, I come to this topic rather late and without the star quality in this area that has today been attributed to the noble Lord, Lord Kirkhope. I acknowledge both the work of Justice in helping me to understand what Clause 28 does and the work of the noble Lord, Lord Clement-Jones, in formulating the probing amendments in this group. I echo his questions on Clause 28. I will focus on a few specific matters.

First, what is the difference between the existing formulation for restricting data protection rights “when necessary and proportionate” to protect national security and the new formulation,

“when required to safeguard national security”?

What is the purpose of that change? Does “required” mean the same as “necessary” or something different? Do the restrictions not need to be proportionate any more? If so, why? Could we have a practical example of what the change is likely to mean in practice?

Secondly, why is it necessary to expand the number of rights and obligations from which competent law enforcement authorities can be exempted for reasons of national security? I can understand why it may for national security reasons be necessary to restrict a person’s right to be informed, right of access to data or right to be notified of a data breach, as under the existing law, but Clause 28 would allow the disapplication of some very basic principles of data protection law—including, as I understand it, the right to have your data processed only for a specified, explicit and legitimate purpose, as well as the right not to have decisions about you made solely by automated means.

Thirdly, as the noble Lord, Lord Clement-Jones, asked, why is it necessary to remove the powers of the Information Commissioner to investigate, to enter and inspect, and, where necessary, to issue notices? I appreciate that certificates will remain appealable to the Upper Tribunal by the person directly affected, applying judicial review principles, but that is surely not a substitute for review by the skilled and experienced ICO. Apart from anything else, the subject is unlikely even to know that they have been affected by the provisions, given that a certificate would exempt law enforcement from having to provide information to them. That is precisely why the oversight of a commissioner in the national security area is so important.

As for Clauses 29 and 30, I am as keen as anybody to improve the capabilities for the joint processing of data by the police and intelligence agencies. That was a major theme of the learning points from the London and Manchester attacks of 2017, which I helped to formulate in that year and on which I reported publicly in 2019. A joint processing regime certainly sounds like a good idea in principle but I would be grateful if the Minister could confirm which law enforcement competent authorities will be subject to this new regime. Are they limited to Counter Terrorism Policing and the National Crime Agency?

--- Later in debate ---
Lord Bassam of Brighton (Lab)

Perhaps the same correspondence could cover the point I raised as well.

Lord Clement-Jones (LD)

My Lords, I am immensely grateful to the noble Lords, Lord Anderson and Lord Bassam, for their interventions. In particular, given his background, if the noble Lord, Lord Anderson, has concerns about these clauses, we all ought to have concerns. I am grateful to the Minister for the extent of his unpacking—or attempted unpacking—of these clauses but I feel that we are on a slippery slope here. I feel some considerable unease about the widening of the disapplication of principles that we were assured were immutable only six years ago. I am worried about that.

We have had some reassurance about the right to transparency, perhaps when it is convenient that data subjects find out about what is happening. The right to challenge was also mentioned by the Minister but he has not really answered the question about whether the Home Office has looked seriously at the implications as far as the human rights convention is concerned, which is the reason for the stand part notice. The Minister did not address that matter at all; I do not know why. I am assuming that the Home Office has looked at the clauses in the light of the convention but, again, he did not talk about that.

The only assurance the Minister has really given is that it is all on a case-by-case basis. I do not think that that is much of a reassurance. On the proportionality point made by the noble Lord, Lord Anderson, I think that we are going to be agog in waiting for the Minister’s correspondence on that, but it is such a basic issue. There were two amendments specifically on proportionality but we have not really had a reply on that issue at all, in terms of why it should have been eliminated by the legislation. So a feeling of unease prevails. I do not even feel that the Minister has unpacked fully the issue of joint working; I think that the noble Lord, Lord Anderson, did that more. We need to know more about how that will operate.

The final point that the Minister made gave even greater concern—to think that there will be an SI setting out the bodies that will have the powers. We are probably slightly wiser than when we started out with this group of amendments, but only slightly and we are considerably more concerned. In the meantime, I beg leave to withdraw the amendment.

Amendment 135A withdrawn.
--- Later in debate ---
I hope we can all agree that it is crucial we do everything we can to ease the administrative burdens on our police forces so that we can free up thousands of policing hours and get police back on to the front line, supporting communities and tackling crime. This amendment would go a long way to achieving this by facilitating the free flow of personal data between the police and the CPS at a very specific time in the pre-charge process. It would speed up the criminal justice process and reduce the burden on the taxpayer. I very much look forward to hearing from the Minister why Amendment 137 is not deemed acceptable at this stage, and I beg to move.
Lord Clement-Jones (LD)

My Lords, the noble Baroness, Lady Morgan, has done us a service by raising this issue. My question is about whether the advice given to date about redaction is accurate. I have not seen the Home Office’s guidance or counsel’s analysis. I have taken advice on the Police Federation’s case—I received an email and I was very interested in what it had to say, because we all want to make sure that the bureaucracy involved in charging and dealing with the CPS is as minimal as possible within the bounds of data protection law.

Section 35(2)(b) of the Data Protection Act simply requires the police to ensure that their processing is necessary for the performance of their tasks. You would have thought that sending an investigation file to the CPS to decide whether to charge a suspect seems necessary for the performance of that task. Some of that personal data may end up not being relevant to the charge or any trial, but that is a judgment for the CPS and the prosecutor. It does not mean, in the view of those I have consulted, that the file has to be redacted at vast taxpayer cost before the CPS or prosecutor have had a chance to see the investigation’s file. When you look at sensitive data, the test is “strictly necessary”, which is a higher test, but surely the answer to that must be that officers should collect this information only where they consider it relevant to the case. So this can be dealt with through protocols about data protection, which ensure that officers do not collect more sensitive data than is necessary for the purposes of the investigation.

Similarly, under Section 37, the question that the personal data must be adequate, relevant and not excessive in relation to the purpose for which it is processed should not be interpreted in such a way that this redaction exercise is required. If an officer thinks they need to collect the relevant information for the purpose of the investigation, that seems to me—and to those advising me—in broad terms to be sufficient to comply with the principle. Conversely, if officers are collecting too much data, the answer is that they should be trained to avoid doing this. If officers really are collecting more information than they should be, redactions cannot remedy the fact that the collection was unlawful in the first place. The solution seems to be to stop them collecting that data.

I assume—maybe I am completely wrong—that the Minister will utter “suitable guidance” in response to the noble Baroness’s amendment and say that there is no need to amend the legislation, but, if there is no need to do so, I hope that they revise the guidance, because the Police Federation and its members are clearly labouring under a misapprehension about the way the Act should be interpreted. It would be quite a serious matter if that has taken place for the last six years.

Lord Bassam of Brighton (Lab)

My Lords, we should be very grateful to the noble Baroness, Lady Morgan of Cotes, for her amendment. I listened very carefully to her line of argument and find much that we can support in the approach. In that context, we should also thank the Police Federation of England and Wales for a particularly useful and enlightening briefing paper.

We may well be suffering under the law of unintended consequences in this context; it seems to have hit quite hard and acted as a barrier to the sensible processing and transfer of data between two parts of the law enforcement machinery. It is quite interesting coming off the back of the previous debate, when we were discussing making the transfer of information and intelligence between different agencies easier and having a common approach. It is a very relevant discussion to have.

I do not think that the legislation, when it was originally drafted, could ever have been intended to work in the way the Police Federation has set out. The implementation of the Data Protection Act 2018, in so far as law enforcement agencies are concerned, is supposed to be guided by recital 4, which the noble Baroness read into the record and which makes good sense.

As the noble Baroness explained, the Police Federation’s argument that the DPA makes no provisions at all that are designed to facilitate, in effect, the free flow of information, that it should be able to hold all the relevant data prior to the charging decision being made by the CPS, and that redaction should take place only after a decision on charging has been made seems quite a sensible approach. As she argued, it would significantly lighten the burden on police investigating teams and enable the decision on charging to be more broadly informed.

So this is a piece of simplification that we can all support. The case has been made very well. If it helps speed up charging and policing processes, which I know the Government are very concerned about, as all Governments should be, it seems a sensible move—but this is the Home Office. We do not always expect the most sensible things to be delivered by that department, but we hope that they are.

Lord Sharpe of Epsom (Con)

I thank all noble Lords for their contributions—I think. I thank my noble friend Lady Morgan of Cotes for her amendment and for raising what is an important issue. Amendment 137 seeks to permit the police and the Crown Prosecution Service to share unredacted data with one another when making a charging decision. Perhaps to the surprise of the noble Lord, Lord Bassam, we agree: we must reduce the burden of redaction on the police. As my noble friend noted, this is very substantial and costly.

We welcome the intent of the amendment. However, as my noble friend has noted, we do not believe that, as drafted, it would achieve the stated aim. Removing the redaction burden entirely would require amending more than just the Data Protection Act.

However, the Government are committed to reducing the burden on the police, but it is important that we get it right and that the solution is comprehensive. We consider that the objective which my noble friend is seeking would be better achieved through other means, including improved technology and new, simplified guidance to prevent overredaction, as all speakers, including the noble Lord, Lord Clement-Jones, noted.

The Home Office provided £960,000 of funding for text and audio-visual multimedia redaction in the 2023-24 financial year. Thanks to that funding, police forces have been able to procure automated text redaction tools, the trials of which have demonstrated that they could save up to 80% of the time spent by the police on this redaction. Furthermore, in the latest Budget, the Chancellor announced an additional £230 million of funding for technology to boost police productivity. This will be used to develop, test and roll out automated audio-visual redaction tools, saving thousands more hours of police time. I would say to my noble friend that, as the technology improves, we hope that the need for it to be supervised by individuals will diminish.

I can also tell your Lordships’ House that officials from the Home Office have consulted with the Information Commissioner’s Office and have agreed that a significant proportion of the burden caused by existing pre-charge redaction processes could be reduced safely and lawfully within the current data protection framework in a way that will maintain standards and protections for individuals. We are, therefore, actively working to tackle this issue in the most appropriate way by exploring how we can significantly reduce the redaction burden at the pre-charge stage through process change within the existing legislative framework. This will involve creating simplified guidance and, obviously, the use of better technology.

Lord Clement-Jones (LD)

Is the Minister almost agreeing with some of my analysis in that case?

Lord Sharpe of Epsom (Con)

No, I think I was agreeing with my noble friend’s analysis.

Lord Clement-Jones (LD)

It does not sound like that to me.

Lord Sharpe of Epsom (Con)

I thank all noble Lords for their contributions. We acknowledge this particular problem and we are working to fix it. I would ask my noble friend to withdraw her amendment.

--- Later in debate ---
Moved by
138: Clause 31, page 56, leave out lines 10 to 14 and insert—
“(a) to monitor the application of GDPR, the applied GDPR and this Act, and ensure they are fully enforced with all due diligence;
(b) to act upon receiving a complaint, to investigate, to the extent appropriate, the subject matter of the complaint, and to take steps to clarify unsubstantiated issues before dismissing the complaint.”
Member’s explanatory statement
This amendment clarifies the statutory objective of the Commissioner by removing secondary objectives introduced by the Bill and clarifying the role and responsibility of the Commissioner.
Lord Clement-Jones (LD)

My Lords, I will speak also to Amendment 140 and the submissions that Clauses 32 to 35 should not stand part. These amendments are designed to clarify the statutory objective of the new information commission; increase its arm’s-length relationship with the Government; allow effective judicial scrutiny of its regulatory function; allow not-for-profit organisations to lodge representative complaints; retain the Office of the Biometrics and Surveillance Camera Commissioner; and empower the Equality and Human Rights Commission to scrutinise the new information commission. The effective supervision and enforcement of data protection and the investigation and detection of offenders are crucial to achieve deterrence, prevent violations, maintain transparency and control options for redress against data misuse.

--- Later in debate ---
Viscount Camrose (Con)

I am very happy to try to find a way forward on this. Let me think about how best to take this forward.

Lord Clement-Jones (LD)

My Lords, I thank the Minister for his response and, in particular, for that exchange. There is a bit of a contrast here—the mood of the Committee is probably to go with the grain of these clauses and to see whether they can be improved, rather than throw out the idea of an information commission and revert to the ICO on the basis that perhaps the information commission is a more logical way of setting up a regulator. I am not sure that I personally agree, but I understand the reservations of the noble Baroness, Lady Jones, and I welcome her support on the aspect of the Secretary of State power.

We keep being reassured by the Minister, in all sorts of different ways. I am sure that the spirit is willing, but whether it is all in black and white is the big question. Where are the real safeguards? The proposals in this group from the noble Baroness, Lady Kidron, to which she has spoken so well, along with the noble Baroness, Lady Harding, are very modest, to use the phrase from the noble Baroness, Lady Kidron. I hope those discussions will take place because they fit entirely with the architecture of the Bill, which the Government have set out, and it would be a huge reassurance to those who believe that the Bill is watering down data subject rights and is not strengthening children’s rights.

I am less reassured by other aspects of what the Minister had to say, particularly about the Secretary of State’s powers in relation to the codes. As the noble Baroness, Lady Kidron, said, we had a lot of discussion about that in relation to the Ofcom codes, under the Online Safety Bill, and I do not think we got very far on that either. Nevertheless, there is disquiet about whether the Secretary of State should have those powers. The Minister said that the ICO is not required to act in accordance with the advice of the Secretary of State so perhaps the Minister has provided a chink of light. In the meantime, I beg leave to withdraw the amendment.

Amendment 138 withdrawn.
--- Later in debate ---
Baroness Harding of Winscombe (Con)

My Lords, I rise once again in my Robin role to support the noble Baroness, Lady Kidron, on this amendment. We had a debate on 23 November last year that the noble Baroness brought on this very issue of edtech. Rather than repeat all the points that were made in that very useful debate, I point my noble friend the Minister to it.

I would just like to highlight a couple of quick points. First, in supporting this amendment, I am not anti-edtech in any way, shape or form. It is absolutely clear that technology can bring huge benefits to students of all ages but it is also clear that education is not unique. It is exactly like every other part of society: where technology brings benefit, it also brings substantial risk. We are learning the hard way that thinking that any element of society can mitigate the risks of technology without legal guard-rails is a mistake.

We have seen really clearly that commercial organisations operating under the purview of the age-appropriate design code changed the way they protected children’s data as a result of that code. The absence of an equivalent code for the edtech sector should show us clearly that we have not had those same benefits there. If we bring edtech into scope, either through this amendment or simply by extending the age-appropriate design code, I would hazard a strong guess that we would start to see very real improvements in the protection of children’s data.

In the debate on 23 November, I asked my noble friend the Minister, the noble Baroness, Lady Barran, why the age-appropriate design code did not include education. I am not an expert in education, by any stretch of the imagination. The answer I received was that it was okay because the keeping children safe in education framework covered edtech. Since that debate, I have had a chance to read that framework, and I cannot find a section in it that specifically addresses children’s data. There is lots of really important stuff in it, but there is no clearly signposted section in that regard. So even if all the work fell on schools, that framework on its own, as published on GOV.UK, does not seem to meet the standards of a framework for data protection for children in education. However, as the noble Baroness, Lady Kidron, said, this is not just about schools’ responsibility but the edtech companies’ responsibility, and it is clear that there is no section on that in the keeping children safe in education framework either.

The answer that we received last year in this House does not do justice to the real question: in the absence of a specific code—the age-appropriate design code or a specific edtech code—how can we be confident that there really are the guardrails, which we know we need to put in place in every sector, in this most precious and important sector, which is where we teach our children?

Lord Clement-Jones (LD)

My Lords, I am absolutely delighted to be able to support this amendment. Like the noble Baroness, Lady Harding, I am not anti-edtech at all. I did not take part in the debate last year, but listening to the noble Baroness, Lady Kidron, and having read the excellent A Blueprint for Education Data from the 5Rights Foundation and the Digital Futures for Children brief in support of a code of practice for education technology, I submit that it is chilling to hear what is happening, as we speak, with edtech in terms of the extraction of data and the failure to comply properly with data protection.

I got involved some years ago with the advisory board of the Institute for Ethical AI in Education, which Sir Anthony Seldon set up with Professor Rose Luckin and Priya Lakhani. Our intention was slightly broader—it was designed to create a framework for the use of AI specifically in education. Of course, one of the very important elements was the use of data, and the safe use of data, both by those procuring AI systems and by those developing them and selling them into schools. That was in 2020 and 2021, and we have not moved nearly far enough since that time. Obviously, this is data specific, because we are talking about the data protection Bill, but what is being proposed here would cure some of the issues that are staring us in the face.

As we have been briefed by Digital Futures for Children, and as the noble Baroness, Lady Kidron, emphasised, there is widespread invasion of children’s privacy in data collection. Sometimes there is little evidence to support the claimed learning benefits, while schools and parents lack the technical and legal expertise to understand what data is collected. As has been emphasised throughout the passage of this Bill, children deserve the highest standards of privacy and data protection—especially in education, of course.

From this direction, I wholly support what the noble Baroness, Lady Kidron, is proposing, so well supported by the noble Baroness, Lady Harding. Given that it again appears that the Government gave an undertaking to bring forward a suitable code of practice but have not done so, there is double reason to want to move forward on this during the passage of the Bill. We very much support Amendment 146 on that basis.

Moved by
53: Clause 14, page 27, line 21, leave out “is, or”
Member’s explanatory statement
This amendment, along with others in the name of Lord Clement-Jones, would retain the ability of the Secretary of State to introduce new safeguards but would prevent the removal or variation of safeguards under the new UK GDPR Article 22D and the new section 50D of the 2018 Act.
Lord Clement-Jones (LD)

My Lords, once more into the trenches we go before Easter. In moving Amendment 53, I will also speak to Amendments 54, 55, 57, 69, 70, 71 and 72 and the Clause 14 stand part notice.

The Bill contains a number of wide delegated powers, giving the Secretary of State the power to amend the UK GDPR via statutory instrument. The Government have said that the UK GDPR’s key elements remain sound and that they want to continue to offer a high level of protection for the public’s data, but that is no guarantee against significant reforms being brought in through a process that eludes full parliamentary scrutiny through primary legislation. Proposed changes to the UK GDPR should be contained in the Bill, where they can be debated and scrutinised properly via the primary legislation process. As it stands, key provisions of the UK GDPR can subsequently be amended via statutory instrument, which, in this case, is an inappropriate legislative process that affords much less scrutiny and debate, if debates are held at all.

The UK GDPR treats a solely automated decision as one without “meaningful human involvement”. The public are protected from being subject to solely automated decision-making where the decision has a legal or “similarly significant effect”. Clause 14(1) inserts new Article 22D(1) into the UK GDPR, which allows the Secretary of State to make regulations that deem a decision to have involved “meaningful human involvement”, even if there was no active review by a human decision-maker. New Article 22D(2) similarly allows the Secretary of State to make regulations to determine whether a decision made had a “similarly significant effect” to a legal effect. For example, in summer 2020 there was the A-level algorithm grading scandal. If something like that were to reoccur, under this new power a Minister could lay regulations stating that the decision to use an algorithm in grading A-levels was not a decision with a “similarly significant effect”.

New Article 22D(4) also allows the Secretary of State to add or remove, via regulations, any of the listed safeguards for automated decision-making. If the Government wish to amend or remove safeguards on automated decision-making, that should also be specified in the Bill and not left to delegated legislation. Amendments 53 to 55 and 69 to 72 would limit the Secretary of State’s power, so that they may add safeguards but cannot vary or remove those in the new Article 22D, as they stand, when the legislation comes into force.

If the clause is to be retained, we support Amendment 59A in the name of the noble Lord, Lord Holmes, which requires the Information Commissioner’s Office to develop guidance on the interpretation of the safeguards in new Article 22C and on important terms such as “similarly significant effect” and “meaningful human involvement”. It is within the Information Commissioner’s Office’s duties to issue guidance and to harmonise the interpretation of the law. As the dedicated regulator, the ICO is best placed and equipped to publish guidance and ensure consistency of application.

As a way to increase protections and incorporate more participation from those affected, Amendment 59A would add a new paragraph (7) to new Article 22D, which specifies that the Secretary of State needs to consult with the Information Commissioner’s Office if developing regulations. It also includes an obligation for the Secretary of State to consult with data subjects or their representatives, such as trade union or civil society organisations, at least every two years from the commencement of the Bill.

Our preference is for Clause 14 not to stand part of the Bill. The deployment of automated decision-making under Clause 14 risks automating harm, including discrimination, without adequate safeguards. Clause 14 creates a new starting point for all ADM using personal, but not special category, data. It is allowed, including for profiling, provided that certain safeguards are in place. The Minister said those safeguards are “appropriate” and “robust” and provide “certainty”, but I preferred what the noble Lord, Lord Bassam, said about the clause:

“We need more safeguards. We have moved from one clear position to another, which can be described as watering down or shifting the goalposts”.—[Official Report, 25/3/24; col. GC 150.]


That is very much my feeling about the clause as well.

I refer back to the impact assessment, which we discussed at some point during our discussions about Clause 9. It is very interesting that, in table 15 of the impact assessment, the savings on compliance costs are something like £7.3 million as regards AI and machine learning, which does not seem a very big number compared with the total savings on compliance costs, which the Government have put rather optimistically at £295 million.

In passing, I should say that, when I look at the savings regarding subject access requests, I see that the figure is £153 million, which is half of those so-called savings on compliance costs. I do not square that at all with what the Minister says about the total savings on compliance costs for subject access requests being 1%. I do not know quite where those figures come from, but it is a far more significant percentage: it is 50% of what the Government believe that the savings on compliance costs will be. I know that it is not part of this group, but I would be very grateful if the Minister could write to clarify that issue in due course.

Although the Minister has called these safeguards adequate, we believe that they are inadequate for three reasons. First, they shift the burden to the individual. Secondly, there is no obligation to provide any safeguards before the decision is made. Neither the Bill nor any of the material associated with it indicates what the content of this information is expected to be, nor the timescales in which that information is to be given. There is nothing to say when representations or contests may be heard, when human intervention may be sought or what the level of that intervention should be. Thirdly, the Secretary of State has delegated powers to vary the safeguards by regulations.

Article 22 is currently one of the strongest prohibitions in the GDPR. As we know, the current starting point is that using solely automated decision-making is prohibited unless certain exemptions apply. The exemptions are limited. Now, as a result of the Government’s changes, you can use solely automated decision-making in an employment context in the UK, which you cannot do in the EU. That is a clear watering down of the restriction. The Minister keeps returning to the safeguards, but I have referred to those. We know that they are not being applied in practice even now and that hiring and firing is taking place without any kind of human review.

There is therefore an entirely inadequate basis on which we can be satisfied that the Bill will safeguard individuals from harmful automated decision-making before it is too late. In fact, the effect of the Bill will be to do the opposite: to permit unfair and unsafe ADM to occur, including discriminatory profiling ADM, which causes harm to individuals. It then places the burden on the individual to complain, without providing for any adequate safeguards to guarantee their ability to do so before the harm is already incurred. While I beg to move Amendment 53, our preference would be that Clause 14 is deleted from the Bill entirely.

Baroness Kidron (CB)

My Lords, I will speak to Amendment 57 in my name, Amendment 59 in the name of the noble Baroness, Lady Jones, and the Clause 14 stand part notice from the noble Lord, Lord Clement-Jones. In doing so, I register my support for Amendment 59A in the name of the noble Lord, Lord Holmes.

The Government assert that there is no diminution of rights in the Bill, yet Clause 14 removes the right not to be subject to an automated decision and replaces that right with inadequate safeguards, as the noble Lord, Lord Clement-Jones, said. On the previous day in Committee, the Minister made the argument that:

“These reforms clarify and simplify the rules related to solely automated decision-making without watering down any of the protections for data subjects or the fundamental data protection principles”,—[Official Report, 25/3/24; col. GC 146.]


but I hope he will at least accept that safeguards do not constitute a right. The fact that the Secretary of State has delegated powers to change the safeguards at will undermines his argument that UK citizens have lost nothing at all; they have lost the right not to be subject to an automated decision.

The fact that the Government have left some guard-rails for special category data is in itself an indication that they know they are downgrading UK data rights, because the safeguards in place are not adequate. If they were adequate, it would be unnecessary to separate out special category data in this way. I hammer the point home by asking the Minister to explain how the protections will work in practice in an era of AI, when risks can come from inference and data analytics that do not use special category data but will still have a profound impact on the work lives, health, finances and opportunities of data subjects. If data about your neighbourhood, shopping habits, search results, steps or entertainment choices is used to infer an important decision, how would a data subject activate their rights in that case?

As an illustration of this point, the daughter of a colleague of mine, who, as it happens, has a deep expertise in data law, this year undertook a video-based interview for a Russell group university with no human contact. It was not yet an ADM system, but we are inching ever closer to it. Removing the right, as the Government propose, would place the onus on students to complain or intervene—in a non-vexatious manner, of course. Will the Minister set out how UK citizens will be protected from life-changing decisions after government changes to Article 22, particularly as, in conjunction with other changes such as subject access requests and data impact assessments, UK citizens are about to have fewer routes to justice and less transparency of what is happening to their data?

I would also be grateful if the Minister could speak to whether he believes that the granularity and precision of current profiling deployed by AI and machine learning is sufficiently guaranteed to take this fundamental right away. Similarly, I hope that the known concerns about bias and fairness in ADM will be resolved over time, but we are not there yet, so why is it that the Government have a wait-and-see policy on regulation but are not offering the same “wait and see” in relation to data rights?

On Amendment 59 in the name of the noble Baroness, Lady Jones, the number of workers anticipated to be impacted by AI is simply eye-watering. In last Friday’s debate on AI, it was said to be 300 million worldwide, and one in four across Europe. But how workers work with AI is not simply a scary vision of the near future; it is here now.

I have a family member who last year left an otherwise well-paid and socially useful job when his employer introduced surveillance on to his computer while he was working from home. At the time, he said that the way in which it impacted on both his self-esteem and autonomy was so devastating that he felt like

“a cog in a machine or an Amazon worker with no agency or creativity”.

He was an exemplary employee: top of the bonus list and in all measurable ways the right person in the right job. Efficiency in work has a vital role but it is not the whole picture. We know that, if able and skilled workers lose their will to work, it comes at a considerable cost to the well-being of the nation and the public purse. Most jobs in future will involve working with or even collaborating with technology; ensuring that work is dignified and fair to the human components of this arrangement is not a drag on productivity but a necessity if society is to benefit from changes to technology.

--- Later in debate ---
The ICO also monitors the effects of AI on people and society using sources including its own casework, stakeholder engagement and wider intelligence gathering. The ICO is currently looking at how it might update its guidance on AI to improve its usability and is committed to incorporating any changes needed as a result of the Bill.
Lord Clement-Jones (LD)

I am processing what the Minister has just said. He said it complements the AI regulation framework, and then he went on to talk about the central risk function, the AI risk register and what the ICO is up to in terms of guidance, but I did not hear that the loosening of safeguards or rights under Clause 14 and Article 22 of the GDPR was heralded in the White Paper or the consultation. Where does that fit with the Government’s AI regulation strategy? There is a disjunct somewhere.

Viscount Camrose (Con)

I reject the characterisation of Clause 14, or any part of the Bill, as loosening the safeguards. It focuses on outcomes and, by being less prescriptive and more adaptive, its goal is to heighten the levels of safety of AI, whether through privacy or anything else. That is the purpose.

On Secretary of State powers in relation to ADM, the reforms will enable the Government to further describe what is and is not to be taken as a significant effect on a data subject and what is and is not to be taken as meaningful human—

--- Later in debate ---
Lord Harlech (Con)

My Lords, I think we should try to let the Minister make a little progress and see whether some of these questions are answered.

Lord Clement-Jones (LD)

I am sorry, but I just do not accept that intervention. This is one of the most important clauses in the whole Bill and we have to spend quite a bit of time teasing it out. The Minister has just electrified us all in what he said about the nature of this clause, what the Government are trying to achieve and how it fits within their strategy, which is even more concerning than previously. I am very sorry, but I really do not believe that this is the right point for the Whip to intervene. I have been in this House for 25 years and have never seen an intervention of that kind.

Viscount Camrose (Con)

Let me make the broad point that there is no single list of outcomes for the whole Bill but, as we go through clause by clause, I hope the philosophy behind it, of being less prescriptive about process and more prescriptive about the results of the process that we desire, should emerge—not just on Clause 14 but as the overall philosophy underlying the Bill. Regulation-making powers can also be used to vary the existing safeguards, add additional safeguards and remove additional safeguards added at a later date.

On the point about having regard, it is important that the law is drafted in a way that allows it to adapt as technology advances. Including prescriptive requirements in the legislation reduces this flexibility and undermines the purpose of this clause and these powers, which is to provide additional legal clarity when it is deemed necessary and appropriate in the light of the fast-moving advances in and adoption of technologies relevant to automated decision-making. I would like to reassure noble Lords that the powers can be used only to vary the existing safeguards, add additional safeguards and remove safeguards added by regulations at a later date. They cannot remove any of the safeguards written into the legislation.

Amendments 53 to 55 and 69 to 71 concern the Secretary of State powers relating to the terms “significant decisions” and “meaningful human involvement”. These powers enable the Secretary of State to provide a description of decisions that do or do not have a significant effect on data subjects, and describe cases that can be taken to have, or not to have, meaningful human involvement. As technology adoption grows and new technologies emerge, these powers will enable the Government to provide legal clarity, if and when deemed necessary, to ensure that people are protected and have access to safeguards when they matter most. In respect of Amendment 59A, Clause 50 already provides for an overarching requirement for the Secretary of State to consult the ICO and other persons the Secretary of State considers appropriate before making regulations under the UK GDPR, including for the measures within Article 22.

Also, as has been observed—I take the point about the limitations of this, but I would like to make the point anyway—any changes to the regulations are subject to the affirmative procedure and so must be approved by both Houses. As with other provisions of the Bill, the ICO will seek to provide organisations with timely guidance and support to assist them in interpreting and applying the legislation. As such, I would ask the noble Lord, Lord Clement-Jones, and my noble friend Lord Holmes—were he here—not to press their amendments.

Amendment 57 in the name of the noble Baroness, Lady Kidron, seeks to ensure that, when exercising regulation-making powers in relation to the safeguards in Article 22 of the UK GDPR, the Secretary of State should uphold the level of protection that children are entitled to in the Data Protection Act 2018. As I have said before, Clause 50 requires the Secretary of State to consult the ICO and other persons he or she considers appropriate. The digital landscape and its technologies evolve rapidly, presenting new challenges in safeguarding children. Regular consultations with the ICO and stakeholders ensure that regulations remain relevant and responsive to emerging risks associated with solely automated decision-making. The ICO has a robust position on the protection of children, as evidenced through its guidance and, in particular, the age-appropriate design code. As such, I ask the noble Baroness not to press her amendment.

Amendments 58, 72 and 73 seek to prevent the Secretary of State varying any of the safeguards mentioned in the reformed clauses. As I assured noble Lords earlier, the powers in this provision can be used only to vary the existing safeguards, add additional safeguards and remove additional safeguards added by regulation in future; there is not a power to remove any of the safeguards.

--- Later in debate ---
The Government take the view that these reforms clarify and simplify the rules related to solely automated decision-making without watering down any of the protections for data subjects or the fundamental data protection principles. In doing so, they will provide confidence to organisations looking to use these technologies in a responsible way, while driving economic growth and innovation. The Government want to provide the necessary future-proofing measures in an evolving technology landscape. This is why the Bill has been carefully designed to provide a future-proofed and flexible data protection regime for the UK. I therefore ask noble Lords not to oppose the question that Clause 14 stand part of the Bill.
Lord Clement-Jones (LD)

My Lords, I feel less reassured after this debate than I did even at the end of our two groups on Monday. I thank all those who spoke in this debate. There is quite a large number of amendments in this group, but a lot of them go in the same direction. I was very taken by what the noble Baroness, Lady Kidron, said: if the Government are offering safeguards and not rights, that is really extremely worrying. I also very much take on board what the noble Baroness, Lady Harding, had to say. Yes, of course we are in favour of automated decision-making, as it will make a big difference to our public services and quite a lot of private businesses, but we have to create the right ground rules around it. That is what we are talking about. We all very much share the question of children having a higher bar. The noble Baroness, Lady Jones, outlined exactly why the Secretary of State’s powers either should not be there or should not be expressed in the way that they are. I very much hope that the Minister will write on that subject.

More broadly, there are huge issues here. I think that it was the noble Baroness, Lady Kidron, who first raised the fact that the Government seem to be regulating in a specific area relating to AI that is reducing rights. The Minister talks about now regulating outcomes, not process. As the noble Baroness, Lady Jones, said, we do not have any criteria—what KPIs are involved? The process is important—the ethics by which decisions are made and the transparency involved. I cannot see that it is simply about whether the outcome is such and such; it is about the way in which people make decisions. I know that people like talking about outcome-based regulation, but it is certainly not the only important aspect of regulation.

On the issue of removing prescriptiveness, I am in favour of ethical prescriptiveness, so I cannot see that the Minister has made a particularly good case for the changes made under Clause 14. He talked about having access to safeguards when they matter most. It would be far preferable to have rights that can be exercised in the face of automated decision-making, in particular workplace protection. At various points during the debates on the Bill we have touched on things such as algorithmic impact assessment in the workplace and no doubt we will touch on it further. That is of great and growing importance, but again there is no recognition of that.

I am afraid that the Minister has not made a fantastic case for keeping Clause 14 and I think that most of us will want to kick the tyres and carry on interrogating whether it should be part of the Bill. In the meantime, I beg leave to withdraw Amendment 53.

Amendment 53 withdrawn.
--- Later in debate ---
Moved by
74: After Clause 14, insert the following new Clause—
“Use of the Algorithmic Transparency Recording Standard(1) The Secretary of State must by regulations make provision requiring Government departments, public authorities and all persons exercising a public function using algorithmic tools to process personal data to use the Algorithmic Transparency Recording Standard (“the Standard”).(2) The Standard is that published by the Central Digital and Data Office and Centre for Data Ethics and Innovation as part of the Government’s National Data Strategy.(3) Regulations under subsection (1) must require the submission and publication of algorithmic transparency reports as required by the Standard.(4) Regulations under subsection (1) may provide for exemptions to the requirement for publication where necessary—(a) to avoid obstructing an official or legal inquiry, investigation or procedure, (b) to avoid prejudicing the prevention, detection, investigation or prosecution of criminal offences or the execution of criminal penalties,(c) to protect public security, or(d) to safeguard national security.(5) Regulations under subsection (1) are subject to the affirmative resolution procedure.”Member’s explanatory statement
This new Clause puts a legislative obligation on public bodies using algorithmic tools that have a significant influence on a decision-making process with direct or indirect public effect, or directly interact with the general public, to publish reports under the Algorithmic Transparency Recording Standard.
Lord Clement-Jones (LD)

My Lords, the Central Digital and Data Office, or CDDO, and the Centre for Data Ethics and Innovation, as it was then called—it now has a new name as a unit of DSIT—launched the algorithmic transparency recording standard in November 2021. The idea for the ATRS arose from a recommendation by the CDEI that the UK Government should place a mandatory transparency obligation on public sector organisations using algorithms to support “significant decisions affecting individuals”. It is intended to help public sector organisations to provide clear information about the algorithmic tools that they use, how they operate and why they are using them.

The ATRS is a promising initiative that could go some way to addressing the current transparency deficit around the use of algorithmic and AI tools by public authorities. Organisations are encouraged to submit reports about each algorithmic tool that they are using that falls within the scope of the standard.

We welcome the recent commitments made in the Government’s response to the AI regulation White Paper consultation to make the ATRS a requirement for all government departments. However, we believe that this is an opportunity to deliver on this commitment through the DPDI Bill, by placing it on a statutory footing rather than it being limited to a requirement in guidance. That is what Amendment 74 is designed to do.

We also propose another new clause that should reflect the Government’s commitment to algorithmic transparency. It would require the Secretary of State to introduce a compulsory transparency reporting requirement, but only when she or he considers it appropriate to do so. It is a slight watering-down of Amendment 74, but it is designed to tempt the Minister into further indiscretions. In support of transparency, the new clause would, for as long as the Secretary of State considers making the ATRS compulsorily inappropriate, also require the Secretary of State to regularly explain why and keep her decision under continual review.

Amendment 76 on safe and responsible automated decision systems proposes a new clause that seeks to shift the burden back on public sector actors. It puts the onus on them to ensure safety and prevent harm, rather than waiting for harm to occur and putting the burden on individuals to challenge it. It imposes a proactive statutory duty, similar to the public sector equality duty under Section 149 of the Equality Act 2010, to have “due regard” to ensuring that

“automated decision systems … are responsible and minimise harm to individuals and society at large”.

The duty incorporates the key principles in the Government’s AI White Paper and therefore is consistent with its substantive approach. It also includes duties to be proportionate, to give effect to individuals’ human rights and freedoms and to safeguard democracy and the rule of law. It applies to all “automated decision systems”. These are

“any tool, model, software, system, process, function, program, method and/or formula designed with or using computation to automate, analyse, aid, augment, and/or replace human decisions that impact the welfare, rights and freedoms of individuals”.

This therefore applies to partly automated decisions, as well as those that are entirely automated, and systems in which multiple automated decision processes take place.

It applies to traditional public sector actors: public authorities, or those exercising public functions, including private actors outsourced by the Government to do so; those that may exercise control over automated decision systems, including regulators; as well as those using data collected or held by a public authority, which may be public or private actors. It then provides one mandatory mechanism through which compliance with the duty must be achieved—impact assessments. We had a small debate about the ATRS and whether a compliance system was in place. It would be useful to see whether the Minister has any further comment on that, but I think that he disagreed with my characterisation that there is no compliance system currently.

This provision proposes impact assessments. The term used, “algorithmic impact assessment”, is adopted from Canada’s analogous directive on automated decision-making, which mandates the use of AIAs for all public sector automated decision systems. The obligation is on the Secretary of State, via regulations, to set out a framework for AIAs, which would help actors to uphold their duty to ensure that automated decision systems are responsible and safe; to understand and to reduce the risks in a proactive and ongoing way; to introduce the appropriate governance, oversight, reporting and auditing requirements; and to communicate in a transparent and accessible way to affected individuals and the wider public.

Amendment 252 would require a list of UK addresses to be made freely available for reuse. Addresses have been identified as a fundamental geospatial dataset by the UN and a high-value dataset by the EU. Address data is used by tens of thousands of UK businesses, including for delivery services and navigation software. Crucially, address data can join together different property-related data, such as energy performance certificates or Land Registry records, without using personal information. This increases the value of other high-value public data.

--- Later in debate ---
Viscount Camrose (Con)

I feel under amazing pressure to get the names right, especially given the number of hours we spend together.

I thank the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, for tabling Amendments 74 to 78, 144 and 252 in this group. I also extend my thanks to noble Lords who have signed the amendments and spoken so eloquently in this debate.

Amendments 74 to 78 would place a legislative obligation on public authorities and all persons in the exercise of a public function to publish reports under the Algorithmic Transparency Recording Standard—ATRS—or to publish algorithmic impact assessments. These would provide information on algorithmic tools and algorithm-assisted decisions that process personal data in the exercise of a public function or those that have a direct or indirect public effect or directly interact with the general public. I remind noble Lords that the UK’s data protection laws will continue to apply throughout the processing of personal data.

The Government are already taking action to establish the necessary guard-rails for AI, including to promote transparency. In the AI regulation White Paper response, we announced that the use of the ATRS will now become a requirement for all government departments and the broader public sector. The Government are phasing this in as we speak and will check compliance accordingly, as DSIT has been in contact with every department on this issue.

In making this policy, the Government are taking an approach that provides increasing degrees of mandation of the ATRS, with appropriate exemptions, allowing them to monitor compliance and effectiveness. The announcement in the White Paper response has already led to more engagement from across government, and more records are under way. The existing process focuses on the importance of continuous improvement and development. Enshrining the standard into law prematurely, amid exponential technological change, could hinder its adaptability.

More broadly, our AI White Paper outlined a proportionate and adaptable framework for regulating AI. As part of that, we expect AI development and use to be fair, transparent and secure. We set out five key principles for UK regulators to interpret and apply within their remits. This approach reflects the fact that AI systems are not unregulated and need to be compliant with existing regulatory frameworks, including employment, human rights, health and safety and data protection law.

For instance, the UK’s data protection legislation imposes obligations on data controllers, including providers and users of AI systems, to process personal data fairly, lawfully and transparently. Our reforms in this Bill will ensure that, where solely automated decision-making is undertaken—that is, ADM without any meaningful human involvement that has significant effects on data subjects—data subjects will have a right to the relevant safeguards. These safeguards include being provided with information on the ADM that has been carried out and the right to contest those decisions and seek human review, enabling controllers to take suitable measures to correct those that have produced wrongful outcomes.

Lord Clement-Jones (LD)

My Lords, I wonder whether the Minister can comment on this; he can write if he needs to. Is he saying that, in effect, the ATRS is giving the citizen greater rights than are ordinarily available under Article 22? Is that the actual outcome? If, for instance, every government department adopted ATRS, would that, in practice, give citizens a greater degree of what he might put as safeguards but, in this context, he is describing as rights?

Viscount Camrose (Con)

I am very happy to write to the noble Lord, but I do not believe that the existence of an ATRS-generated report in and of itself confers more rights on anybody. Rather, it makes it easier for citizens to understand how their rights are being used, what rights they have, or what data about them is being used by the department concerned. The existence of data does not in and of itself confer new rights on anybody.

Lord Clement-Jones (LD)

I understand that, but if he rewinds the reel he will find that he was talking about the citizen’s right of access, or something of that sort, at that point. Once you know what data is being used, the citizen has certain rights. I do not know whether that follows from the ATRS or he was just describing that at large.

Viscount Camrose (Con)

As I said, I will write. I do not believe that follows axiomatically from the ATRS’s existence.

On Amendment 144, the Government are sympathetic to the idea that the ICO should respond to new and emerging technologies, including the use of children’s data in the development of AI. I assure noble Lords that this area will continue to be a focus of the ICO’s work and that it already has extensive powers to provide additional guidance or make updates to the age-appropriate design code, to ensure that it reflects new developments, and a responsibility to keep it up to date. The ICO has a public task under Article 57(1)(b) of the UK GDPR to

“promote public awareness and understanding of the risks, rules, safeguards and rights in relation to processing”.

It is already explicit that:

“Activities addressed specifically to children shall receive specific attention”.


That code already includes a chapter on profiling and provides guidance on fairness and transparency requirements around automated decision-making.

Taking the specific point made by the noble Baroness, Lady Kidron, on the contents of the ICO’s guidance, while I cannot speak to the ICO’s decisions about the drafting of its guidance, I am content to undertake to speak to it about this issue. I note that it is important to be careful to avoid a requirement for the ICO to duplicate work. The creation of an additional children’s code focused on AI could risk fragmenting approaches to children’s protections in the existing AADC—a point made by the noble Baroness and by my noble friend Lady Harding.

--- Later in debate ---
Viscount Camrose (Con)

We have some numbers that I will come to, but I am very happy to share deeper analysis of that with all noble Lords.

There is also free access to this data for developers to innovate in the market. The Government also make this data available for free at the point of use to more than 6,000 public sector organisations, as well as postcode, unique identifier and location data available under open terms. The Government explored opening address data in 2016. At that time, it became clear that the Government would have to pay to make this data available openly or to recreate it. That was previously attempted, and the resulting dataset had, I am afraid, critical quality issues. As such, it was determined at that time that the changes would result in significant additional cost to taxpayers and represent low value for money, given the current widespread accessibility of the data. For the reasons I have set out, I hope that the noble Lords will withdraw their amendments.

Lord Clement-Jones (LD)

My Lords, I thank the Minister for his response. There are a number of different elements to this group.

The one bright spot in the White Paper consultation is the ATRS. That was what the initial amendments in this group were designed to give a fair wind to. As the noble Lord, Lord Bassam, said, this is designed to assist in the adoption of the ATRS, and I am grateful for his support on that.

--- Later in debate ---
Finally, Amendments 105 and 107 would reinstate and reinforce the reporting requirement on controllers in the event that the children’s data protection impact assessment, as proposed in Amendment 96, requires the controller to consult with the commissioner because the processing is high risk. Amendment 105 is consequential, while Amendment 107 is substantive. I beg to move.
Lord Clement-Jones (LD)

My Lords, I will speak to almost all the amendments in this group, other than those proposed by the noble Baroness, Lady Kidron. I am afraid that this is a huge group; we probably should have split it to have a better debate, but that is history.

I very much support what the noble Baroness said about her amendments, particularly Amendment 79. The mandation of ethics by design is absolutely crucial. There are standards from organisations such as the IEEE for that kind of ethics by design in AI systems. I believe that it is possible to do exactly what she suggested, and we should incorporate that into the Bill. It illustrates that process is as important as outcomes. We are getting to a kind of philosophical approach here, which illustrates the differences between how some of us and the Government are approaching these things. How you do something, the way you design it and the fact that it needs to be ethical is absolutely cardinal in any discussion—particularly about artificial intelligence. I do not think that it is good enough simply to talk about the results of what AI does without examining how it does it.

Having said that, I turn to Amendment 80 and the Clause 16 stand part notice. Under Clause 16, the Government are proposing to remove Article 27 of the UK GDPR without any replacement. By removing the legal requirement on non-UK companies to retain a UK representative, the Government would deprive individuals of a local, accessible point of contact through which people can make data protection rights requests. That decision threatens people’s capacity to exercise their rights, reducing their ability to remain in control of their personal information.

The Government say that removing Article 27 will boost trade with the UK by reducing the compliance burden on non-UK businesses. But they have produced little evidence to support the notion that this will be the case and have overlooked the benefits in operational efficiency and cost savings that the representative can bring to non-UK companies. Even more worryingly, the Government appear to have made no assessment of the impact of the change on UK individuals, in particular vulnerable groups such as children. It is an ill-considered policy decision that would see the UK take a backward step in regulation at a time when numerous other jurisdictions, such as Switzerland, Turkey, South Korea, China and Thailand, are choosing to safeguard the extraterritorial application of their data protection regimes through the implementation of the legal requirement to appoint a representative.

The UK representative ensures that anyone in the UK wishing to make a privacy-related request has a local, accessible point of contact through which to do so. The representative plays a critical role in helping people to access non-UK companies and hold them accountable for the processing of their data. The representative further provides a direct link between the ICO and non-UK companies to enable the ICO to enforce the UK data protection regime against organisations outside the UK.

On the trade issue, the Government argue that by eliminating the cost of retaining a UK representative, non-UK companies will be more inclined to offer goods and services to individuals in the UK. Although there is undeniably a cost to non-UK companies of retaining a representative, the costs are significantly lower than the rather disproportionately inflated figures that were cited in the original impact assessment, which in some cases were up to 10 times the average market rate for representative services. The Government have put forward very little evidence to support the notion that removing Article 27 will boost trade with the UK.

There is an alternative approach. Currently, the Article 27 requirement to appoint a UK representative applies to data controllers and processors. An alternative approach to the removal of Article 27 in its entirety would be to retain the requirement but limit its scope so that it applies only to controllers. Along with the existing exemption at Article 27(2), this would reduce the number of non-UK companies required to appoint a representative, while arguably still preserving a local point of contact through which individuals in the UK can exercise their rights, as it is data controllers that are obliged under Articles 15 to 22 of the UK GDPR to respond to data subject access requests. That is a middle way that the Government could adopt.

Moving to Amendment 82, at present, the roles of senior responsible individual in the Bill and data protection officer under the EU GDPR appear to be incompatible. That is because the SRI is part of the organisation’s senior management, whereas a DPO must be independent of an organisation’s senior management. This puts organisations caught by both the EU GDPR and the UK GDPR in an impossible situation. At the very least, the Government must explain how they consider that these organisations can comply with both regimes in respect of the SRI and DPO provisions.

The idea of getting rid of the DPO runs completely contrary to the way in which we need to think about accountability for AI systems. We need senior management who understand the corporate significance of the AI systems they are adopting within the business. The ideal way forward would be for the DPO to be responsible for that when AI regulation comes in, but the Government seem to be completely oblivious to that. Again, it is highly frustrating for those of us who thought we had a pretty decent data protection regime to find this kind of watering down taking place in the face of the risks from artificial intelligence that are becoming more and more apparent as the days go by. I firmly believe that it will inhibit the application and adoption of AI within businesses if we do not have public trust and business certainty.

I now come to oppose the question that Clause 18, on the duty to keep records, stand part of the Bill. This clause seems to masquerade as an attempt to get rid of red tape. In reality, it makes organisations less likely to be compliant with the main obligations in the UK GDPR, as it will be amended by the Bill, and therefore heightens the risk both to the data subjects whose data they hold and to the organisations in terms of non-compliance. This is, of course, the duty to keep records. It is particularly unfair on small businesses that do not have the resources to take advice on these matters. Records of processing activities are one of the main ways in which organisations can meet the requirements of Article 5(2) of the UK GDPR to demonstrate their compliance. The obligation to demonstrate compliance remains unaltered under the Bill. Therefore, dispensing with the main way of achieving compliance with Article 5(2) is impractical and unhelpful.

At this point, I should say that we support Amendment 81 in the name of the noble Baroness, Lady Jones, which concerns the assessment of high-risk processing.

Our amendments on data protection impact assessments are Amendments 87, 88 and 89. Such assessments are currently required under Article 35 of the UK GDPR and are essential to ensuring that organisations do not deploy, and individuals are not subjected to, systems that may lead to unlawful, rights-violating or discriminatory outcomes. The Government’s data consultation response noted:

“The majority of respondents agreed that data protection impact assessments requirements are helpful in identifying and mitigating risk, and disagreed with the proposal to remove the requirement to undertake data protection impact assessments”.


However, under Clause 20, the requirement to perform an impact assessment would be seriously diluted. That is all I need to say. The Government frequently pray in aid the consultation—they say, “Well, we did that because of the consultation”—so why are they flying in the face of it? That seems an extraordinary thing to do in circumstances where impact assessments are regarded as a useful tool and training by business has clearly adjusted to them over the years since the Data Protection Act 2018.

Baroness Harding of Winscombe (Con)

My Lords, I rise to speak in support of Amendments 79, 83, 85, 86, 93, 96, 97, 105 and 107, to which I have added my name. An awful lot has already been said. Given the hour of the day, I will try to be brief, but I want to speak to the child amendments I have put my name to and to the non-child ones and to raise things up a level.

The noble Lord, Lord Clement-Jones, talked about trust. I have spent the best part of the past 15 years running consumer and citizen digitally enabled services. The benefit that technology brings to life is clear to me but—this is a really important “but”—our customers and citizens need to trust what we do with their data, so establishing trust is really important.

One bedrock of that trust is forcing—as a non-technologist, I use that word advisedly—technologists to set out what they are trying to do, what the technology they propose to build will do and what the risks and opportunities of that technology are. My experience as a non-engineer is that when you put engineers under pressure, they can speak English, but it is not their preferred language. They do not find it easy to articulate the risks and opportunities of the technology they are building, which is why forcing businesses that build these services to set out in advance the data protection impacts of the services they are building is so important. It is also why you have to design with safety in mind upfront, because technology is so hard to retrofit. If you do not design it up front with ethics and safety at its core, it is gone by the time you see the impact in the real world.

--- Later in debate ---
Viscount Camrose (Con)

I thank the noble Baronesses, Lady Kidron and Lady Jones, and the noble Lord, Lord Clement-Jones, for their amendments, and I look forward to receiving the letter from the noble Baroness, Lady Kidron, which I will respond to as quickly as I can. As everybody observed, this is a huge group, and it has been very difficult for everybody to do justice to all the points. I shall do my best, but these are points that go to the heart of the changes we are making. I am very happy to continue engaging on that basis, because we need plenty of time to review them—but, that said, off we go.

The changes the Government are making to the accountability obligations are intended to make the law clearer and less prescriptive. They will enable organisations to focus on areas that pose high risks to people, resulting, the Government believe, in improved outcomes. The new provisions on assessments of high-risk processing are less prescriptive about the precise circumstances in which a risk assessment would be required, as we think organisations are best placed to judge whether a particular activity poses a high risk to individuals in the context of the situation.

However, the Government are still committed to high standards of data protection, and there are many similarities between our new risk assessment measures and the previous provisions. When an organisation is carrying out processing activities that are likely to pose a high risk to individuals, it will still be expected to document that processing, assess risks and identify mitigations. As before, no such document would be required where organisations are carrying out low-risk processing activities.

One of the main aims of the Bill is to remove some of the UK GDPR’s unnecessary compliance burdens. That is why organisations will be required to designate senior responsible individuals, keep records of processing and carry out the risk assessments above only when their activities pose high risks to individuals.

Lord Clement-Jones (LD)

The noble Viscount is very interestingly unpacking a risk-based approach to data protection under the Bill. Why are the Government not taking a risk-based approach to their AI regulation? After all, the AI Act approaches it in exactly that way.

--- Later in debate ---
Viscount Camrose (Con)

I will briefly address it now. Based on that letter, the Government’s view is to avoid prescription and I believe that the ICO’s view—I cannot speak for it—is generally the same, except for a few examples where prescription needs to be specified in the Bill. I will continue to engage with the ICO on where exactly to draw that line.

Lord Clement-Jones (LD)

My Lords, I can see that there is a difference of opinion, but it is unusual for a regulator to go into print with it. Not only that, but he has set it all out in an annexe. What discussion is taking place directly between the Minister and his team and the ICO? There seems to be quite a gulf between them. This is number 1 among his “areas of ongoing concern”.

Viscount Camrose (Con)

I do not know whether it is usual or unusual for the regulator to engage in this way, but the Bill team engages with the Information Commissioner frequently and regularly, and, needless to say, it will continue to do so on this and other matters.

Children need particular protection when organisations are collecting and processing their personal data, because they may be less aware of the risks involved. If organisations process children’s personal data, they should think about the need to protect them from the outset and design their systems and processes with this in mind.

Before I turn to the substance of what the Bill does with the provisions on high-risk processing, I will deal with the first amendment in this group: Amendment 79. It would require data processors to consider data protection-by-design requirements in the same way that data controllers do, because there is a concern that controllers may not always be able to foresee what processors do with people’s data for services such as AI and cloud computing.

However, under the current legislation, it should not be for the processor to determine the nature or purposes of the processing activity, as it will enter a binding controller-processor agreement or contract to deliver a specific task. Processors also have specific duties under the UK GDPR to keep personal data safe and secure, which should mean that this amendment is not necessary.

I turn to the Clause 16 stand part notice, which seeks to remove Clause 16 from the Bill and reinstate Article 27, and Amendment 80, which seeks to do the same but just in respect of overseas data controllers, not processors. I assure the noble Lord, Lord Clement-Jones, that, even without the Article 27 representative requirement, controllers and processors will still have to maintain contact and co-operation with UK data subjects and the ICO to comply with the UK GDPR provisions. These include Articles 12 to 14, which, taken together, require controllers to provide their contact details in a concise, transparent, intelligible and easily accessible form, using clear and plain language, particularly for any information addressed specifically to a child.

By offering firms a choice on whether to appoint a representative in the UK to help them with UK GDPR compliance and no longer mandating organisations to appoint a representative, we are allowing organisations to decide for themselves the best way to comply with the existing requirements for effective communication and co-operation. Removing the representative requirement will also reduce unnecessary burdens on non-UK controllers and processors while maintaining data subjects’ safeguards and rights. Any costs associated with appointing a representative are a burden on and a barrier to trade. Although the variety of packages made available by representative provider organisations differ, our assessments show that the cost of appointing representatives increases with the size of a firm. Furthermore, there are several jurisdictions that do not have a mandatory or equivalent representative requirement in their data protection law, including other countries in receipt of EU data adequacy decisions.

Lord Clement-Jones (LD)

Nevertheless, does the Minister accept that quite a lot of countries have now begun the process of requiring representatives to be appointed? How does he account for that? Does he accept that what the Government are doing is placing the interests of business over those of data subjects in this context?

Viscount Camrose (Con)

No, I do not accept that at all. I would suggest that we are saying to businesses, “You must provide access to the ICO and data subjects in a way that is usable by all parties, but you must do so in the manner that makes the most sense to you”. That is a good example of going after outcomes but not insisting on any particular process or methodology in a one-size-fits-all way.

Viscount Camrose (Con)

Yes—if the person they were supposed to communicate with did not speak English or was not available during reasonable hours, that would be in violation of the requirement.

I apologise if we briefly revisit some of our earlier discussion here, but Amendment 81 would reintroduce a list of high-risk processing activities drawn from Article 35 of the UK GDPR, with a view to helping data controllers comply with the new requirements around designating a senior responsible individual.

The Government have consulted closely with the ICO throughout the development of all the provisions in the Bill, and we welcome its feedback as it upholds data subjects’ rights. We recognise and respect that the ICO’s view on this issue is different to the Government’s, but the Government feel that adding a prescriptive list to the legislation would not be appropriate for the reasons we have discussed. However, as I say, we will continue to engage with it over the course of the passage of the Bill.

Some of the language in Article 35 of the UK GDPR is unclear and confusing, which is partly why we removed it in the first place. We believe organisations should have the ability to make a judgment of risk based on the specific nature, scale and context of their own processing activities. We do not need to provide prescriptive examples of high-risk processing on the face of legislation because any list could quickly become out of date. Instead, to help data controllers, Clause 20 requires the ICO to produce a document with examples of what the commissioner considers to be high-risk processing activities.

I turn to Clause 17 and Amendment 82. The changes we are making in the Bill will reduce prescription by removing the requirement to appoint a data protection officer in certain circumstances. Instead, public bodies and other organisations carrying out high-risk processing activities will have to designate a senior responsible individual to ensure that data protection risks are managed effectively within their organisations. That person will have flexibility about how they manage data protection risks. They might decide to delegate tasks to independent data protection experts or upskill existing staff members, but they will not be forced to appoint data protection officers if suitable alternatives are available.

The primary rationale for moving to a senior responsible individual model is to embed data protection at the heart of an organisation by ensuring that someone in senior management takes responsibility and accountability for it if the organisation is a public body or is carrying out high-risk processing. If organisations have already appointed data protection officers and want to keep an independent expert to advise them, they will be free to do so, providing that they also designate a senior manager to take overall accountability and provide sufficient support, including resources.

Amendment 83, tabled by the noble Baroness, Lady Kidron, would require the senior responsible individual to specifically consider the risks to children when advising the controller on its responsibilities. As drafted, Clause 17 of the Bill requires the senior responsible individual to perform a number of tasks or, if they cannot do so themselves, to make sure that they are performed by another person. They include monitoring the controller’s compliance with the legislation, advising the controller of its obligations and organising relevant training for employees who carry out the processing of personal data. Where the organisation is processing children’s data, all these requirements will be relevant. The senior responsible individual will need to make sure that any guidance and training reflects the type of data being processed and any specific obligations the controller has in respect of that data. I hope that this goes some way to convincing the noble Baroness not to press her amendment.

Lord Clement-Jones (LD)

The Minister has not really explained the reason for the switch from the DPO to the new system. Is it another one of his “We don’t want a one-size-fits-all approach” arguments? What is the underlying rationale for it? Looking at compliance costs, which the Government seem to be very keen on, we will potentially have a whole new cadre of people who will need to be trained in compliance requirements.

Viscount Camrose (Con)

The data protection officer—I speak as a recovering data protection officer—is tasked with certain specific outcomes but does not necessarily have to be a senior person within the organisation. Indeed, in many cases, they can be an external adviser to the organisation. On the other hand, the senior responsible individual is a senior or board-level representative within the organisation and can take overall accountability for data privacy and data protection for that organisation. Once that accountable person is appointed, he or she can of course appoint a DPO or equivalent role or separate the role among other people as they see fit. That gives everybody the flexibility to meet the needs of privacy as they see fit, but not necessarily in a one-size-fits-all way. That is the philosophical approach.

Lord Clement-Jones (LD)

Does the Minister accept that the SRI will have to cope with having at least a glimmering of an understanding of what will be a rather large Act?

Viscount Camrose (Con)

Yes, the SRI will absolutely have to understand all the organisation’s obligations under this Act and indeed other Acts. As with any senior person in any organisation responsible for compliance, they will need to understand the laws that they are complying with.

Amendment 84, tabled by the noble Lord, Lord Clement-Jones, is about the advice given to senior responsible individuals by the ICO. We believe that the commissioner should have full discretion to enforce data protection in an independent, flexible, risk-based and proportionate manner. The amendment would tie the hands of the regulator and force them to give binding advice and proactive assurance without full knowledge of the facts, undermining their regulatory enforcement role.

--- Later in debate ---
Baroness Kidron (CB)

I would like to hear from the Minister.

Lord Clement-Jones (LD)

Yes. We will not stand on ceremony.

Baroness Jones of Whitchurch (Lab)

As long as that applies to us on occasion as well.

--- Later in debate ---
Debate on whether Clause 19 should stand part of the Bill.
Lord Clement-Jones (LD)

My Lords, just in passing, I will say that I am beginning to feel that the decision made by the Privileges Committee and now the House is starting to creak in the face of the very first Grand Committee that it has encountered. So, in terms of time limits, I think flexibility in Grand Committee in particular is absolutely crucial. I am afraid that the current procedures will not necessarily stand the test of time—but we shall see.

This is a relatively short debate on whether Clause 19 should stand part, but it is a really significant clause, and it is another non-trust-engendering provision. This basically takes away the duty of the police to provide justification for why they are consulting or sharing personal data. Prompted by the National AIDS Trust, we believe that the Bill must retain the duty on police forces to justify why they have accessed an individual’s personal data.

This clause removes an important check on police processing of an individual’s personal data. The NAT has been involved in cases of people living with HIV whose HIV status was shared without their consent by police officers, both internally within their police station and within the wider communities that they serve. Therefore, ensuring that police officers justify why they have accessed an individual’s personal data is vital evidence in cases of police misconduct. Such cases include when a person’s HIV status is shared inappropriately by the police, or when it is not relevant to an investigation of criminal activity.

The noble Baroness, Lady Kidron, was extremely eloquent in her winding up of the last group. The Minister really needs to come back and tell us what on earth the motivation is behind this particular Clause 19. I beg to move that this clause should not stand part of the Bill.

--- Later in debate ---
Viscount Camrose (Con)

This is a mercifully short group on this occasion. I thank the noble Lord, Lord Clement-Jones, for the amendment, which seeks to remove Clause 19 from the Bill. Section 62 of the Data Protection Act requires law enforcement agencies to record when personal data has been accessed and why. Clause 19 does not remove the need for police to justify their processing; it simply removes the ineffective administrative requirement to record that justification in a log.

The justification entry was intended to help to monitor and detect unlawful access. However, the reality is that anyone accessing data unlawfully is very unlikely to record an honest justification, making this in practice an unreliable means of monitoring misconduct or unlawful processing. Records of when data was accessed and by whom can be automatically captured and will remain, thereby continuing to ensure accountability.

In addition, the National Police Chiefs’ Council’s view is that this change will not hamper any investigations to identify the unlawful processing of data. That is because it is unlikely that an individual accessing data unlawfully would enter an honest justification, so capturing this information is unlikely to be useful in any investigation into misconduct. The requirements to record the time, date and, as far as possible, the identity of the person accessing the data will remain, as will the obligation that there is a lawful reason for the access, ensuring that accountability and protection for data subjects are maintained.

Police officers inform us that the current requirement places an unnecessary burden on them as they have to update the log manually. The Government estimate that the clause could save approximately 1.5 million policing hours, representing a saving in the region of £46.5 million per year.

I understand that the amendment relates to representations made by the National AIDS Trust concerning the level of protection for people’s HIV status. As I believe I said on Monday, the Government agree that the protection of people’s HIV status is vital. We have met the National AIDS Trust to discuss the best solutions to the problems it has raised. For these reasons, I hope the noble Lord will not oppose Clause 19 standing part.

Lord Clement-Jones (LD)

I thank the Minister for his response, but he has left us tantalised about the outcome of his meeting. What is the solution that he has suggested? We are none the wiser as a result of his response.

This pudding has been well over-egged by the National Police Chiefs’ Council. Already, only certain senior officers and the data protection leads in police forces have access to this functionality. There will continue to be a legal requirement to record the time and date of access. They are required to follow a College of Policing code of practice. Is the Minister really saying that recording a justification for accessing personal data is such an onerous requirement that £46.5 million in police time will be saved as a result of this? Over what period? That sounds completely disproportionate.

The fact is that the recording of the justification, even where it is false and cannot be relied upon as evidence, is rather useful, because it is evidence of police misconduct in relation to inappropriately accessing personal data. The police are in effect saying, “We did it for this purpose”, when clearly it was not done for that purpose. I am not at all surprised that the National AIDS Trust is worried about this. The College of Policing code of practice does not mention logging requirements in detail. It references them just once, in relation to automated systems that process data.

I am extremely grateful to the noble Lord, Lord Bassam, for what he had to say. It seems to me that we do not have any confidence on this side of the House that removing this requirement provides enough security that officers will be held to account if they share an individual’s special category data inappropriately. I do not think the Minister has really answered the concerns, but I beg leave to withdraw my objection to the clause standing part.

Clause 19 agreed.
--- Later in debate ---
The reality is that, without an obligation for these scrapers to provide transparency of who they are, the identity of their scraper, and the purpose for which they are scraping before the activity takes place, it is currently impossible for almost anyone, whether a data controller or data subject, to exercise their information rights. When the noble Lord the Minister responds, I hope that he will acknowledge that it is neither proportionate nor practical to ask the general public or small business to undertake a computer science degree or equivalent in order to access their data rights, and that the widespread abuse of UK data rights by web scraping without permission undermines the very purpose of the legislation. I beg to move.
Lord Clement-Jones (LD)

My Lords, given the hour, I will be brief. That was an absolute tour de force by the noble Baroness. As with all the Minister’s speeches, I will read her speech over Easter.

I was very interested to be reminded of the history of Napster, because that was when many of us realised that we were, in many ways, entering the digital age in the creative industries and beyond. The amendments that the noble Baroness put forward are examples of where the Bill could make a positive impact, unlike the impact that so much of the rest of it is making in watering down rights. She described cogently how large language models are ingesting or scraping data from the internet, social media and journalism, how very close to the ingestion of copyright material this whole agenda is and how it is being done by anonymous bots in particular. It fits very well with the debate in which the Minister was involved last Friday on the Private Member’s Bill of the noble Lord, Lord Holmes, who inserted a clause requiring transparency on the ingestion or scraping of data and copyright material by large language models. It is very interesting.

The opportunity in the data area is currently much greater than it is in the intellectual property area. At least we have the ICO, which is a regulator, unlike the IPO, which is not really a regulator with teeth. I am very interested in the fact that the ICO is conducting a consultation on generative AI and data protection, which it launched in January. Conterminously with this Bill, perhaps the ICO might come to some conclusions that we can use. That would of course include the whole area of biometrics, which, in the light of things such as deepfakes and so on, is increasingly an issue of great concern. The watchword is “transparency”: we must impose a duty on the generative AI models regarding the material that they use to train their models and then use in operation. I fully support Amendments 103 and 104 in the name of the noble Baroness, even though, as she describes them, they are a small step.

Baroness Jones of Whitchurch (Lab)

My Lords, I, too, will be relatively brief. I thank the noble Baroness, Lady Kidron, for her amendments, to which I was very pleased to add my name. She raised an important point about the practice of web scrapers, who take data from a variety of sources to construct large language models without the knowledge or permission of web owners and data subjects. This is a huge issue that should have been a much more central focus of the Bill. Like the noble Baroness, I am sorry that the Government did not see fit to use the Bill to bring in some controls on this increasingly prevalent practice, because that would have been a more constructive use of our time than debating the many unnecessary changes that we have been debating so far.

As the noble Baroness said, large language models are built on capturing text, data and images from infinite sources without the permission of the original creator of the material. As she also said, it is making a mockery of our existing data rights. It raises issues around copyright and intellectual property, and around personal information that is provided for one purpose and commandeered by web scrapers for another. That process often happens in the shadows, whereby the owner of the information finds out only much later that their content has been repurposed.

What is worse is that the application of AI means that material provided in good faith can be distorted or corrupted by the bots scraping the internet. The current generation of LLMs is notorious for hallucinations in which good quality research or journalistic copy is misrepresented or misquoted in its new incarnation. There are also numerous examples of bias creeping into the LLM output, which includes personal data. As the noble Baroness rightly said, the casual scraping of children’s images and data is undermining the very essence of our existing data protection legislation.

It is welcome that the Information Commissioner has intervened on this. He argued that LLMs should be compliant with the Data Protection Act and should evidence how they are complying with their legal obligations. This includes individuals being able to exercise their information rights. Currently, we are a long way from that being a reality in practice. This is about enforcement as much as giving guidance.

I am pleased that the noble Baroness tabled these amendments. They raise important issues about individuals giving prior permission for their data to be used unless there is an easily accessible opt-out mechanism. I would like to know what the Minister thinks about all this. Does he think that the current legislation is sufficient to regulate the rise of LLMs? If it is not, what are the Government doing to address the increasingly widespread concerns about the legitimacy of web scraping? Have the Government considered using the Bill to introduce additional powers to protect against the misuse of personal and creative output?

In the meantime, does the Minister accept the amendments in the name of the noble Baroness, Lady Kidron? As we have said, they are only a small part of a much bigger problem, but they are a helpful initiative to build in some basic protections in the use of personal data. This is a real challenge to the Government to step up to the mark and be seen to address these important issues. I hope the Minister will say that he is happy to work with the noble Baroness and others to take these issues forward. We would be doing a good service to data citizens around the country if we did so.

--- Later in debate ---
Viscount Camrose (Con)

My Lords, UK law enforcement authorities processing personal data for law enforcement purposes currently use internationally based companies for data processing services, including cloud storage. The use of international processors is critical for modern organisations and law enforcement is no exception. The use of these international processors enhances law enforcement capabilities and underpins day-to-day functions.

Transfers from a UK law enforcement authority to an international processor are currently permissible under the Data Protection Act 2018. However, there is currently no bespoke mechanism for these transfers in Part 3, which has led to confusion and ambiguity as to how law enforcement authorities should approach the use of such processors. The aim of this amendment is to provide legal certainty to law enforcement authorities in the UK, as well as transparency to the public, so that they can use internationally based processors with confidence.

I have therefore tabled Amendments 110, 117 to 120, 122 to 129 and 131 to provide a clear, bespoke mechanism in Part 3 of the Data Protection Act 2018 for UK law enforcement authorities to use when transferring data to their contracted processors based outside the UK. This will bring Part 3 into line with the UK GDPR while clarifying the current law, and give UK law enforcement authorities greater confidence when making such transfers to their contracted processors for law enforcement purposes.

We have amended Section 73—the general principles for transfer—to include a specific reference to processors, ensuring that international processors can be a recipient of data transfers. In doing so, we have ensured that the safeguards within Chapter 5 that UK law enforcement authorities routinely apply to transfers of data to their international operational equivalents are equally applicable to transfers to processors. We are keeping open all the transfer mechanisms so that data can be transferred on the basis of an applicable adequacy regulation, the appropriate safeguards or potentially the special circumstances.

We have further amended Section 75—the appropriate safeguards provision—to include a power for the ICO to create, specifically for Part 3, an international data transfer agreement, or IDTA, to complement the IDTA which it has already produced to facilitate transfers using Article 46(2)(d) of the UK GDPR.

In respect of transfers to processors, we have disapplied the duty to inform the Information Commissioner about international transfers made subject to appropriate safeguards, as such a requirement would be out of line with equivalent provisions in the UK GDPR. There is no strong rationale for retaining the provision, given that processors are limited in what they can do with data because of the nature of their contracts, and that it would be unlikely to contribute to the effective functioning of the ICO.

Likewise, we have also disapplied the duty to document such transfers and to provide the documentation to the commissioner on request. This is because extending these provisions would duplicate requirements that already exist elsewhere in legislation, including in Section 61, which has extensive recording requirements that enable full accountability to the ICO.

We have also disapplied the majority of Section 78. While it provides a useful function in the context of UK law enforcement authorities transferring to their international operational equivalents, in the law enforcement to international processor context it is not appropriate because processors cannot decide to transfer data onwards of their own volition. They can only do so under instruction from the UK law enforcement authority controller.

Instead, we have retained the general prohibition on any further transfers to processors based in a separate third country by requiring UK law enforcement authority controllers to make it a condition of a transfer to its processor that data is only to be further transferred in line with the terms of the contract with or authorisation given by the controller, and where the further transfer is permitted under Section 73. We have also taken the opportunity to tidy up Section 77 which governs transfers to non-relevant authorities, relevant international organisations or international processors.

In respect of Amendment 121, tabled by the noble Lord, Lord Clement-Jones, on consultation with the Information Commissioner, I reassure the noble Lord that there is a memorandum of understanding between the Home Office and the Information Commissioner regarding international transfers approved by regulations, which sets out the role and responsibilities of the ICO. As part of this, the Home Office consults the Information Commissioner at various stages in the process. The commissioner, in turn, provides independent assurance and advice on the process followed and on the factors taken into consideration.

I understand that this amendment also relates to representations made by the National AIDS Trust. Perhaps the simplest thing is merely to reference my earlier remarks and my commitment to ongoing engagement with the National AIDS Trust. I beg to move that the government amendments which lead this group stand part of the Bill.

Lord Clement-Jones (LD)

My Lords, very briefly, I thank the Minister for unpacking his amendments with some care, and for giving me the answer to my amendment before I spoke to it—that saves time.

Obviously, we all understand the importance of transfers of personal data between law enforcement authorities, but perhaps the crux of this, and the one question in our mind is, what is—perhaps the Minister could remind us—the process for making sure that the country that we are sending it to is data adequate? Amendment 121 was tabled as a way of probing that. It would be extremely useful if the Minister can answer that. This should apply to transfers between law enforcement authorities just as much as it does for other, more general transfers under Schedule 5. If the Minister can give me the answer, that would be useful, but if he does not have the answer to hand, I am very happy to suspend my curiosity until after Easter.

Lord Bassam of Brighton (Lab)

My Lords, I too can be brief, having heard the Minister’s response. I thought he half-shot the Clement-Jones fox, with very good aim on the Minister’s part.

I was simply going to say that it is one in a sea of amendments from the Government, but the noble Lord, Lord Clement-Jones, made an important point about making sure that the countries and organisations that the commissioner looks at meet the test of data adequacy—I also had that in my speaking note. The noble Lord, Lord Clement-Jones, was making a good point in terms of ensuring that appropriate data protections are in place internationally for us to be able to work with.

The Minister explained the government amendments with some care, but I wonder if he could explain how data transfers are made to an overseas processor using the powers relied on by reference to new Section 73(4)(aa) of the 2018 Act. The power is used as a condition and justification for several of the noble Lord’s amendments, and I wonder whether he has had to table these amendments because of the original drafting. That would seem to be the most likely reason.

Moved by
11: Clause 5, page 6, line 15, at end insert—
“(za) After point (a) insert—
“(aa) the data subject has given consent for his or her personal data to enter the public domain via a public body;
(ab) processing is carried out by a public body pursuant to a legal or statutory obligation or right, and the public body is entitled to make such data available to the public;””
Member’s explanatory statement
This amendment would add to the list in Article 6(1) of the GDPR on the lawfulness of processing.
Lord Clement-Jones (LD)

My Lords, I rise to speak to my Amendment 11 and to Amendments 14, 16, 17, 18, Clause 5 stand part and Clause 7 stand part. I will attempt to be as brief as I can, but Clause 5 involves rather a large number of issues.

Processing personal data is currently lawful only if it is performed for at least one lawful purpose, one of which is that the processing is for legitimate interests pursued by the controller or a third party, except where those interests are overridden by the interests or fundamental rights of the data subject. As such, if a data controller relies on their legitimate interest as a legal basis for processing data, they must conduct a balancing test of their interest and those of the data subject.

Clause 5 amends the UK GDPR’s legitimate interest provisions by introducing the concept of recognised legitimate interest, which allows data to be processed without a legitimate interest balancing test. This provides businesses and other organisations with a broader scope of justification for data processing. Clause 5 would amend Article 6 of the UK GDPR to equip the Secretary of State with a power to determine these new recognised legitimate interests. Under the proposed amendment, the Secretary of State must have regard to,

“among other things … the interests and fundamental rights and freedoms of data subjects”.

The usual legitimate interest test is much stronger: rather than merely a topic to have regard to, a legitimate interest basis cannot lawfully apply if the data subject’s interests override those of the data controller.

Annexe 1, as inserted by the Bill, now provides a list of exemptions but is overly broad and vague. It includes national security, public security and defence, and emergencies and crime as legitimate interests for data processing without an assessment. The Conservative MP Marcus Fysh said on Third Reading:

“Before companies share data or use data, they should have to think about what the balance is between a legitimate interest and the data rights, privacy rights and all the other rights that people may have in relation to their data. We do not want to give them a loophole or a way out of having to think about that.” —[Official Report, Commons, 29/11/23; col. 896.]


I entirely agree with that.

The amendment in Clause 5 also provides examples of processing that may be considered legitimate interests under the existing legitimate interest purpose, under Article 6(1)(f), rather than under the new recognised legitimate interest purpose. These include direct marketing, intra-group transmission of personal data for internal administrative purposes, and processing necessary to ensure the security of a network.

The Bill also provides a much more litigious data environment. Currently, an organisation’s assessment of its lawful purposes for processing data can be challenged through correspondence or an ICO complaint, whereas, under the proposed system, an individual may be forced to legally challenge a statutory instrument in order to contest the basis on which their data is processed.

As I will explain later, our preference is that the clause not stand part, but I accept that there are some areas that need clarification and Amendment 11 is designed to do this. The UK GDPR sets out conditions in which processing of data is lawful. The Bill inserts in Article 6(1) a provision specifying that processing shall be lawful for the purposes of a recognised legitimate interest, as I referred to earlier, an example of which may be for the purposes of direct marketing.

Many companies obtain data from the open electoral register. The register is maintained by local authorities, which have the right to sell this data to businesses. Amendment 11 would insert new Article 6(1)(aa) and (ab), which provide that data processing shall be lawful where individuals have consented for their data

“to enter the public domain via a public body”,

or where processing is carried out by public bodies pursuant to their duties and rights, which may include making such data available to the public. Individuals are free to opt out of the open electoral register if they so wish, and it would be disproportionate—in fact, irritating—to notify consumers who have consented to their data being processed that such processing is taking place.

On Amendment 14, as mentioned, the Bill would give the Secretary of State the power to determine recognised legitimate interests through secondary legislation, which is subject to minimal levels of parliamentary scrutiny. Although the affirmative procedure is required, this does not entail much scrutiny or much of a debate. The last time MPs did not approve a statutory instrument under the affirmative procedure was in 1978. In practice, interests could be added to this list at any time and for any reason, facilitating the flow and use of personal data for limitless potential purposes. Businesses could be obligated to share the public’s personal data with government or law enforcement agencies beyond what they are currently required to do, all based on the Secretary of State’s inclination at the time.

We are concerned that this Henry VIII power is unjustified and undermines the very purpose of data protection legislation, which is to protect the privacy of individuals in a democratic data environment, as it vests undue power over personal data rights in the Executive. This amendment is designed to prevent the Secretary of State from having the ability to pre-authorise data processing outside the usual legally defined route. It is important to avoid a two-tier data protection framework in which the Secretary of State can decide that certain processing is effectively above the law.

On Amendment 17, some of the most common settings where data protection law is broken relate to the sharing of HIV status of an individual living with HIV in their personal life in relation to employment, healthcare services and the police. The sharing of an individual’s HIV status can lead to further discrimination being experienced by people living with HIV and can increase their risk of harassment or even violence. The National AIDS Trust is concerned that the Bill as drafted does not go far enough to prevent individuals’ HIV status from being shared with others without their consent. They and we believe that the Bill must clarify what an “administrative purpose” is for organisations processing employees’ personal data. Amendment 17 would add wording to clarify that, in paragraph 9(b) of Article 6,

“intra-group transmission of personal data”

in the workplace, within an organisation or in a group of organisations should be permitted only for individuals who need to access an employee’s personal data as part of their work.

As far as Amendment 18 is concerned, as it stands Clause 5 gives an advantage to large undertakings with numerous companies that can transmit data intra-group purely because they are affiliated to one central body. However, this contradicts the repeated position of both the ICO and the CMA that first party versus third party is not a meaningful distinction by which to assess privacy risk. Instead, what matters is what data is processed, rather than the corporate ownership of the systems doing the processing. The amendment reflects the organisational measures that undertakings should have as safeguards. Groups of undertakings transmitting data should have organisational measures in place, via contract, in order to take advantage of this transmission of data.

Then we come to the question of Clause 5 standing part of the Bill. This clause is unnecessary and creates risks. It is unnecessary because the legitimate interest balancing test is, in fact, flexible and practical; it already allows processing for emergencies, safeguarding and so on. It is risky because creating lists of specified legitimate interests inevitably narrows this concept and may make controllers less certain about whether a legitimate interest that is not a recognised legitimate interest can be characterised as such. In the age of AI, where change is exponential, we need principles and outcome-based legislation that are flexible and can be supplemented with guidance from an independent regulator, rather than setting up a system that requires the Government to legislate more and faster in order to catch up.

There is also a risk that the drafting of this provision does not dispense with the need to conduct a legitimate interest balancing test, because all the recognised legitimate interests contain a necessity test. Established case law interprets the concept of necessity under data protection law as requiring a human rights balancing test to be carried out. This rather points to the smoke-and-mirrors effect of this drafting, which does nothing to improve legal certainty for organisations or protections for individuals.

I now come to Clause 7 standing part. This clause creates a presumption that processing will always be in the public interest or substantial public interest if done in reliance on a condition listed in proposed new Schedule A1 to the Data Protection Act 2018. The schedule will list international treaties that have been ratified by the UK. At present, the Bill lists only the UK-US data-sharing agreement as constituting relevant international law. Clause 7 seeks to remove the requirement for a controller to consider whether the legal basis on which they rely is in the public interest or substantial public interest, has appropriate safeguards and respects data subjects’ fundamental rights and freedoms. But the conditions in proposed new Schedule A1 in respect of the UK-US agreement also state that the processing must be necessary, as assessed by the controller, to respond to a request made under the agreement.

It is likely that a court would interpret “necessity” in the light of the ECHR. The court may therefore consider that the inclusion of a necessity test means that a controller would have to consider whether the UK-US agreement, or any other treaty added to the schedule, is proportionate to a legitimate aim pursued. Not only is it unreasonable to expect a controller to do such an assessment; it is also highly unusual. International treaties are drafted on a state-to-state basis and not in a way that necessarily corresponds clearly with domestic law. Further, domestic courts would normally consider the rights under the domestic law implementing a treaty, rather than having to interpret an international instrument without reference to a domestic implementing scheme. Being required to do so may make it more difficult for courts to enforce data subjects’ rights.

The Government have not really explained why it is necessary to amend the law in this way rather than simply implementing the UK-US agreement domestically. That would be the normal approach; it would remove the need to add this new legal basis and enable controllers to use the existing framework to identify a legal basis to process data in domestic law. Instead, this amendment makes it more difficult to understand how the law operates, which could in turn deter data sharing in important situations. Perhaps the Minister could explain why Clause 7 is there.

I beg to move.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

My Lords, I rise to speak to Amendments 13 and 15. Before I do, let me say that I strongly support the comments of the noble Lord, Lord Clement-Jones, about HIV and the related vulnerability, and his assertion—almost—that Clause 5 is a solution in search of a problem. “Legitimate interest” is a flexible concept and I am somewhat bewildered as to why the Government are seeking to create change where none is needed. In this context, it follows that, were the noble Lord successful in his argument that Clause 5 should not stand part, Amendments 13 and 15 would be unnecessary.

On the first day in Committee, we debated a smaller group of amendments that sought to establish the principle that nothing in the Bill should lessen the privacy protections of children. In his response, the Minister said:

“if over the course of our deliberations the Committee identifies areas of the Bill where that is not the case, we will absolutely be open to listening on that, but let me state this clearly: the intent is to at least maintain, if not enhance, the safety and privacy of children and their data”.—[Official Report, 20/3/24; col. GC 75.]

I am glad the Minister is open to listening and that the Government’s intention is to protect children, but, as discussed previously, widening the definition of “research” in Clause 3 and watering down purpose limitation protections in Clause 6 negatively impacts children’s data rights. Again, in Clause 5, lowering the protections for all data subjects has consequences for children.

--- Later in debate ---
Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

The balancing test remains there for legitimate interests, under Article 6(1)(f).

Amendment 16 seeks to prevent organisations that undertake third-party marketing from relying on the legitimate interest lawful ground under Article 6(1)(f) of the UK GDPR. As I have set out, organisations can rely on that ground for processing personal data without consent when they are satisfied that they have a legitimate interest to do so and that their commercial interests are not outweighed by the rights and interests of data subjects.

Clause 5(4) inserts in Article 6 new paragraph (9), which provides some illustrative examples of activities that may constitute legitimate interests, including direct marketing activities, but it does not mean that they will necessarily be able to process personal data for that purpose. Organisations will need to assess on a case-by-case basis where the balance of interest lies. If the impact on the individual’s privacy is too great, they will not be able to rely on the legitimate interest lawful ground. I should emphasise that this is not a new concept created by this Bill. Indeed, the provisions inserted by Clause 5(4) are drawn directly from the recitals to the UK GDPR, as incorporated from the EU GDPR.

I recognise that direct marketing can be a sensitive—indeed, disagreeable—issue for some, but direct marketing information can be very important for businesses as well as individuals and can be dealt with in a way that respects people’s privacy. The provisions in this Bill do not change the fact that direct marketing activities must be compliant with the data protection and privacy legislation and continue to respect the data subject’s absolute right to opt out of receiving direct marketing communications.

Amendment 17 would make sure that the processing of employee data for “internal administrative purposes” is subject to heightened safeguards, particularly when it relates to health. I understand that this amendment relates to representations made by the National AIDS Trust concerning the level of protection afforded to employees’ health data. We agree that the protection of people’s HIV status is vital and that it is right that it is subject to extra protection, as is the case for all health data and special category data. We have committed to further engagement and to working with the National AIDS Trust to explore solutions in order to prevent data breaches of people’s HIV status, which we feel is best achieved through non-legislative means given the continued high data protection standards afforded by our existing legislation. As such, I hope that the noble Lord, Lord Clement-Jones, will agree not to press this amendment.

Amendment 18 seeks to allow businesses more confidently to rely on the existing legitimate interest lawful ground for the transmission of personal data within a group of businesses affiliated by contract for internal administrative purposes. In Clause 5, the list of activities in proposed new paragraphs (9) and (10) is intended to be illustrative of the types of activities that may be legitimate interests for the purposes of Article 6(1)(f). It is focused on processing activities that are currently listed in the recitals to the EU GDPR but these are simply examples. Many other processing activities may be legitimate interests for the purposes of Article 6(1)(f) of the UK GDPR. It is possible that the transmission of personal data for internal administrative purposes within a group affiliated by contract may constitute a legitimate interest, as may many other commercial activities. It would be for the controller to determine this on a case-by-case basis after carrying out a balancing test to assess the impact on the individual.

Finally, I turn to the clause stand part debate that seeks to remove Clause 7 from the Bill. I am grateful to the noble Lord, Lord Clement-Jones, for this amendment because it allows me to explain why this clause is important to the success of the UK-US data access agreement. As noble Lords will know, that agreement helps the law enforcement agencies in both countries tackle crime. Under the UK GDPR, data controllers can process personal data without consent on public interest grounds if the basis for the processing is set out in domestic law. Clause 7 makes it clear that the processing of personal data can also be carried out on public interest grounds if the basis for the processing is set out in a relevant international treaty such as the UK-US data access agreement.

The agreement permits telecommunications operators in the UK to disclose data about serious crimes with law enforcement agencies in the US, and vice versa. The DAA has been operational since October 2022 and disclosures made by UK organisations under it are already lawful under the UK GDPR. Recent ICO guidance confirms this, but the Government want to remove any doubt in the minds of UK data controllers that disclosures under the DAA are permitted by the UK GDPR. Clause 7 makes it absolutely clear to telecoms operators in the UK that disclosures under the DAA can be made in reliance on the UK GDPR’s public tasks processing grounds; the clause therefore contributes to the continued, effective functioning of the agreement and to keeping the public in both the UK and the US safe.

For these reasons, I hope that the noble Lord, Lord Clement-Jones, will agree to withdraw his amendment.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My first reaction is “Phew”, my Lords. We are all having to keep to time limits now. The Minister did an admirable job within his limit.

I wholeheartedly support what the noble Baronesses, Lady Kidron and Lady Harding, said about Amendments 13 and 15 and what the noble Baroness, Lady Jones, said about her Amendment 12. I do not believe that we have yet got to the bottom of children’s data protection; there is still quite some way to go. It would be really helpful if the Minister could bring together the elements of children’s data about which he is trying to reassure us and write to us saying exactly what needs to be done, particularly in terms of direct marketing directed towards children. That is a real concern.

--- Later in debate ---
This is a fundamental group of amendments. It takes quite a lot for me to stand up on something so party political—I think my husband will be completely horrified that I did this homework over the weekend—but I ask the Minister to reconsider and to listen hard to the considered views, probably more considered than mine, on the Opposition Benches calling for more consultation before something such as this is introduced.
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, it is a pleasure to follow the noble Baronesses, Lady Harding and Lady Bennett, after the excellent introduction to the amendments in this group by the noble Baroness, Lady Jones. The noble Baroness, Lady Harding, used the word “trust”, and this is another example of a potential hidden agenda in the Bill. Again, it is destructive of any public trust in the way their data is curated. This is a particularly egregious example, without, fundamentally, any explanation. Sir John Whittingdale said that a future Government

“may want to encourage democratic engagement in the run up to an election by temporarily ‘switching off’ some of the direct marketing rules”.—[Official Report, Commons, 29/11/2023; col. 885.]

Nothing to see here—all very innocuous; but, as we know, in the past the ICO has been concerned about even the current rules on the use of data by political parties. It seems to me that, without being too Pollyannaish about this, we should be setting an example in the way we use the public’s data for campaigning. The ICO, understandably, is quoted as saying during the public consultation on the Bill that this is

“an area in which there are significant potential risks to people if any future policy is not implemented very carefully”.

That seems an understatement, but that is how regulators talk. It is entirely right to be concerned about these provisions.

Of course, they are hugely problematic, but they are particularly problematic given that it is envisaged that young people aged 14 and older should be able to be targeted by political parties when they cannot even vote, as we have heard. This would appear to contravene one of the basic principles of data protection law: that you should not process more personal data than you need for your purposes. If an individual cannot vote, it is hard to see how targeting them with material relating to an election is a proportionate interference with their privacy rights, particularly when they are a child. The question is, should we be soliciting support from 14 to 17 year-olds during elections when they do not have votes? Why do the rules need changing so that people can be targeted online without having consented? One of the consequences of these changes would be to allow a Government to switch off—the words used by Sir John Whittingdale—direct marketing rules in the run-up to an election, allowing candidates and parties to rely on “soft” opt-in to process data and make other changes without scrutiny.

Exactly as the noble Baroness, Lady Jones, said, respondents to the original consultation on the Bill wanted political communications to be covered by existing rules on direct marketing. Responses were very mixed on the soft opt-in, and there were worries that people might be encouraged to part with more of their personal data. More broadly, why are the Government changing the rules on democratic engagement if they say they will not use these powers? What assessment have they made of the impact of the use of the powers? Why are the powers not being overseen by the Electoral Commission? If anybody is going to have the power to introduce the ability to market directly to voters, it should be the Electoral Commission.

All this smacks of taking advantage of financial asymmetry. We talked about competition asymmetry with big tech when we debated the digital markets Bill; similarly, this seems a rather sneaky way of taking advantage of the financial resources one party might have versus others. It would allow that party to do things other parties cannot, because it has granted itself permission to do so. The provisions should not be in the hands of any Secretary of State or governing party; if anything, they should be in entirely independent hands; but, even then, they are undesirable.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

My Lords, I thank the noble Baroness, Lady Jones, for tabling her amendments. Amendment 19 would remove processing which is necessary for the purposes of democratic engagement from the list of recognised legitimate interests. It is essential in a healthy democracy that registered political parties, elected representatives and permitted participants in referendums can engage freely with the electorate without being impeded unnecessarily by data protection legislation.

The provisions in the Bill will mean that these individuals and organisations do not have to carry out legitimate interest assessments or look for a separate legal basis. They will, however, still need to comply with other requirements of data protection legislation, such as the data protection principles and the requirement for processing to be necessary.

On the question posed by the noble Baroness about the term “democratic engagement”, it is intended to cover a wide range of political activities inside and outside election periods. These include but are not limited to democratic representation; communicating with electors and interested parties; surveying and opinion gathering; campaigning activities; activities to increase voter turnout; supporting the work of elected representatives, prospective candidates and official candidates; and fundraising to support any of these activities. This is reflected in the drafting, which incorporates these concepts in the definition of democratic engagement and democratic engagement activities.

The ICO already has guidance on the use of personal data by political parties for campaigning purposes, which the Government anticipate it will update to reflect the changes in the Bill. We will of course work with the ICO to make sure it is familiar with our plans for commencement and that the timing of commencement does not benefit any party over another.

On the point made about the appropriate age for the provisions, the age of 14 reflects the variations in voting age across the UK: in some parts, such as Scotland, the voting age is 16 for some elections, and a person can join the electoral register at 14 as an attainer. An attainer is someone who is registered to vote in advance of being able to do so, so that they are on the electoral roll as soon as they reach the required age. Children aged 14 and over are often politically engaged and are approaching voting age. The Government consider it important that political parties and elected representatives can engage freely with this age group—

--- Later in debate ---
Lord Kamall Portrait Lord Kamall (Con)
- Hansard - - - Excerpts

May I make a suggestion to my noble friend the Minister? It might be worth asking the legal people to get the right wording, but if there are different ages at which people can vote in different parts of the United Kingdom, surely it would be easier just to relate it to the age at which they are able to vote in those elections. That would address a lot of the concerns that many noble Lords are expressing here today.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, this whole area of democratic engagement is one that the Minister will need to explain in some detail. This is an Alice in Wonderland schedule: “These words mean what I want them to mean”. If, for instance, you are engaging with the children of a voter—at 14, they are children—is that democratic engagement? You could drive a coach and horses through Schedule 1. The Minister used the word “necessary”, but he must give us rather more than that. It was not very reassuring.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

The Minister mentioned a presumption that the ICO will update its guidance. Is there a timeframe for that? Will the guidance be updated before this comes into effect? How does the age of 14 relate to the AADC, which sets the age of adulthood at 18?

--- Later in debate ---
Lord Kamall Portrait Lord Kamall (Con)
- Hansard - - - Excerpts

My Lords, I know that these amendments were said to be technical amendments, so I thought I would just accept them, but when I saw the wording of Amendment 283 some alarm bells started ringing. It says:

“The Commission may do anything it thinks appropriate for the purposes of, or in connection with, its functions”.

I know that the Minister said that this is stating what the commission is already able to do, but I am concerned whenever I see those words anywhere. They give a blank cheque to any authority or organisation.

Many noble Lords will know that I have previously spoken about the principal-agent theory in politics, in which certain powers are delegated to an agency or regulator, but what accountability does it have? I worry when I see that it “may do anything … appropriate” to fulfil its tasks. I would like some assurance from the Minister that there is a limit to what the information commission can do and some accountability. At a time when many of us are asking who regulates the regulators and when we are looking at some of the arm’s-length bodies—need I mention the Post Office?—there is some real concern about accountability.

I understand the reason for wanting to clarify or formalise what the Minister believes the information commission is doing already, but I worry about this form of words. I would like some reassurance that it is not wide-ranging and that there is some limit and accountability to future Governments. I have seen this sentiment across the House; people are asking who regulates the regulators and to whom are they accountable.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, I must congratulate the noble Lord, Lord Kamall. Amid a blizzard of technical and minor amendments from the Minister, he forensically spotted one to raise in that way. He is absolutely right. The Industry and Regulators Committee has certainly been examining the accountability and scrutiny devoted to regulators, so we need to be careful in the language that we use. I think we have to take a lot on trust from the Minister, particularly in Grand Committee.

I apparently failed to declare an interest at Second Reading. I forgot to state that I am a consultant to DLA Piper and the Whips have reminded me today that I failed to do so on the first day in Committee, so I apologise to the Committee for that. I am not quite sure why my consultancy with DLA Piper is relevant to the data protection Bill, but there it is. I declare it.

--- Later in debate ---
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

In which case, you are warned.

Amendment 20 agreed.
--- Later in debate ---
As a result, I have no idea what information the Government now hold about me. Under this clause, it would be so easy for someone to deny that information to me if I requested it. If people such as me cannot access personal data, it will be almost impossible for us to exercise our right to call for the erasure of that data. I cannot ask anyone to delete that data if someone refuses to give it to me. I urge the Minister to withdraw this clause, as it is an affront to human rights and public accountability.
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, it is a pleasure to follow the noble Lord, Lord Sikka. He raised even more questions about Clause 9 than I ever dreamed of. He has illustrated the real issues behind the clause and why it is so important to debate its standing part, because, in our view, it should certainly be removed from the Bill. It would seriously limit people’s ability to access information about how their personal data is collected and used. We are back to the dilution of data subject rights, within which the rights of data subject access are, of course, vital. This includes limiting access to information about automated decision-making processes to which people are subject.

A data subject is someone who can be identified directly or indirectly by personal data, such as a name, an ID number, location data, or information relating to their physical, economic, cultural or social identity. Under existing law, data subjects have a right to request confirmation of whether their personal data is being processed by a controller, to access that personal data and to obtain information about how it is being processed. The noble Lord, Lord Sikka, pointed out that there is ample precedent for how the controller can refuse a request from a data subject only if it is manifestly unfounded or excessive. The meaning of that phrase is well established.

There are three main ways in which Clause 9 limits people’s ability to access information about how their personal data is being collected and used. First, it would lower the threshold for refusing a request from “manifestly unfounded or excessive” to “vexatious or excessive”. This is an inappropriately low threshold, given the nature of a data subject access request—namely, a request by an individual for their own data.

Secondly, Clause 9 would insert a new mandatory list of considerations for deciding whether the request is vexatious or excessive. This includes vague considerations, such as

“the relationship between the person making the request (the ‘sender’) and the person receiving it (the ‘recipient’)”.

The very fact that the recipient holds data relating to the sender means that there is already some form of relationship between them.

Thirdly, the weakening of an individual’s right to obtain information about how their data is being collected, used or shared is particularly troubling given the simultaneous effect of the provisions in Clause 10, which means that data subjects are less likely to be informed about how their data is being used for additional purposes other than those for which it was originally collected, in cases where the additional purposes are for scientific or historical research, archiving in the public interest or statistical purposes. Together, the two clauses mean that an individual is less likely to be proactively told how their data is being used, while it is harder to access information about their data when requested.

In the Public Bill Committee in the House of Commons, the Minister, Sir John Whittingdale, claimed that:

“The new parameters are not intended to be reasons for refusal”,

but rather to give

“greater clarity than there has previously been”.—[Official Report, Commons, Data Protection and Digital Information Bill Committee, 16/5/23; cols. 113-14.]

But it was pointed out by Dr Jeni Tennison of Connected by Data in her oral evidence to the committee that the impact assessment for the Bill indicates that a significant proportion of the savings predicted would come from lighter burdens on organisations dealing with subject access requests as a result of this clause. This suggests that, while the Government claim that this clause is a clarification, it is intended to weaken obligations on controllers and, correspondingly, the rights of data subjects. Is that where the Secretary of State’s £10 billion of benefit from this Bill comes from? On these grounds alone, Clause 9 should be removed from the Bill.

We also oppose the question that Clause 12 stand part of the Bill. Clause 12 provides that, in responding to subject access requests, controllers are required only to undertake a

“reasonable and proportionate search for the personal data and other information”.

This clause also appears designed to weaken the right of subject access and will lead to confusion for organisations about what constitutes a reasonable and proportionate search in a particular circumstance. The right of subject access is central to individuals’ fundamental rights and freedoms, because it is a gateway to exercising other rights, either within the data subject rights regime or in relation to other legal rights, such as the rights to equality and non-discrimination. Again, the lowering of rights compared with the EU creates obvious risks for data adequacy, a continuing theme in these debates.

Clause 12 does not provide a definition for reasonable and proportionate searches, but when introducing the amendment, Sir John Whittingdale suggested that a search for information may become unreasonable or disproportionate

“when the information is of low importance or of low relevance to the data subject”.—[Official Report, Commons, 29/11/23; col. 873.]

Those considerations diverge from those provided in the Information Commissioner’s guidance on the rights of access, which states that when determining whether searches may be unreasonable or disproportionate, the data controller must consider the circumstances of the request, any difficulties involved in finding the information and the fundamental nature of the right of access.

We also continue to be concerned about the impact assessment for the Bill and the Government’s claims that the new provisions in relation to subject access requests are for clarification only. Again, Clause 12 appears to have the same effect as Clause 9 in producing the kinds of savings that the Government seem to imagine will emerge from the lowering of subject access rights. This is a clear dilution of subject access rights, and this clause should also be removed from the Bill.

We always allow for belt and braces, and if our urging does not lead to the Minister agreeing to remove Clauses 9 and 12, at the very least we should have the new provisions set out either in Amendment 26, in the name of the noble Baroness, Lady Jones of Whitchurch, or in Amendment 25, which proposes that a data controller who refuses a subject access request must give reasons for the refusal and tell the data subject about their right to seek a remedy. That is absolutely the bare minimum, but I would far prefer to see the deletion of Clauses 9 and 12 from the Bill.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

As ever, I thank noble Lords for raising and speaking to these amendments. I start with the stand part notices on Clauses 9 and 36, introduced by the noble Lord, Lord Clement-Jones. Clauses 9 and 36 clarify the new threshold to refuse or charge a reasonable fee for a request that is “vexatious or excessive”. Clause 36 also clarifies that the Information Commissioner may charge a fee for dealing with, or refuse to deal with, a vexatious or excessive request made by any persons and not just data subjects, providing necessary certainty.

--- Later in debate ---
Lord Bassam of Brighton Portrait Lord Bassam of Brighton (Lab)
- Hansard - - - Excerpts

From looking at the wording of the Members’ explanatory statements for wishing to leave out Clauses 9 and 36, I do not think that the Minister has addressed this, but does he accept that the Bill now provides a more lax approach? Is this a reduction of the standard expected? To me, “vexatious or excessive” sounds very different from “manifestly unfounded or excessive”. Does he accept that basic premise? That is really the core of the debate; if it is not, we have to look again at the issue of resources, which seems to be the argument to make this change.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

If that is the case and this is a dilution, is this where the Government think they will get the savings identified in the impact assessment? It was alleged in the Public Bill Committee that this is where a lot of the savings would come from—we all have rather different views. My first information was that every SME might save about £80 a year; then, suddenly, the Secretary of State started talking about £10 billion of benefit from the Bill. Clarification of that would be extremely helpful. There seems to be a dichotomy between the noble Lord, Lord Bassam, saying that this is a way to reduce the burdens on business and the Minister saying that it is all about confident refusal and confidence. He has used that word twice, which is worrying.

Lord Sikka Portrait Lord Sikka (Lab)
- Hansard - - - Excerpts

I apologise for intervening, but the Minister referred to resources. By that, he means the resources for the controller but, as I said earlier, there is no consideration of what the social cost may be. If this Bill had already become law, how would the victims of the Post Office scandal have been able to secure any information? Under this Bill, the threshold for providing information will be much lower than it is under the current legislation. Can the Minister say something about how the controllers will take social cost into account or how the Government have taken that into account?

--- Later in debate ---
Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

The actual application of the terms will be set out in guidance by the ICO but the intention is to filter out the more disruptive and cynical ones. Designing these words is never an easy thing but there has been considerable consultation on this in order to achieve that intention.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords—sorry; it may be that the Minister was just about to answer my question. I will let him do so.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I will have to go back to the impact assessment but I would be astonished if that was a significant part of the savings promised. By the way, the £10.6 billion—or whatever it is—in savings was given a green rating by the body that assesses these things; its name eludes me. It is a robust calculation. I will check and write to the noble Lord, but I do not believe that a significant part of that calculation leans on the difference between “vexatious” and “manifestly unfounded”.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

It would be very useful to have the Minister respond on that but, of course, as far as the impact assessment is concerned, a lot of this depends on the Government’s own estimates of what this Bill will produce—some of which are somewhat optimistic.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

My Lords, can we join in with the request to see that information in a letter? We would like to see where these savings will be made and how much will, as noble Lords have said, be affected by the clauses that we are debating today.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

The noble Baroness, Lady Jones, has given me an idea: if an impact assessment has been made, clause by clause, it would be extremely interesting to know just where the Government believe the golden goose is.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I am not quite sure what is being requested because the impact assessment has been not only made but published.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

Yes, but it is a very broad impact assessment.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I see—so noble Lords would like an analysis of the different components of the impact assessment. It has been green-rated by the independent Regulatory Policy Committee. I have just been informed by the Box that the savings from these reforms to the wording of SARs are valued at less than 1% of the benefit of more than £10 billion that this Bill will bring.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

That begs the question of where on earth the rest is coming from.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

Which I will be delighted to answer. With this interesting exchange, I have lost in my mind the specific questions that the noble Lord, Lord Sikka, asked but I am coming on to some of his other ones; if I do not give satisfactory answers, no doubt he will intervene and ask again.

I appreciate the further comments made by the noble Lord, Lord Sikka, about the Freedom of Information Act. I hope he will be relieved to know that this Bill does nothing to amend that Act. On his accounting questions, he will be aware that most SARs are made by private individuals to private companies. The Government are therefore not involved in that process and do not collect the kind of information that he described.

Following the DPDI Bill, the Government will work with the ICO to update guidance on subject access requests. Guidance plays an important role in clarifying what a controller should consider when relying on the new “vexatious or excessive” provision. The Government are also exploring whether a code of practice on subject access requests can best address the needs of controllers and data subjects.

On whether Clause 12 should stand part of the Bill, Clause 12 is only putting on a statutory footing what has already been established—

--- Later in debate ---
Lord Black of Brentwood Portrait Lord Black of Brentwood (Con)
- Hansard - - - Excerpts

My Lords, I support Amendments 27 to 34, tabled variously by my noble friend Lady Harding, and the noble Lord, Lord Clement-Jones, to which I have added my name. As this is the first time I have spoken in Committee, I declare my interests as deputy chairman of the Telegraph Media Group and president of the Institute of Promotional Marketing and note my other declarations in the register.

The direct marketing industry is right at the heart of the data-driven economy, which is crucial not just to the future of the media and communications industries but to the whole basis of the creative economy, which will power economic growth into the future. The industry has quite rightly welcomed the Bill, which provides a long-term framework for economic growth as well as protecting customers.

However, there is one area of great significance, as my noble friend Lady Harding has just eloquently set out, on which this Bill needs to provide clarity and certainty going forward, namely, the use of the open electoral register. That register is an essential resource for a huge number of businesses and brands, as well as many public services, as they try to build new audiences. As we have heard, it is now in doubt because of a recent legal ruling that could, as my noble friend said, lead to people being bombarded with letters telling them that their data on the OER has been used. That is wholly disproportionate and is not in the interests of the marketing and communications industry or customers.

These sensible amendments would simply confirm the status quo that has worked well for so long. They address the issue by providing legal certainty around the use of the OER. I believe they do so in a proportionate manner that does not in any way compromise any aspect of the data privacy of UK citizens. I urge the Minister carefully to consider these amendments. As my noble friend said, there are considerable consequences of not acting for the creative economy, jobs in direct marketing, consumers, the environment and small businesses.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, I am extremely grateful to the noble Baroness, Lady Harding, and the noble Lord, Lord Black, for doing all the heavy lifting on these amendments. I of course support them, having put forward my own amendments. It is just the luck of the draw that the noble Baroness, Lady Harding, put forward her amendment along with all the others. I have very little to say in this case, and just echo what the noble Lord, Lord Black, said about the fact that the open electoral register has played an important part in the direct marketing, data-driven economy, as it is described. It is particularly interesting that he mentioned the creative industries as well.

The First-tier Tribunal precedent could impact on other public sources of data, including the register of companies, the register of judgments, orders and fines, the land register and the Food Standards Agency register. It could have quite far-reaching implications unless we manage to resolve the issue. There is a very tight timescale. The First-tier Tribunal’s ruling means that companies must notify those on the electoral register by 20 May or be at risk of breaching the law. This is really the best route for trying to resolve the issue. Secondly, the First-tier Tribunal’s ruling states that costs cannot be considered as disproportionate effort. That is why these amendments explicitly refer to that. This is no trivial matter. It is a serious area that needs curing by this Bill, which is a good opportunity to do so.

I shall speak briefly to Clause 11 as a whole standing part. That may seem a bit paradoxical, but it is designed to address issues arising in Article 13, not Article 14. Article 13 of the UK GDPR requires controllers, where they intend to process data that was collected directly from data subjects—as opposed to Article 14 obligations, which apply to personal data not obtained from the data subject—for a new purpose, to inform data subjects of various matters to the extent necessary,

“to ensure fair and transparent processing”.

Clause 11(1) removes this obligation for certain purposes where it would require disproportionate effort. The obligation is already qualified to what is necessary to make processing fair and transparent, the fundamental requirements of the GDPR. If, in these circumstances, processing cannot be made fair and transparent without disproportionate effort, then it should not take place. Clause 11(1) would sidestep the requirement and allow unfair, untransparent processing to go ahead for personal data that the data controllers had themselves collected. Perhaps I should have tabled a rather more targeted amendment, but I hope that noble Lords get the point of the difference between this in terms of Article 13 and Article 14.

--- Later in debate ---
Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

On the first point, I used the words carefully because the Government cannot instruct the ICO specifically on how to act in any of these cases. The question about the May deadline is important. With the best will in the world, none of the provisions in the Bill are likely to be in effect by the time of that deadline in any case. That being the case, I would feel slightly uneasy about advising the ICO on how to act.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, I am not quite getting from the Minister whether he has an understanding of and sympathy with the case that is being made or whether he is standing on ceremony on its legalities. Is he saying, “No, we think that would be going too far”, or that there is a good case and that guidance or some action by the ICO would be more appropriate? I do not get the feeling that somebody has made a decision about the policy on this. It may be that conversations with the Minister between Committee and Report would be useful, and it may be early days yet until he hears the arguments made in Committee; I do not know, but it would be useful to get an indication from him.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

Yes. I repeat that I very much recognise the seriousness of the case. There is a balance to be drawn here. In my view, the best way to identify the most appropriate balancing point is to continue to work closely with the ICO, because I strongly suspect that, at least at this stage, it may be very difficult to draw a legislative dividing line that balances the conflicting needs. That said, I am happy to continue to engage with noble Lords on this really important issue between Committee and Report, and I commit to doing so.

On the question of whether Clause 11 should stand part of the Bill, Clause 11 extends the existing disproportionate effort exemption to cases where the controller collected the personal data directly from the data subject and intends to carry out further processing for research purposes, subject to the research safeguards outlined in Clause 26. This exemption is important to ensure that life-saving research can continue unimpeded.

Research holds a privileged position in the data protection framework because, by its nature, it is viewed as generally being in the public interest. The framework has various exemptions in place to facilitate and encourage research in the UK. During the consultation, we were informed of various longitudinal studies, such as those into degenerative neurological conditions, where it is impossible or nearly impossible to recontact data subjects. To ensure that this vital research can continue unimpeded, Clause 11 provides a limited exemption that applies only to researchers who are complying with the safeguards set out in Clause 26.

The noble Lord, Lord Clement-Jones, raised concerns that Clause 11 would allow unfair processing. I assure him that this is not the case, as any processing that uses the disproportionate effort exemption in Article 13 must comply with the overarching data protection principles, including lawfulness, fairness and transparency, so that even if data controllers rely on this exemption they should consider other ways to make the processing they undertake as fair and transparent as possible.

Finally, returning to EU data adequacy, the Government recognise its importance and, as I said earlier, are confident that the proposals in Clause 11 are complemented by robust safeguards, which reinforces our view that they are compatible with EU adequacy. For the reasons that I have set out, I am unable to accept these amendments, and I hope that noble Lords will not press them.

--- Later in debate ---
Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

My Lords, the number of amendments proposed to Clause 14 reflects the Committee’s very real concern about the impact of automated decision-making on the privacy, safety and prospects of UK data subjects. I have specific amendments in groups 7 and 8, so I will speak to the impact of Clause 14 on children later. I will again be making arguments about the vulnerability of these systems in relation to the Government’s proposals on the DWP.

Without repeating the arguments made, I associate myself with most of the proposals and the intention behind them—the need to safeguard the prospects of a fair outcome when algorithms hold sway over a person’s future. It seems entirely logical that, if the definition of solely automated decision-making requires “no meaningful human involvement”, we should be clear, as Amendment 40 proposes, about what is considered “meaningful”, so that the system cannot be gamed by providing human involvement that provides an ineffective safeguard and is therefore not meaningful.

I have sympathy with many of these amendments—Amendments 38A, 39, 47, 62, 64 and 109—and ultimately believe, as was suggested by the noble Lord, Lord Bassam, that it is a matter of trust. I refer briefly to the parliamentary briefing from the BMA, which boldly says that:

“Clause 14 risks eroding trust in AI”.

That would be a very sad outcome.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, we have heard some powerful concerns on this group already. This clause is in one of the most significant parts of the Bill for the future. The Government’s AI policy is of long standing. They started it many years ago, then had a National AI Strategy in 2021, followed by a road map, a White Paper and a consultation response to the White Paper. Yet this part of the Bill, which is overtly about artificial intelligence and automated decision-making, does not seem to be woven into their thinking at all.

--- Later in debate ---
Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

As ever, I thank the noble Baroness, Lady Jones, and the noble Lord, Lord Clement-Jones, for their detailed consideration of Clause 14, and all other noble Lords who spoke so well. I carefully note the references to the DWP’s measure on fraud and error. For now, I reassure noble Lords that a human will always be involved in all decision-making relating to that measure, but I note that this Committee will have a further debate specifically on that measure later.

The Government recognise the importance of solely automated decision-making to the UK’s future success and productivity. These reforms ensure that it can be responsibly implemented, while any such decisions with legal or similarly significant effects have the appropriate safeguards in place, including the rights to request a review of the decision and to request that a human carry out that review. These reforms clarify and simplify the rules related to solely automated decision-making without watering down any of the protections for data subjects or the fundamental data protection principles. In doing so, they will provide confidence to organisations looking to use these technologies in a responsible way while driving economic growth and innovation.

The Government also recognise that AI presents huge opportunities for the public sector. It is important that AI is used responsibly and transparently in the public sector; we are already taking steps to build trust and transparency. Following a successful pilot, we are making the Algorithmic Transparency Reporting Standard—the ATRS—a requirement for all government departments, with plans to expand this across the broader public sector over time. This will ensure that there is a standardised way for government departments proactively to publish information about how and why they are using algorithms in their decision-making. In addition, the Central Digital and Data Office—the CDDO—has already published guidance on the procurement and use of generative AI for the UK Government and, later this year, DSIT will launch the AI management essentials scheme, setting a minimum good practice standard for companies selling AI products and services.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, could I just interrupt the Minister? It may be that he can get an answer from the Box to my question. One intriguing aspect is that, as the Minister said, the pledge is to bring the algorithmic recording standard into each government department and there will be an obligation to use that standard. However, what compliance mechanism will there be to ensure that that is happening? Does the accountable Permanent Secretary have a duty to make sure that that is embedded in the department? Who has the responsibility for that?

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

That is a fair question. I must confess that I do not know the answer. There will be mechanisms in place, department by department, I imagine, but one would also need to report on it across government. Either it will magically appear in my answer or I will write to the Committee.

The CDDO has already published guidance on the procurement and use of generative AI for the Government. We will consult on introducing this as a mandatory requirement for public sector procurement, using purchasing power to drive responsible innovation in the broader economy.

I turn to the amendments in relation to meaningful involvement. I will first take together Amendments 36 and 37, which aim to clarify that the safeguards mentioned under Clause 14 are applicable to profiling operations. New Article 22A(2) already clearly sets out that, in cases where profiling activity has formed part of the decision-making process, controllers have to consider the extent to which a decision about an individual has been taken by means of profiling when establishing whether human involvement has been meaningful. Clause 14 makes clear that a solely automated significant decision is one without meaningful human involvement and that, in these cases, controllers are required to provide the safeguards in new Article 22C. As such, we do not believe that these amendments are necessary; I therefore ask the noble Baroness, Lady Jones, not to press them.

Turning to Amendment 38, the Government are confident that the existing reference to “data subject” already captures the intent of this amendment. The existing definition of “personal data” makes it clear that a data subject is a person who can be identified, directly or indirectly. As such, we do not believe that this amendment is necessary; I ask the noble Lord, Lord Clement-Jones, whether he would be willing not to press it.

Amendments 38A and 40 seek to clarify that, for human involvement to be considered meaningful, the review must be carried out by a competent person. We feel that these amendments are unnecessary as meaningful human involvement may vary depending on the use case and context. The reformed clause already introduces a power for the Secretary of State to provide legal clarity on what is or is not to be taken as meaningful human involvement. This power is subject to the affirmative procedure in Parliament and allows the provision to be future-proofed in the wake of technological advances. As such, I ask the noble Baronesses, Lady Jones and Lady Bennett, not to press their amendments.

--- Later in debate ---
I shall return briefly to the rollout of the Algorithmic Transparency Reporting Standard. To date, we have taken a deliberately iterative and agile approach to ATRS development and rollout with the intention of generating buy-in from departments, gathering feedback, informing the evidence base, and improving and adapting the standard.
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

That means no compliance mechanism.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I am not sure I agree with that characterisation. The ATRS is a relatively new development. It needs time to bed in and needs to be bedded in on an agile basis in order to ensure not only quality but speed of implementation. That said, I ask the noble Lord to withdraw his amendment.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

The Minister has taken us through what Clause 14 does and rebutted the need for anything other than “solely”. He has gone through the sensitive data and the special category data aspects, and so on, but is he reiterating his view that this clause is purely for clarification; or is he saying that it allows greater use of automated decision-making, in particular in public services, so that greater efficiencies can be found and therefore it is freeing up the public sector at the expense of the rights of the individual? Where does he sit in all this?

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

As I said, the intent of the Government is: yes to more automated data processing to take advantage of emerging technologies, but also yes to maintaining appropriate safeguards. The safeguards in the present system consist—if I may characterise it in a slightly blunt way—of providing quite a lot of uncertainty, so that people do not take the decision to positively embrace the technology in a safe way. By bringing in this clarity, we will see an increase not only in the safety of their applications but in their use, driving up productivity in both the public and private sectors.

--- Later in debate ---
Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

My Lords, the amendments in this group highlight that Clause 14 lacks the necessary checks and balances to uphold equality legislation, individual rights and freedoms, data protection rights, access to services, fairness in the exercise of public functions and workers’ rights. I add my voice to that of the noble Lord, Lord Clement-Jones, in his attempt to make Clause 14 not stand part, which he will speak to in the next group.

I note, as the noble Lord, Lord Bassam, has, that all the current frameworks have fundamental rights at their heart, whether it is the White House blueprint, the UN Secretary-General’s advisory body on AI, with which I am currently involved, or the EU’s AI Act. I am concerned that the UK does not want to work within this consensus.

With that in mind, I particularly note the importance of Amendment 41. As the noble Lord said, we are all supposed to adhere to the Equality Act 2010. I support Amendments 48 and 49, which are virtually interchangeable in wanting to ensure that the “solely” automated decision-making standard cannot be gamed by adding a trivial human element to avoid that designation.

Again, I suggest that the Government cannot have it both ways—with nothing diminished but everything liberated and changed—so I find myself in agreement with Amendment 52A and Amendment 59A, which is in the next group, from the noble Lord, Lord Holmes, who is not in his place. These seek clarity from the Information Commissioner.

I turn to my Amendment 46. My sole concern is to minimise the impact of Clause 14 on children’s safety, privacy and life chances. The amendment provides that a significant decision about a data subject must not be based solely on automated processing if

“the data subject is a child or may be a child unless the provider is satisfied that the decision is in, and compatible with, the best interests of a child”,

taking into account the full gamut of their rights and development stage. Children have enhanced rights under the UNCRC, to which the UK is a signatory. Due to their evolving capacities as they make the journey from infancy to adulthood, they need special protections. If their rights are diminished in the digital world, their rights are diminished full stop. Algorithms determine almost every aspect of a child’s digital experience, from the videos they watch to their social network and from the sums they are asked to do in their maths homework to the team they are assigned when gaming. We have seen young boys wrongly profiled as criminal and girls wrongly associated with gangs.

In a later group, I will speak to a proposal for a code of practice on children and AI, which would codify standards and expectations for the use of AI in all aspects of children’s lives, but for now, I hope the Minister will see that, without these amendments to automated decision-making, children’s data protection will be clearly weakened. I hope he will agree to act to make true his earlier assertion that nothing in the Bill will undermine child protection. The Minister is the Minister for AI. He knows the impact this will have. I understand that, right now, he will probably stick to the brief, but I ask him to go away, consider this from the perspective of children and parents, and ask, “Is it okay for children’s life chances to be automated in this fashion?”

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, I will speak to my Amendment 48. By some quirk of fate, I failed to sign up to the amendments that the noble Lord, Lord Bassam, so cogently introduced. I would have signed up if I had realised that I had not, so to speak.

It is a pleasure to follow the noble Baroness, Lady Kidron. She has a track record of being extremely persuasive, so I hope the Minister pays heed in what happens between Committee and Report. I very much hope that there will be some room for manoeuvre and that there is not just permanent push-back, with the Minister saying that everything is about clarifying and us saying that everything is about dilution. There comes a point when we have to find some accommodation on some of these areas.

Amendments 48 and 49 are very similar—I was going to say, “Great minds think alike”, but I am not sure that my brain feels like much of a great mind at the moment. “Partly” or “predominantly” rather than “solely”, if you look at it the other way round, is really the crux of what I think many of us are concerned about. It is easy to avoid the terms of Article 22 just by slipping in some sort of token human involvement. Defining “meaningful” is so difficult in these circumstances. I am concerned that we are opening the door to something that could be avoided. Even then, the terms of the new clause—we will have a clause stand part debate on Wednesday, obviously—put all the onus on the data subject, whereas that was not the case previously under Article 22. The Minister has not really explained why that change has been made.

I conclude by saying that I very much support Amendment 41. This whole suite of amendments is well drafted. The point about the Equality Act is extremely well made. The noble Lord, Lord Holmes, also has a very good amendment here. It seems to me that involving the ICO right in the middle of this will be absolutely crucial—and we are back to public trust again. If nothing else, I would like explicitly to include that under Clause 14 in relation to Article 22 by the time this Bill goes through.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I thank noble Lords and the noble Baroness for their further detailed consideration of Clause 14.

Let me take first the amendments that deal with restrictions on and safeguards for ADM and degree of ADM. Amendment 41 aims to make clear that solely automated decisions that contravene any part of the Equality Act 2010 are prohibited. We feel that this amendment is unnecessary for two reasons. First, this is already the case under the Equality Act, which is reinforced by the lawfulness principle under the present data protection framework, meaning that controllers are already required to adhere to the Equality Act 2010. Secondly, explicitly stating in the legislation that contravening one type of legislation is prohibited—in this case, the Equality Act 2010—and not referring to other legislation that is also prohibited will lead to an inconsistent approach. As such, we do not believe that this amendment is necessary; I ask the noble Baroness, Lady Jones, to withdraw it.

Amendment 44 seeks to limit the conditions for special category data processing for this type of automated decision-making. Again, we feel that this is not needed given that a set of conditions already provides enhanced levels of protection for the processing of special category data, as set out in Article 9 of the UK GDPR. In order to lawfully process special category data, you must identify both a lawful basis under Article 6 of the UK GDPR and a separate condition for processing under Article 9. Furthermore, where an organisation seeks to process special category data under solely automated decision-making on the basis that it is necessary for contract, in addition to the Articles 6 and 9 lawful bases, they would also have to demonstrate that the processing was necessary for substantial public interest.

Similarly, Amendment 45 seeks to apply safeguards when processing special category data; however, these are not needed as the safeguards in new Article 22C already apply to all forms of processing, including the processing of special category data, by providing sufficient safeguards for data subjects’ rights, freedoms and legitimate interests. As such, we do not believe that these amendments are necessary; I ask the noble Baroness, Lady Jones, not to press them.

--- Later in debate ---
Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

It may be either the controller or the processor but for any legal or similarly significant decision right now—today—there is a requirement before the Bill comes into effect. That requirement is retained by the Bill.

In line with ICO guidance, children need particular protection when organisations collect and process their personal data because they may be less aware of the risks involved. If organisations process children’s personal data, they should think about the need to protect them from the outset and should design their systems and processes with this in mind. This is the case for organisations processing children’s data during solely automated decision-making, just as it is for all processing of children’s data.

Building on this, the Government’s view is that automated decision-making has an important role to play in protecting children online, for example with online content moderation. The current provisions in the Bill will help online service providers understand how they can use these technologies and strike the right balance between enabling the best use of automated decision-making technology while continuing to protect the rights of data subjects, including children. As such, we do not believe that the amendment is necessary; I ask the noble Baroness if she would be willing not to press it.

Amendments 48 and 49 seek to extend the Article 22 provisions to “predominantly” and “partly” automated decision-making. These types of processing already involve meaningful human involvement. In such instances, other data protection requirements, including transparency and fairness, continue to apply and offer relevant protections. As such, we do not believe that these amendments are necessary; I ask the noble Baroness, Lady Jones, and the noble Lord, Lord Clement-Jones, if they would be willing not to press them.

Amendment 50 seeks to ensure that the Article 22C safeguards will apply alongside, rather than instead of, the transparency obligations in the UK GDPR. I assure the noble Baroness, Lady Jones, that the general transparency obligations in Articles 12 to 15 will continue to apply and thus will operate alongside the safeguards in the reformed Article 22. As such, we do not believe that this amendment is necessary; I ask the noble Baroness if she would be willing not to press it.

The changes proposed by Amendment 52A are unnecessary as Clause 50 already provides for an overarching requirement for the Secretary of State to consult the ICO and other persons that the Secretary of State considers appropriate before making regulations under the UK GDPR, including for the measures within Article 22. Also, any changes to the regulations are subject to the affirmative procedure so must be approved by both Houses of Parliament. As with other provisions of the Bill, the ICO will seek to provide organisations with timely guidance and support to assist them in interpreting and applying the legislation. As such, we do not believe that this amendment is necessary and, if he were here, I would ask my noble friend Lord Holmes if he would be willing not to press it.

Amendments 98A and 104A are related to workplace rights. Existing data protection legislation and our proposed reforms provide sufficient safeguards for automated decision-making where personal data is being processed, including in workplaces. The UK’s human rights law and existing employment and equality laws also ensure that employees are informed and consulted about any workplace developments, which means that surveillance of employees is regulated. As such, we do not believe that these amendments are necessary; I ask the noble Baroness not to move them.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

I hear what the Minister said about the workplace algorithmic assessment. However, if the Government believe it is right to have something like an algorithmic recording standard in the public sector, why is it not appropriate to have something equivalent in the private sector?

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I would not say it is not right, but if we want to make the ATRS a standard, we should make it a standard in the public sector first and then allow it to be adopted as a means for all private organisations using ADM and AI to meet the transparency principles that they are required to uphold.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

So would the Minister not be averse to it? It is merely so that the public sector is ahead of the game, allowing it to show the way and then there may be a little bit of regulation for the private sector.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I am not philosophically averse to such regulation. As to implementing it in the immediate future, however, I have my doubts about that possibility.