13 Lord Kamall debates involving the Department for Science, Innovation & Technology

Large Language Models and Generative AI (Communications and Digital Committee Report)

Lord Kamall Excerpts
Thursday 21st November 2024

Lords Chamber
Lord Kamall (Con)

My Lords, I refer noble Lords to my interests as set out in the register. I also thank the committee staff for their work during the inquiry and in writing the report, all the witnesses who offered us a range of views on this fascinating topic, as well as our incredibly able chairperson, the noble Baroness, Lady Stowell, and my committee colleagues.

I am someone who studied engineering for my first degree, so I will go back to first principles. Large language models work by learning relationships between pieces of data contained in large datasets. They use that to predict sequences, which then enables them to generate natural language text, such as articles, student essays, or even politicians’ speeches—but not this one. Finance companies use LLMs to predict market trends based on past data; marketing agencies use LLMs to analyse consumer behaviour in developing marketing campaigns; and, in health, LLMs have been used to analyse patient records and clinical notes to help diagnosis and develop treatment plans.
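
The prediction step described here can be pictured with a toy next-word model; the sketch below is purely illustrative, uses an invented one-sentence corpus, and bears no relation to the scale or architecture of a real large language model.

```python
import random
from collections import defaultdict, Counter

# A toy "language model": it learns which word tends to follow which,
# then generates text by repeatedly sampling a likely next word.
# Real LLMs learn far richer relationships over vast datasets, but the
# predict-the-next-token principle is the same.

corpus = (
    "large language models learn relationships between pieces of data "
    "and use those relationships to predict the next word in a sequence"
).split()

# Count, for each word, how often each other word follows it.
follows = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    follows[current_word][next_word] += 1

def generate(start: str, length: int = 8) -> str:
    """Generate text by sampling probable continuations of the last word."""
    words = [start]
    for _ in range(length):
        candidates = follows.get(words[-1])
        if not candidates:
            break  # nothing has ever followed this word in the training data
        choices, weights = zip(*candidates.items())
        words.append(random.choices(choices, weights=weights)[0])
    return " ".join(words)

print(generate("relationships"))
```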

While this is welcome, LLMs also hallucinate: they produce a result that seems plausible but is in fact false, because the source data from which the LLM calculated the probability of that information being correct was itself incorrect. An AI expert told me that all output from an LLM, whether accurate or not, could be considered a hallucination, since the LLM itself possesses no real knowledge or intelligence. In fact, so much of what we call artificial intelligence at the moment is not yet intelligent and can be better described as machine learning. Given this, we should be careful not to put too much trust in LLMs and AI, especially in automated decision-making.

In other debates, I have shared my terrible customer experiences with an airline and a fintech company, both of which seemed to use automated decision-making, but I will not repeat them now. While they got away with it, poor automated decision-making in healthcare could be dangerous and even catastrophic, leading to deaths. We need to proceed with caution when using LLMs and other AI systems for automated decision-making, something that will be raised in debate on the Data (Use and Access) Bill. We also need to consider safeguards and, possibly, an immediate human back-up on site if something goes wrong.

These examples about the good and the bad highlight the two key principles in technology legislation and regulation. You have, on the one hand, the precautionary principle and, on the other, the innovation principle. Witnesses tended to portray the US approach, certainly at the federal level, as driven mostly by the large US tech companies, while the European Union’s AI Act was described as “prescriptive”, overly precautionary and “stifling innovation”. Witnesses saw this as an opportunity for the UK to continue to host world-leading companies but, as other noble Lords have said, we cannot delay. Indeed, the report calls for the UK Government and industry to prepare now to take advantage of the opportunities, as the noble Baroness, Lady Wheatcroft, said.

At the same time, the Government should guard against regulatory capture or rent seeking by the big players, who may lobby for regulations benefiting them at a cost to other companies. We also considered the range of, and trade-offs between, open and closed models. While open models may offer greater access and competition, they may make it harder to control the proliferation of dangerous capabilities. While closed models may offer more control and security, they may give too much power to the big tech companies. What is the Government’s current thinking on the range of closed and open models and those in between? Who are they consulting to inform this thinking?

The previous Government’s AI Safety Summit was welcomed by many, and we heard calls to address the immediate risks from LLMs, since malicious activities such as fake pictures and fake news become easier and cheaper with LLMs, as the noble Baroness, Lady Healy, said. As the noble Lords, Lord Knight and Lord Strasburger, said, witnesses told us about the catastrophic risks, which are defined as about 1,000 UK deaths and tens of billions in financial damages. They believed that these were unlikely in the next few years, but not impossible, as next-generation capabilities come online. Witnesses suggested mandatory safety tests for high-risk, high-impact models. Can the Minister explain the current Government’s thinking on mandatory safety tests, especially for the high-risk, high-impact models?

At the same time, witnesses warned against a narrative of AI being mostly a safety issue. They wanted the Government to speak more about innovation and opportunity, and to focus on the three pillars of AI. The first is data for training and evaluation; the second is algorithms and the talent to write, and perhaps to rewrite, them; and the third is computing power. As other noble Lords have said, witnesses criticised the current Government’s decision to scrap the exascale supercomputer announced by the previous Government. Can the Minister explain where he sees the UK in relation to each of the three pillars, especially on computing power?

As the noble Baroness, Lady Featherstone, and others have said, one of the trickiest issues we discussed was copyright. Rights holders want the power to check whether their data is used without their permission. At least one witness questioned whether this was technically possible. Some rights holders asked for investment in high-quality training datasets to encourage LLMs to use licensed material. In contrast, AI companies distinguished between inputs and outputs. For example, an input would be if you listened to lots of music to learn to play the blues guitar. AI companies argue that this is analogous to using copyrighted data for training. For them, an output would be if a musician plays a specific song, such as “Red House” by Jimi Hendrix. The rights holders would then be compensated, even though poor Jimi is long dead. However, rights holders criticised this distinction, arguing that it undermines the UK’s thriving creative sector, so you can see the challenge that we have. Can the Minister share the Government’s thinking on copyright material as training data?

For the overall regulatory framework, the Government have been urged to empower sector regulators to regulate proportionally, considering the careful balance and sometimes trade-off between innovation and precaution. Most witnesses recommended that the UK forge its own path on AI regulation—fewer big corporations than the US but more flexible than the EU. We should also be aware that technology is usually ahead of regulation. If you attempt too much a priori legislation, you risk stifling innovation. At the same time, no matter how libertarian you may be, when things go wrong voters expect politicians and regulators to act.

To end, can the Minister tell the House whether the Government plan to align with the US’s more corporate approach or the EU’s less innovative AI regulation, or to forge an independent path so that the UK can be a home for world-leading LLM and AI companies?

Finally, our Amendment 195 picks up another recommendation of the Delegated Powers and Regulatory Reform Committee, which relates to setting fees for people seeking entry or modifying details on the register of providers of verification services. The powers are given to the Secretary of State to set these fees with no reference to Parliament. The Delegated Powers Committee recommends that there should be parliamentary scrutiny using the negative procedure. We agree with this point and this is reflected in our amendment. I therefore beg to move Amendment 177.
Lord Kamall (Con)

My Lords, I speak in favour of Amendment 195ZA in my name and that of the noble Lords, Lord Vaux of Harrowden and Lord Clement-Jones, and Amendments 289 and 300 on digital identity theft. I am also very sympathetic to many of the points made by the noble Baroness, Lady Jones of Whitchurch, particularly about the most disadvantaged people in our society.

As many noble Lords know, I am a member of the Communications and Digital Committee of this House. A few months ago, we did a report on digital exclusion. We had to be quite clear about one of the issues that we found: even though some people may partly use digital—for example, they may have an email address—that does not make them digitally proficient or literate. We have to be very clear that, as more and more of our public and private services go online, it is obvious that companies and others will want to verify that the people using those services are who they claim to be. At the same time, a number of people will not be digitally literate or will not have a digital ID available. It is important that we offer them enough alternatives. It should not be beyond the wit of man, or of clever lawyers, to make clear that non-digital alternatives are available for consumers and particularly, as was said by the noble Baroness, Lady Jones of Whitchurch, for people from disadvantaged communities.

As we found in the report on our inquiry into digital exclusion, this does not concern only people from deprived areas. Sometimes people get by in life without much digital literacy. There are those who may be scared of it or who do not trust it, and they can come from all sorts of wealth brackets. This drives home the point that it is important to have an alternative. I cannot really say much more than the amendment itself; it does what it says on the tin. The amendment is quite clear and I am sure that the noble Lord, Lord Vaux, will speak to it as well.

I will briefly speak in favour of Amendments 289 and 300. Digital identity theft is clearly an issue and has been for a long time. Even before the digital days, identity theft was an issue, and it is so much easier to hack someone’s ID these days. I have had bank accounts opened in my name. I received a letter claiming this but, fortunately, the bank was able to deal with it when I walked in and said, “This wasn’t me”. It is quite clear that this will happen more and more. Sometimes, it will simply be because data has been leaked or because a system is not particularly secure; at other times, it will be because you have been careless. No matter why the crime is committed, it must be an offence in the terms suggested by the amendments of the noble Lord, Lord Clement-Jones. It is clear that we have to send a strong signal that digital identity theft is a crime and that people should be deterred from engaging in it.

Lord Vaux of Harrowden (CB)

My Lords, I have added my name to Amendment 195ZA—I will get to understand where these numbers come from, at some point—in the name of the noble Lord, Lord Kamall, who introduced it so eloquently. I will try to be brief in my support.

For many people, probably most, the use of online digital verification will be a real benefit. The Bill puts in place a framework to strengthen digital verification so, on the whole, I am supportive of what the Government are trying to do, although I think that the Minister should seriously consider the various amendments that the noble Baroness, Lady Jones of Whitchurch, has proposed to strengthen parliamentary scrutiny in this area.

However, not everyone will wish to use digital verification in all cases, perhaps because they are not sufficiently confident with technology or perhaps they simply do not trust it. We have already heard the debates around the advances of AI and computer-based decision-making. Digital identity verification could be seen to be another extension of this. There is a concern that Part 2 of the Bill appears to push people ever further towards decisions being taken by a computer.

I suspect that many of us will have done battle with some of the existing identity verification systems. In my own case, I can think of one bank where I gave up in deep frustration as it insisted on telling me that I was not the same person as my driving licence showed. I have also come up against systems used by estate agents when trying to provide a guarantee for my student son that was so intrusive that I, again, refused to use it.

Therefore, improving verification services is to be encouraged, but there must be some element of choice; if someone does not have the know-how, confidence or trust in the systems, they should be able to verify their identity through some non-digital alternative. They should not be barred from using relevant important services such as, in my examples, banking and renting a property because they cannot, or would prefer not to, use a digital verification service.

At the very least, even if the Minister is not minded to accept that amendment, I hope that he can make clear that the Government have no intention of making digital ID verification mandatory, as some have suggested Part 2 may be driving towards.

--- Later in debate ---
Lord Clement-Jones (LD)

Is the Minister saying that, as a result of the Equality Act, there is an absolute right to that analogue—if you like—form of identification if, for instance, someone does not have access to digital services?

Lord Kamall (Con)

I understand that some services are purely digital, but some of their users may well not have a digital ID. We do not know what future services there might be, so those users might want to show an analogue ID. Is my noble friend saying that that will not be possible because it will impose too much of a burden on those innovative digital companies? Could he clarify what he said?

Baroness Harding of Winscombe (Con)

My Lords, as is so often the case on these issues, it is daunting to follow the noble Baroness as she has addressed the issues so comprehensively. I speak in support of Amendment 57, to which I have added my name, and register my support for my noble friend Lord Holmes’s Amendment 59A, but I will begin by talking about the Clause 14 stand part notice.

Unfortunately, I was not able to stay for the end of our previous Committee session so I missed the last group on automated decision-making; I apologise if I cover ground that the Committee has already covered. It is important to start by saying clearly that I am in favour of automated decision-making and the benefits that it will bring to society in the round. I see from all the nodding heads that we are all in the same place—interestingly, my Whip is shaking his head. We are trying to make sure that automated decision-making is a force for good and to recognise that anything involving human beings—even automated decision-making does, because human beings create it—has the potential for harm as well. Creating the right guard-rails is really important.

Like the noble Baroness, Lady Kidron, until I understood the Bill a bit better, I mistakenly thought that the Government’s position was not to regulate AI. But that is exactly what we are doing in the Bill, in the sense that we are loosening regulation and the ability to make use of automated decision-making. While that may be the right answer, I do not think we have thought about it in enough depth or scrutinised it in enough detail. There are so few of us here; I do not think we quite realise the scale of the impact of this Bill and this clause.

I too feel that the clause should be removed from the Bill—not because it might not ultimately be the right answer but because this is something that society needs to debate fully and comprehensively, rather than it sneaking into a Bill that not enough people, either in this House or the other place, have really scrutinised.

I assume I am going to lose that argument, so I will briefly talk about Amendment 57. Even if the Government remain firm that there is “nothing to see here” in Clause 14, we know that automated decision-making can do irreparable harm to children. Those of us who have worked on child internet safety—most of us for at least a decade—regret that we failed to get greater protections in earlier. We know of the harm done to children because there have not been the right guard-rails in the digital world. We must have debated together for hours and hours why the harms in the algorithms of social media were not expressly set out in the Online Safety Act. This is the same debate.

It is really clear to me that it should not be possible to amend the use of automated decision-making to in any way reduce protections for children. Those protections have been hard fought and ensure a higher bar for children’s data. This is a classic example of where the Bill reduces that, unless we are absolutely explicit. If we are unable to persuade the Government to remove Clause 14, it is essential that the Bill is explicit that the Secretary of State does not have the power to reduce data protection for children.

Lord Kamall (Con)

My Lords, I speak in favour of the clause stand part notice in my name and that of the noble Lord, Lord Clement-Jones.

Lord Harlech (Con)

The noble Lord missed the start of the debate.

Lord Kamall (Con)

I apologise and thank the noble Lord for his collegiate approach.

Baroness Jones of Whitchurch (Lab)

My Lords, I thank all noble Lords who have contributed to this debate. We have had a major common theme, which is that any powers exercised by the Secretary of State in Clause 14 should be to enhance, rather than diminish, the protections for a data subject affected by automated decision-making. We have heard some stark and painful examples of the way in which this can go wrong if it is not properly regulated. As noble Lords have said, this seems to be regulation on automated decision-making by the backdoor, but with none of the protections and promises that have been made on this subject.

Our Amendment 59 goes back to our earlier debate about rights at work when automated decision-making is solely or partly in operation. It provides an essential underpinning of the Secretary of State’s powers. The Minister has argued that ADM is a new development and that it would be wrong to be too explicit about the rules that should apply as it becomes more commonplace, but our amendment cuts through those concerns by putting key principles in the Bill. They are timeless principles that should apply regardless of advances in the adoption of these new technologies. They address the many concerns raised by workers and their representatives, about how they might be disfranchised or exploited by machines, and put human contact at the heart of any new processes being developed. I hope that the Minister sees the sense of this amendment, which will provide considerable reassurance for the many people who fear the impact of ADM in their working lives.

I draw attention to my Amendments 58 and 73, which implement the recommendations of the Delegated Powers and Regulatory Reform Committee. In the Bill, the new Articles 22A to 22D enable the Secretary of State to make further provisions about safeguards when automated decision-making is in place. The current wording of new Article 22D makes it clear that regulations can be amended

“by adding or varying safeguards”.

The Delegated Powers Committee quotes the department saying that

“it does not include a power to remove safeguards provided in new Article 22C and therefore cannot be exercised to weaken the protections”

afforded to data subjects. The committee is not convinced that the department is right about this, and we agree with its analysis. Surely “vary” means that the safeguards can move in either direction—to improve or reduce protection.

The committee also flags up concerns that the Bill’s amendments to Sections 49 and 50 of the Data Protection Act make specific provision about the use of automated decision-making in the context of law enforcement processing. In this new clause, there is an equivalent wording, which is that the regulations may add or vary safeguards. Again, we agree with its concerns about the application of these powers to the Secretary of State. It is not enough to say that these powers are subject to the affirmative procedure because, as we know and have discussed, the limits on effective scrutiny of secondary legislation are manifest.

We have therefore tabled Amendments 58 and 73, which make it much clearer that the safeguards cannot be reduced by the Secretary of State. The noble Lord, Lord Clement-Jones, has a number of amendments with a similar intent, which is to ensure that the Secretary of State can add new safeguards but not remove them. I hope the Minister is able to commit to taking on board the recommendations of the Delegated Powers Committee in this respect.

The noble Baroness, Lady Kidron, once again made the powerful point that the Secretary of State’s powers to amend the Data Protection Act should not be used to reduce the hard-won standards and protections for children’s data. As she says, safeguards do not constitute a right, and having regard to the issues is a poor substitute for putting those rights back into the Bill. So I hope the Minister is able to provide some reassurance that the Bill will be amended to put these hard-won rights back into the Bill, where they belong.

I am sorry that the noble Lord, Lord Holmes, is not here. His amendment raises an important point about the need to build in the views of the Information Commissioner, which is a running theme throughout the Bill. He makes the point that we need to ensure, in addition, that a proper consultation of a range of stakeholders goes into the Secretary of State’s deliberations on safeguards. We agree that full consultation should be the hallmark of the powers that the Secretary of State is seeking, and I hope the Minister can commit to taking those amendments on board.

I echo the specific concerns of the noble Lord, Lord Clement-Jones, about the impact assessment and the supposed savings from changing the rules on subject access requests. This is not specifically an issue for today’s debate but, since it has been raised, I would like to know whether he is right that the savings are estimated to be 50% and not 1%, which the Minister suggested when we last debated this. I hope the Minister can clarify this discrepancy on the record, and I look forward to his response.

--- Later in debate ---
It is worth stressing that this amendment does not say how the Government should do this; it simply sets out the principle that the Government should do it. It is not open to the argument that the amendment should be drafted differently or approached in another way; it simply says that we should free the postal address file. I very much hope to hear positive words and a positive direction from the Minister on Amendment 252.
Lord Kamall (Con)

My Lords, I apologise for not being here on Monday, when I wanted to speak about automated decision-making. I was not sure which group to speak on today; I am thankful that my noble friend Lord Harlech intervened to ensure that I spoke on this group and made my choice much easier.

I want to speak on Amendments 74 to 77 because transparency is essential. However, one of the challenges about transparency is to ensure you understand what you are reading. I will give noble Lords a quick example: when I was in the Department of Health and Social Care, we had a scheme called the voluntary pricing mechanism for medicines. Companies would ask whether that could be changed and there could be a different relationship because they felt that they were not getting enough value from it. I said to the responsible person in the department, “I did engineering and maths, so can you send me a copy of the algorithm?” He sent it to me, and it was 100 pages long. I said, “Does anyone understand this algorithm?”, and he said, “Oh yes, the analysts do”. I was about to get a meeting, but then I was moved to another department. That shows that even if we ask for transparency, we have to make sure that we understand what we are being given. As the noble Lord, Lord Clement-Jones, has worded this, we have to make sure that we understand the functionality and what it does at a high enough level.

My noble friend Lady Harding often illustrates her points well with short stories. I am going to do that briefly with two very short stories. I promise to keep well within the time limit.

A few years ago, I was on my way to fly to Strasbourg because I was a Member of the European Parliament. My train got stuck, and I missed my flight. My staff booked me a new ticket and sent me the boarding pass. I got to the airport, which was fantastic, got through the gate and was waiting for my flight in the waiting area. They called to start boarding and, when I went to board, they scanned my pass again and I was denied boarding. I asked why I had been denied, having been let into the gate area in the first place, but no one could explain why. To cut a long story short, over two hours, four or five people from that company gaslighted me. Eventually, when I got back to the check-in desk, which the technology was supposed to avoid in the first place, it was explained that they had sent me an email the day before. In fact, they had not sent me an email the day before, which they admitted the day after, but no one ever explained why I was not allowed on that flight.

Imagine that in the public sector. I can accept it, although it was awful behaviour by that company, but imagine that happening for a critical operation that had been automated to cut down on paperwork. Imagine turning up for your operation when you are supposed to scan your barcode to be let into the operating theatre. What happens if there is no accountability or transparency in that case? This is why the amendments tabled by the noble Lord, Lord Clement-Jones, are essential.

Here is another quick story. A few years ago, someone asked me whether I was going to apply for one of these new fintech banks. I submitted the application and the bank said that it would get back to me within 48 hours. It did not. Two weeks later, I got a message on the app saying that I had been rejected, that I would not be given an account and that “by law, we do not have to explain why”.

Can you imagine that same technology being used in the public sector, with a WYSIWYG on the fantastic NHS app that we have now? Imagine booking an appointment then suddenly getting a message back saying, “Your appointment has been denied but we do not have to explain why”. These Amendments 74 to 78 must be given due consideration by the Government because it is absolutely essential that citizens have full transparency on decisions made through automated decision-making. We should not allow the sort of technology that was used by easyJet and Monzo in this case to permeate the public sector. We need more transparency—it is absolutely essential—which is why I support the amendments in the name of the noble Lord, Lord Clement-Jones.

Baroness Harding of Winscombe (Con)

My Lords, I associate myself with the comments that my noble friend Lord Kamall just made. I have nothing to add on those amendments, as he eloquently set out why they are so important.

In the spirit of transparency, my intervention enables me to point out, were there any doubt, who I am as opposed to the noble Baroness, Lady Bennett, who was not here earlier but who I was mistaken for. Obviously, we are not graced with the presence of my noble friend Lord Maude, but I am sure that we all know what he looks like as well.

I will speak to two amendments. The first is Amendment 144, to which I have added my name. As usual, the noble Baroness, Lady Kidron, has said almost everything that can be said on this but I want to amplify two things. I have yet to meet a politician who does not get excited about the two-letter acronym that is AI. The favoured statement is that it is as big a change in the world as the discovery of electricity or the invention of the wheel. If it is that big—pretty much everyone in the world who has looked at it probably thinks it is—we need properly to think about the pluses and the minuses of the applications of AI for children.

The noble Baroness, Lady Kidron, set out really clearly why children are different. I do not want to repeat that, but children are different and need different protections; this has been established in the physical world for a very long time. With this new technology that is so much bigger than the advent of electricity and the creation of the first automated factories, it is self-evident that we need to set out how to protect children in that world. The question then is: do we need a separate code of practice on children and AI? Or, as the noble Baroness set out, is this an opportunity for my noble friend the Minister to confirm that we should write into this Bill, with clarity, an updated age-appropriate design code that recognises the existence of AI and all that it could bring? I am indifferent on those two options but I feel strongly that, as we have now said on multiple groups, we cannot just rely on the wording in a previous Act, which this Bill aims to update, without recognising that, at the same time, we need to update what an age-appropriate design code looks like in the age of AI.

The second amendment that I speak to is Amendment 252, on the open address file. I will not bore noble Lords with my endless stories about the use of the address file during Covid, but I lived through and experienced the challenges of this. I highlight an important phrase in the amendment. Proposed new subsection (1) says:

“The Secretary of State must regularly publish a list of UK addresses as open data to an approved data standard”.


One reason why it is a problem for this address data to be held by an independent private company is that the quality of the data is not good enough. That is a real problem if you are trying to deliver a national service, whether in the public sector or the private sector. If the data quality is not good enough, it leaves us substantially poorer as a country. This is a fundamental asset for the country and a fundamental building block of our geolocation data, as the noble Lord, Lord Clement-Jones, set out. Anybody who has tried to build a service that delivers things to human beings in the physical world knows that errors in the database can cause huge problems. It might not feel like a huge problem if it concerns your latest Amazon delivery but, if it concerns the urgent dispatch of an ambulance, it is life and death. Maintaining the accuracy of the data and holding it close as a national asset is therefore hugely important, which is why I lend my support to this amendment.

Viscount Camrose (Con)

I am not aware one way or the other, but I will happily look into that to see what further safeguards we can add so that we are not bombarding people who are too young with this material.

Lord Kamall (Con)

May I make a suggestion to my noble friend the Minister? It might be worth asking the legal people to get the right wording, but if there are different ages at which people can vote in different parts of the United Kingdom, surely it would be easier just to relate it to the age at which they are able to vote in those elections. That would address a lot of the concerns that many noble Lords are expressing here today.

Lord Clement-Jones (LD)

My Lords, this whole area of democratic engagement is one that the Minister will need to explain in some detail. This is an Alice in Wonderland schedule: “These words mean what I want them to mean”. If, for instance, you are engaging with the children of a voter—at 14, they are children—is that democratic engagement? You could drive a coach and horses through Schedule 1. The Minister used the word “necessary”, but he must give us rather more than that. It was not very reassuring.

--- Later in debate ---
Viscount Camrose (Con)

My Lords, I rise to speak to a series of minor and technical, yet necessary, government amendments which, overall, improve the functionality of the Bill. I hope the Committee will be content if I address them together. Amendments 20, 42, 61 and 63 are minor technical amendments to references to special category data in Clauses 6 and 14. All are intended to clarify that references to special category data mean references to the scope of Article 9(1) of the UK GDPR. They are simply designed to improve the clarity of the drafting.

I turn now to the series of amendments that clarify how time periods within the data protection legal framework are calculated. For the record, these are Amendments 136, 139, 141, 149, 151, 152, 176, 198, 206 to 208, 212 to 214, 216, 217, 253 and 285. Noble Lords will be aware that the data protection legislation sets a number of time periods or deadlines for certain things to happen, such as responding to subject access requests; in other words, at what day, minute or hour the clock starts and stops ticking in relation to a particular procedure. The Data Protection Act 2018 expressly applies the EU-derived rules on how these time periods should be calculated, except in a few incidences where it is more appropriate for the UK domestic approach to apply, for example time periods related to parliamentary procedures. I shall refer to these EU-derived rules as the time periods regulation.

In response to the Retained EU Law (Revocation and Reform) Act 2023, we are making it clear that the time periods regulation continues to apply to the UK GDPR and other regulations that form part of the UK’s data protection and privacy framework, for example, the Privacy and Electronic Communications (EC Directive) Regulations 2003. By making such express provision, our aim is to ensure consistency and continuity and to provide certainty for organisations, individuals and the regulator. We have also made some minor changes to existing clauses in the Bill to ensure that application of the time periods regulation achieves the correct effect.

Secondly, Amendment 197 clarifies that the requirement to consult before making regulations that introduce smart data schemes may be satisfied by a consultation before the Bill comes into force. The regulations must also be subject to affirmative parliamentary scrutiny to allow Members of both Houses to scrutinise legislation. This will facilitate the rapid implementation of smart data schemes, so that consumers and businesses can start benefiting as soon as possible. The Government are committed to working closely with business and wider stakeholders in the development of smart data.

Furthermore, Clause 96(3) protects data holders from the levy that may be imposed to meet the expenses of persons and bodies performing functions under smart data regulations. This levy cannot be imposed on data holders that do not appear capable of being directly affected by the exercise of those functions.

Amendment 196 extends that protection to authorised persons and third-party recipients on whom the levy may also be imposed. Customers will not have to pay to access their data, only for the innovative services offered by third parties. We expect that smart data schemes will deliver significant time and cost savings for customers.

The Government are committed to balancing the incentives for businesses to innovate and provide smart data services with ensuring that all customers are empowered through their data use and do not face undue financial barriers or digital exclusion. Any regulations providing for payment of the levy or fees will be subject to consultation and to the affirmative resolution procedure in Parliament.

Amendments 283 and 285 to Schedule 15 confer a general incidental power on the information commission. It will have the implied power to do things incidental to or consequential upon the exercise of its functions, for example, to hold land and enter into agreements. This amendment makes those implicit powers explicit for the avoidance of doubt and in line with standard practice. It does not give the commission substantive new powers. I beg to move.

Lord Kamall (Con)

My Lords, I know that these amendments were said to be technical amendments, so I thought I would just accept them, but when I saw the wording of Amendment 283 some alarm bells started ringing. It says:

“The Commission may do anything it thinks appropriate for the purposes of, or in connection with, its functions”.


I know that the Minister said that this is stating what the commission is already able to do, but I am concerned whenever I see those words anywhere. They give a blank cheque to any authority or organisation.

Many noble Lords will know that I have previously spoken about the principal-agent theory in politics, in which certain powers are delegated to an agency or regulator, but what accountability does it have? I worry when I see that it “may do anything … appropriate” to fulfil its tasks. I would like some assurance from the Minister that there is a limit to what the information commission can do and some accountability. At a time when many of us are asking who regulates the regulators and when we are looking at some of the arm’s-length bodies—need I mention the Post Office?—there is some real concern about accountability.

I understand the reason for wanting to clarify or formalise what the Minister believes the information commission is doing already, but I worry about this form of words. I would like some reassurance that it is not wide-ranging and that there is some limit and accountability to future Governments. I have seen this sentiment across the House; people are asking who regulates the regulators and to whom are they accountable.

Lord Clement-Jones (LD)

My Lords, I must congratulate the noble Lord, Lord Kamall. Amid a blizzard of technical and minor amendments from the Minister, he forensically spotted one to raise in that way. He is absolutely right. The Industry and Regulators Committee has certainly been examining the accountability and scrutiny devoted to regulators, so we need to be careful in the language that we use. I think we have to take a lot on trust from the Minister, particularly in Grand Committee.

I apparently failed to declare an interest at Second Reading. I forgot to state that I am a consultant to DLA Piper and the Whips have reminded me today that I failed to do so on the first day in Committee, so I apologise to the Committee for that. I am not quite sure why my consultancy with DLA Piper is relevant to the data protection Bill, but there it is. I declare it.

Lord Kamall (Con)

I should also declare an interest. I apologise that I did not do so earlier. I worked with a think tank and wrote a series of papers on who regulates the regulators. I still have a relationship with that think tank.

Lord Bassam of Brighton (Lab)

My Lords, I have been through this large group and, apart from my natural suspicion that there might be something dastardly hidden away in it, I am broadly content, but I have a few questions.

On Amendment 20, can the Minister confirm that the new words “further processing” have the same meaning as the reuse of personal data? Can he confirm that Article 5(1)(b) will prohibit this further processing when it is not in line with the original purpose for which the data was collected? How will the data subject know that is the case?

On Amendment 196, to my untutored eye it looks like the regulation-making power is being extended away from the data holder to include authorised persons and third-party recipients. My questions are simple enough: was this an oversight on the part of the original drafters of that clause? Is the amendment an extension of those captured by the effect of the clause? Is it designed to achieve consistency across the Bill? Finally, can I assume that an authorised person or third party would usually be someone acting on behalf of an agent of the data holder?

I presume that Amendments 198, 212 and 213 are needed because of a glitch in the drafting—similarly with Amendment 206. I can see that Amendments 208, 216 and 217 clarify when time periods begin, but why are the Government seeking to disapply time periods in Amendment 253 when surely some consistency is required?

Finally—I am sure the Minister will be happy about this—I am all in favour of flexibility, but Amendment 283 states that the Information Commissioner has the power to do things to facilitate the exercise of his functions. The noble Lord, Lord Kamall, picked up on this. We need to understand what those limits are. On the face of it, one might say that the amendment is sensible, but it seems rather general and broad in its application. As the noble Lord, Lord Kamall, rightly said, we need to see what the limits of accountability are. This is one of those occasions.

Viscount Camrose (Con)

I thank the noble Lords, Lord Kamall and Lord Bassam, for their engagement with this group. On the questions from the noble Lord, Lord Kamall, these are powers that the ICO would already have at common law. As I am given to understand is now best practice with all Bills, they are put on a statutory footing in this Bill; the purpose is simply to align with that practice. It does not confer substantial new powers but clarifies the powers that the regulator already has. I can also confirm that the ICO was and remains accountable to Parliament.

Lord Kamall (Con)

I am sorry to intervene as I know that noble Lords want to move on to other groups, but the Minister said that the ICO remains accountable to Parliament. Will he clarify how it is accountable to Parliament for the record?

Viscount Camrose (Con)

The Information Commissioner is directly accountable to Parliament in that he makes regular appearances in front of Select Committees that scrutinise the regulator’s work, including progress against objectives.

The noble Lord, Lord Bassam, made multiple important and interesting points. I hope he will forgive me if I undertake to write to him about those; there is quite a range of topics to cover. If there are any on which he requires answers right away, he is welcome to intervene.

Lord Clement-Jones (LD)

My Lords, we are beginning rather a long journey—at least, it feels a bit like that. I will speak to Amendments 1, 5 and 288, and the Clause 1 stand part notice.

I will give a little context about Clause 1. In a recent speech, the Secretary of State said something that Julia Lopez repeated this morning at a conference I was at:

“The Data Bill that I am currently steering through Parliament with my wonderful team of ministers”—


I invite the Minister to take a bow—

“is just one step in the making of this a reality—on its own it will add £10 billion to our economy and most crucially—we designed it so that the greatest benefit would be felt by small businesses across our country. Cashing in on a Brexit opportunity that only we were prepared to take, and now those rewards are going to be felt by the next generation of founders and business owners in local communities”.

In contrast, a coalition of 25 civil society organisations wrote to the Secretary of State, calling for the Bill to be dropped. The signatories included trade unions as well as human rights, healthcare, racial justice and other organisations. On these Benches, we share the concerns about the government proposals. They will seriously weaken data protection rights in the UK and will particularly harm people from marginalised communities.

So that I do not have to acknowledge them at every stage of the Bill, I will now thank a number of organisations. I am slightly taking advantage of the fact that our speeches are not limited but will be extremely limited from Monday onwards—the Minister will have 20 minutes; I, the noble Baroness, Lady Jones, and colleagues will have 15; and Back-Benchers will have 10. I suspect we are into a new era of brevity, but I will take advantage today, believe me. I thank Bates Wells, Big Brother Watch, Defend Digital Me, the Public Law Project, Open Rights Group, Justice, medConfidential, Chris Pounder, the Data & Marketing Association, CACI, Preiskel & Co, AWO, Rights and Security International, the Advertising Association, the National AIDS Trust, Connected by Data and the British Retail Consortium. That is a fair range of organisations that see flaws in the Bill. We on these Benches agree with them and believe that it greatly weakens the existing data protection framework. Our preference, as we expressed at Second Reading, is that the Bill is either completely revised on a massive scale or withdrawn in the course of its passage through the Lords.

I will mention one thing; I do not think the Government are making any great secret of it. The noble Baroness, Lady Kidron, drew my attention to the Keeling schedule, which gives the game away, and Section 2(2). The Information Commissioner will no longer have to pay regard to certain aspects of the protection of personal data—all the words have been deleted, which is quite extraordinary. It is clear that the Bill will dilute protections around personal data processing, reducing the scope of data protected by the safeguards within the existing law. In fact, the Bill gives more power to data users and takes it away from the people the data is about.

I am particularly concerned about the provisions that change the definition of personal data and the purposes for which it can be processed. There is no need to redraft the definitions of personal data, research or the boundaries of legitimate interests. We have made it very clear over a period of time that guidance from the ICO would have been adequate in these circumstances, rather than a whole piece of primary legislation. The recitals are readily available for guidance, and the Government should have used them. More data will be processed, with fewer safeguards than currently permitted, as it will no longer meet the threshold of personal data, or it will be permitted under the new recognised legitimate interest provision, which we will debate later. That combination is a serious threat to privacy rights in the UK, and that is the context of a couple of our probing amendments to Clause 1— I will come on to the clause stand part notice.

As a result of these government changes, data in one organisation’s hands may be anonymous, while that same information in another organisation’s hands can be personal data. The factor that determines whether personal data can be reidentified is whether the appropriate organisational measures and technical safeguards exist to keep the data in question separate from the identity of specific individuals. That is a very clear decision by the CJEU; the case is SRB v EDPS, if the Minister is interested.

The ability to identify an individual indirectly with the use of additional information is due to the lack of appropriate organisational and technical measures. If the organisation had such appropriate measures, separating data into different silos, it would not be able to use the additional information to identify such an individual. The language of technical and organisational measures is used in the definition of pseudonymisation in Clause 1(3)(d), which refers to “indirectly identifiable” information. If such measures existed, the data would be properly pseudonymised, in which case it would no longer be indirectly identifiable.
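
The point about technical and organisational measures can be pictured with a minimal sketch, assuming an invented record and a secret key held in a separate silo: the direct identifier is replaced by a keyed token, so the data is re-identifiable only by whoever also holds that key, while any indirect identifiers left unaltered remain a risk.

```python
import hmac
import hashlib

# Pseudonymisation sketch with invented data. The direct identifier (a name)
# is replaced by a keyed token. Whoever holds only the pseudonymised records
# cannot link them back to a person; whoever also holds the secret key can.
# Keeping that key in a separate silo is the "organisational and technical
# measure" the argument turns on.

SECRET_KEY = b"held-by-a-separate-team-in-a-separate-silo"  # illustrative only

def pseudonymise(name: str) -> str:
    """Replace a direct identifier with a keyed, non-reversible token."""
    return hmac.new(SECRET_KEY, name.encode(), hashlib.sha256).hexdigest()[:12]

record = {"name": "Jane Doe", "postcode": "SW1A 1AA", "diagnosis": "asthma"}

pseudonymised = {
    "patient_token": pseudonymise(record["name"]),
    "postcode": record["postcode"],   # an indirect identifier left unaltered
    "diagnosis": record["diagnosis"],
}

print(pseudonymised)
# With other indirect identifiers such as the full postcode left in place,
# the record may still be indirectly identifiable - which is exactly the
# concern raised about data that is pseudonymised only in part.
```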

A lot of this depends on how data savvy organisations are, so those that are not well organised and do not have the right technology will get a free pass. That cannot be right, so I hope the Minister will respond to that. We need to make sure that personal data remains personal data, even if some may claim it is not.

Regarding my Amendment 5, can the Government explicitly confirm that personal data that is

“pseudonymised in part, but in which other indirect identifiers remain unaltered”

will remain personal data after this clause is passed? Can the Government also confirm that, if an assessment is made that some data is not personal data but that assessment is later shown to be incorrect, the data will have been personal data at all times and should be treated as such by controllers, processors and the Information Commissioner, about whom we will talk when we come to the relevant future clauses?

Amendment 288 simply asks the Government for an impact assessment. If they are so convinced that the definition of personal data will change, they should be prepared to submit to some kind of impact assessment after the Bill comes into effect. Those are probing amendments, and it would be useful to know whether the Government have any intention to assess what the impact of their changes to the Bill would be if they were passed. More importantly, we believe broadly that Clause 1 is not fit for purpose, and that is why we have tabled the clause stand part notice.

As we said, this change will erode people’s privacy en masse. The impacts could include more widespread use of facial recognition and an increase in data processing with minimal safeguards in the context of facial recognition, as the threshold for personal data would be met only if the data subject is on a watchlist and therefore identified. If an individual is not on a watchlist and images are deleted after checking it, the data may not be considered personal and so would not qualify for data protection obligations.

People’s information could be used to train AI without their knowledge or consent. Personal photos scraped from the internet and stored to train an algorithm would no longer be seen as personal data, as long as the controller does not recognise the individual, is not trying to identify them and will not process the data in such a way that would identify them. The police would have increased access to personal information. Police and security services will no longer have to go to court if they want access to genetic databases; they will be able to access the public’s genetic information as a matter of routine.

Personal data should be defined by what type of data it is, not by how easy it is for a third party to identify an individual from it. That is the bottom line. Replacing a stable, objective definition that grants rights to the individual with an unstable, subjective definition that determines the rights an individual has over their data according to the capabilities of the processor is illogical, complex, bad law-making. It is contrary to the very premise of data protection law, which is founded upon personal data rights. We start on the wrong foot in Clause 1, and it continues. I beg to move.

Lord Kamall (Con)

My Lords, I rise to speak in favour of Amendments 1 and 5 in this group and with sympathy towards Amendment 4. The noble Lord, Lord Clement-Jones, will remember when I was briefly Minister for Health. We had lots of conversations about health data. One of the things we looked at was a digitised NHS. It was essential if we were to solve many problems of the future and have a world-class NHS, but the problem was that we had to make sure that patients were comfortable with the use of their data and the contexts in which it could be used.

When we were looking to train AI, it was important that we made sure that the data was as anonymous as possible. For example, we looked at things such as synthetic and pseudonymised data. There is another point: having done the analysis and looked at the dataset, if you see an identifiable group of people who may well be at risk, how can you reverse-engineer that data perhaps to notify those patients that they should be contacted for further medical interventions?

I know that that makes it far too complicated; I just wanted to rise briefly to support the noble Lord, Lord Clement-Jones, on this issue, before the new rules come in next week. It is essential that the users, the patients—in other spheres as well—have absolute confidence that their data is theirs and are given the opportunity to give permission or opt out as much as possible.

One of the things that I said when I was briefed as a Health Minister was that we can have the best digital health system in the world, but it is no good if people choose to opt out or do not have confidence. We need to make sure that the Bill gives those patients that confidence where their data is used in other areas. We need to toughen this bit up. That is why I support Amendments 1 and 5 in the name of the noble Lord, Lord Clement-Jones.

Lord Davies of Brixton (Lab)

My Lords, anonymisation of data is crucially important in this debate. I want to see, through the Bill, a requirement for personal data, particularly medical data, to be held within trusted research environments. This is a well-developed technique and Britain is the leader. It should be a legal requirement. I am not quite sure that we have got that far in the Bill; maybe we will need to return to the issue on Report.

The extent to which pseudonymisation—I cannot say it—is possible is vastly overrated. There is a sport among data scientists of being able to spot people within generally available datasets. For example, the data available to TfL through people’s use of Oyster cards and so on tells you an immense amount of information about individuals. Medical data is particularly susceptible to this, although it is not restricted to medical data. I will cite a simple example from publicly available data.
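
The kind of re-identification described here typically works by linking quasi-identifiers across datasets. The sketch below uses entirely invented data to show the mechanism; it is not the example cited in the debate.

```python
# Re-identification by linkage, with invented data. A "de-identified" travel
# log keeps no names, but the combination of home area and usual departure
# time (quasi-identifiers) can be matched against an auxiliary dataset that
# does carry names, re-identifying the individual.

travel_log = [   # names stripped, but travel patterns remain
    {"home_area": "Brixton", "usual_departure": "07:45", "station": "Victoria"},
    {"home_area": "Camden", "usual_departure": "09:10", "station": "Euston"},
]

auxiliary = [    # e.g. details people have shared publicly elsewhere
    {"name": "A. Resident", "home_area": "Brixton", "usual_departure": "07:45"},
]

def link(records, aux):
    """Match de-identified records to named ones on shared quasi-identifiers."""
    matches = []
    for r in records:
        for a in aux:
            if (r["home_area"], r["usual_departure"]) == (a["home_area"], a["usual_departure"]):
                matches.append((a["name"], r))
    return matches

for name, record in link(travel_log, auxiliary):
    print(f"{name} is probably the person behind {record}")
```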

Medical Research Techniques

Lord Kamall Excerpts
Monday 18th March 2024

Lords Chamber
Viscount Camrose (Con)

First, let me pay tribute to the work of the NC3Rs, which is an extremely important body. Nobody feels comfortable doing a lot of animal tests; they simply are necessary for human safety in too many cases. For example, UK REACH follows the last-resort principle where, as far as possible, it is able to waive animal tests for chemicals. That kind of work will further accelerate the work of the NC3Rs.

Lord Kamall (Con)

My Lords, the noble Baroness, Lady Bennett, spoke about other countries that were looking at alternatives to animal testing. What conversations has my noble friend’s department had with other countries on how they can encourage more alternatives to animal testing?

Viscount Camrose (Con)

DSIT continues to engage on life sciences research with a wide range of other countries, including countries that have tried to accelerate further. The Netherlands and the United States, in particular, have not always succeeded in their recent goals of accelerating the date by which non-animal methods of research become the only way forward. On the other hand, steady progress towards the greater use of non-animal methods through the three Rs seems to be bearing fruit, albeit not as fast as anybody would like.

Combating Disinformation: Freedom of Expression

Lord Kamall Excerpts
Tuesday 13th February 2024

Lords Chamber
Viscount Camrose (Con)

Well, the Government are clear, as is NSOIT, that disinformation refers to the deliberate attempt to mislead by placing falsehoods into the information environment. As part of the Civil Service, NSOIT would have robust internal measures to verify and check its own work, and indeed it reports regularly across government and to Ministers.

Lord Kamall Portrait Lord Kamall (Con)
- View Speech - Hansard - -

My Lords, can my noble friend the Minister explain what guidance is given to the unit to distinguish between disinformation and difference of opinion?

Viscount Camrose Portrait Viscount Camrose (Con)
- View Speech - Hansard - - - Excerpts

Disinformation is a deliberate falsehood. A difference of opinion is generally something of democratic importance or of journalistic or pluralistic importance, which it is very important to protect and which the Online Safety Act took very considerable measures to safeguard over its passage.

I think the department’s letter to my friends in another place admitted, even while defending the argument, that it was still a straightforward JR. I am afraid that, to me, this is not such a review. In a judicial review, if you put in the word “appropriate”, the challenge can ask whether some relevant fact has been left out, or whether someone has acted unreasonably or made a material error of fact. Those are, as I understand it, judicial review-type challenges. They are not a matter of saying, “You could have achieved your objective in a way that would impose fewer burdens on us”. I support the noble Lord’s amendment.

Lord Kamall Portrait Lord Kamall (Con)
- Hansard - -

My Lords, I have not put my name to these amendments but I want to speak in favour of Amendments 16, 17 and others in this group. After the first day of Committee, which I sat through without speaking, one noble Baroness came up to me and said I was unusually quiet—“unusually” being the key word there. When another noble Lord asked me why I sat through proceedings without saying a word, I said I had once been told about the principle that I should speak only if it improves the silence. Given the concern for my welfare shown by those two noble Members, I am about to violate that principle by making a few remarks and asking a couple of questions.

As this is the first time for me to speak in Committee, I refer noble Lords to my interests as set out in the register. These include being an unpaid member of the advisory board of Startup Coalition and a non-executive director for the Department for Business and Trade. I have also worked with a couple of think tanks and have written on regulation and competition policy, and I am a professor of politics and international relations at St Mary’s University. I mention that last role because in future interventions I will refer to some political science theories, but I assure noble Lords that I will try not to bore them. I am also a member of the Communications and Digital Committee.

I want to make only a short intervention on the amendments. Previous noble Lords made the point that we want to understand the Government’s intention behind deciding to change the word from “appropriate” to “proportionate”. I am grateful to my noble friend Lord Lansley for seeking to answer that question. I am not a lawyer, so I am very grateful to the noble Lord, Lord Faulks, for his intervention, which explained the legal context for “proportionate”. It has to be said, however, that at Second Reading I and a number of other noble Lords repeatedly asked the Minister to clarify and justify the change in wording. A satisfactory answer was not given, hence we see these amendments in Committee.

We could argue that this is an entirely appropriate response to what my noble friend said in Committee. Maybe the Government could argue that it was a proportionate response. It is a very simple question: can the Minister explain the reasons? Is it, as my noble friend Lord Lansley says, that there is something wider in “proportionate” than “appropriate”? Will the Government consider bringing forward an amendment that explains this—sort of “appropriate-plus”—to make sure that it is legally well understood? Can the Government assure us that it is not a loophole to allow more movement towards a merits appeal, as opposed to judicial review, which many of us have come to support?

I have some support for Amendment 222, in the name of my noble friend Lord Holmes, which seeks clarity on the appeal standards for financial penalties and countervailing benefits, but I know we will discuss these in a later group.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, this has been a really interesting and helpful debate, with a number of noble Lords answering other noble Lords’ questions, which is always pretty useful when you are summing up at the end. One thing absolutely ties every speaker together: agreement with the letter to the Prime Minister from the noble Baroness, Lady Stowell, on behalf of her committee, about the need to retain the JR principle throughout the Bill. That is what we are striving to do.

It was extremely interesting to hear what the noble Lord, Lord Lansley, had to say. He answered the second half of the speech by the noble Lord, Lord Tyrie. I did not agree with the first half but the second was pretty good. The “whiff” that the noble Lord, Lord Tyrie, talked about was answered extremely well by the noble Lord, Lord Lansley. It was a direct hit.

The interesting aspect of all this is that the new better regulation framework that I heard the noble Lord, Lord Johnson, extolling from the heights in the Cholmondeley Room this afternoon includes a number of regulatory principles, including proportionality, but why not throw the whole kitchen sink at the Bill? Why is there proportionality in this respect? It was also really interesting to hear from the noble Lord, Lord Faulks, who unpacked very effectively the use of the proportionality principle. It looks as though there is an attempt to expand the way the principle is prayed in aid during a JR case. That seems fairly fundamental.

I hope that the Minister can give us assurance. We have a pincer movement here: there are a number of different ways of dealing with this, in amendments from the noble Lords, Lord Holmes and Lord Faulks, and the noble Baroness, Lady Stowell, but we are all aiming for the same end result. However we get there, we are all pretty determined to make sure that the word “proportionate” does not appear in the wrong place. In all the outside briefings we have had, from the Open Markets Institute, Foxglove and Which?, the language is all about unintended consequences and widening the scope of big tech firms to challenge. What the noble Lord, Lord Vaizey, had to say about stray words was pretty instructive. We do not want language in here which opens up these doors to further litigation. The debate on penalties is coming, but let us hold fast on this part of the Bill as much as we possibly can.

--- Later in debate ---
Lord Lansley Portrait Lord Lansley (Con)
- Hansard - - - Excerpts

My Lords, in my short contribution I will look at what Clause 29 adds and whether it is necessary. I suppose I am saying that I want to speak to whether Clause 29 should stand part. We might have to come back to that.

My starting point was Clause 19(10):

“Before imposing a conduct requirement … on a designated undertaking, the CMA must have regard in particular to the benefits for consumers”.


Unless I am missing something, that will include disbenefits, so the countervailing benefits form part of that consideration. I do not understand why it would not be the best drafting, or the best Explanatory Note, to say, “Under Clause 19, when the CMA is considering imposing a conduct requirement, it must have regard to any countervailing benefits of not imposing such a conduct requirement”.

That is the starting point but let us say, for the purpose of the argument, that Clause 29 is not really about the imposition of a conduct requirement in the first place but about what should happen when there is a conduct investigation. But there are more stages for the designated undertaking. When the CMA wants to impose a conduct requirement, it has to give a notice under Clause 21 and say what the benefits are. The undertaking can come along and say, “Well, we have countervailing benefits if you don’t do this”, so it is entirely open at that stage to raise the countervailing benefits clause. I do not know why it is called an exemption. It is not an exemption. There should not be an exemption from the regime; there should just be a balance: how is the consumer benefit to be maximised? Once that notice has been served, it is subject to a public consultation under Clause 24, and the undertaking can come along under Clause 24.

Let us say that all that has happened, and there is a potential breach of the conduct requirement, and the CMA initiates an investigation under Clause 26. When the CMA does that, it has to give the opportunity to make representations within a defined period. Even if the countervailing benefits have not been taken into account in the original activity, when a breach is considered the notice is issued and the undertaking can come along and say, “Well, actually, the consumer benefits are being delivered by this means, and it is necessary and indispensable”, or whatever word you use. We could include it, if necessary, in the guidance.

I do not think that we are quite finished, even then. Clause 27 requires that in the

“undertaking to which a conduct investigation relates … the CMA must consider any representations that the undertaking makes”.

We could have put it in there, because it has a right to make representations at that point.

After all these stages, which get us to the point where the benefits have been considered in the first place, considered in deciding whether a notice of a breach should be issued and considered in the notice for the conduct investigation, and where the undertaking has been given the opportunity to make representations, why do we need another clause saying that there is this thing called a countervailing benefits exemption, as distinct from the benefits, disbenefits and potential consumer benefits of different requirements that fall to be considered at each of the many previous stages? Frankly, I do not see it—unless it is, as my noble friend said, that there is a “get out of jail free” card that can be played. If it can be played, it will be played, so I do not think that we should allow it to be played.

Lord Kamall Portrait Lord Kamall (Con)
- Hansard - -

My Lords, I will speak to Amendments 36, 38, 39, 40 and 41. I have been trying to understand the reason for the current government position. One issue that I have thought about, and which I have written about in the past, is the notion of unintended consequences. Often a well-intended government intervention can make things worse. Many noble Lords will remember the example of the Government of the 1990s introducing the dash for diesel, as it was supposed to be better for the environment—and, in the event, we found that it actually made things worse. That is not to criticise the Government of the day, as it was well-intentioned, and many people supported the reduction of greenhouse gases.

One thing that I have thought about with regard to better law-making is how we ensure that there are safeguards in place for when there are negative unintended consequences. For that reason, I have some sympathy for considering whether the unintended consequence of a CMA decision could make things worse for consumers. However, like many noble Lords I am concerned that this is a massive loophole for large tech companies to continue to engage in anti-competitive behaviour or, as other noble Lords have said, slow down the process.

Having looked at the amendments and the Government’s position, I want to ask my noble friend the Minister a direct question. Could he explain what the Government mean by countervailing benefits and give some real examples, or hypothetical examples, of where consumers may be harmed by a pro-competitive intervention by the CMA? If that response convinces noble Lords, perhaps the Government could consider bringing forward an amendment based on Amendment 41 from the noble Lord, Lord Clement-Jones. I look forward to my noble friend the Minister’s response.

Lord Bishop of Manchester Portrait The Lord Bishop of Manchester
- Hansard - - - Excerpts

My Lords, I shall be extremely brief. When we debate in Grand Committee, it always strikes me that we do so in the Moses Room—Moses, the great giver of the law. However, the biblical characters I am thinking of more today would be David fighting Goliath, because it seems to me that a lot of the conversation around this group of amendments is about how we create a proper balance between the large platforms and small entrepreneurial providers. My mother was a small businesswoman; she ran two record shops in the Greater Manchester area. We could have been put out of business very easily if somebody had been able to delay action against anti-competitive behaviour directed at us. We also have the judgment of Solomon here; he was quick in his judgment—there were no lengthy processes that took for ever and a day. I tend to the view that the Bill, as it entered the House of Commons, was probably at about the sweet spot, but let us get this right so that Davids have a chance amid the Goliaths. And yes, I apologise for not declaring that interest—I am called David.

--- Later in debate ---
Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I said that the purpose of Clause 19(5) is to set the parameters for the design of conduct requirements by the CMA. Its purpose is to guide the CMA, not to bind the recipients of conduct requirements.

Amendment 48 from the noble Baroness, Lady Jones of Whitchurch, would allow the final offer mechanism tool to be used earlier in the enforcement process. The final offer mechanism is a backstop tool designed to incentivise sincere negotiations about fair and reasonable payment terms between the SMS firm and third parties. It is crucial that there is room for good faith negotiation where disputes arise from sincere differences of understanding rather than deliberate non-compliance. Overly shortening the enforcement process would greatly reduce these opportunities.

We recognise, however, that some stakeholders may be concerned about SMS firms frustrating the process and refusing to comply with these conduct requirements and any subsequent enforcement. Here, the CMA could seek to accelerate the stages before the final offer mechanism, making use of urgent deadlines for compliance with enforcement orders and significant financial penalties where appropriate, ensuring that parties will also not be able to drag their feet and delay the process. In addition, interim enforcement orders can be issued on a temporary basis during a conduct investigation, before a breach has been found. They could be used to prevent significant damage, such as a company going bust, to prevent conduct that would reduce the effectiveness of future remedies, or to protect the public interest. Our regime aims to tackle the far-reaching power of the most powerful tech firms.

I know that my noble friend Lord Black noted the Australian legislation. Our regime contrasts with the Australian legislation in that it has been designed to protect businesses and consumers across the economy, including, but not limited to, news publishers. Alongside the final offer mechanism, the DMU will have other powers to tackle unfair and unreasonable payment terms via conduct requirements, ensuring that the final offer mechanism will rarely, if ever, need to be used.

Amendments 49, 50 and 51 from the noble Lord, Lord Clement-Jones, would allow parties to submit further final offers if the CMA considers that the first were not fair and reasonable. The final offer mechanism involves a binary choice between the two final offers submitted by the parties. It is the finality of the process that creates such a strong incentive for the parties to submit fair and reasonable offers. An unreasonable offer only increases the likelihood of the CMA choosing the other party’s proposal.

Introducing scope for an additional round of bidding would undermine these incentives and would only serve to delay the securing of fair and reasonable terms for the third party. As a result, we hope, for the reasons set out, that the noble Lord feels able not to press these amendments.

Finally, this group includes two government amendments, which are both minor and technical in nature, relating to Clauses 38 and 117. These amendments clarify that digital content is included in the meaning of the phrase “goods or services” when used in Part 1 of the Bill, including when mentioned under the final offer mechanism. I hope that noble Lords will support these amendments.

Lord Kamall Portrait Lord Kamall (Con)
- Hansard - -

I apologise—I should have maybe intervened earlier but I did not want to join the barrage, as it were. When my noble friend the Minister writes to us, as he inevitably will, I wonder whether he can help us to understand the Government’s position on countervailing benefits by outlining what they really mean by that and giving some real or hypothetical examples of where consumers may be harmed by a pro-competitive intervention by the CMA.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

Yes, indeed. I thank my noble friend for repeating the question and I apologise that I did not get to it earlier. I would be delighted to write and provide such examples.

Lord Kamall Portrait Lord Kamall (Con)
- View Speech - Hansard - -

My Lords, it is a pleasure to follow the previous speakers, including my noble friend the Minister, the other Front-Benchers and the noble Baroness, Lady Kidron.

I start by thanking the House of Lords Library for its briefing—it was excellent, as usual—and the number of organisations that wrote to noble Lords so that we could understand and drill down into some of the difficulties and trade-offs we are going to have to look at. As with most legislation, we want to get the balance right between, for example, a wonderful environment for commerce and the right to privacy and security. I think that we in this House will be able to tease out some of those issues and, I hope, get a more appropriate balance.

I refer noble Lords to my interests as set out in the register. They include the fact that I am an unpaid adviser to the Startup Coalition and have worked with a number of think tanks that have written about tech and privacy issues in the past.

When I look at the Bill at this stage, I think that there are bits to be welcomed, bits that need to be clarified and bits that raise concern. I want to touch on a few of them before drilling down—I will not drill down into all of them, because I am sure that noble Lords have spoken or will speak on them, and we will have much opportunity for further debate.

I welcome Clause 129, which requires social media companies to retain information linked to a child suicide. However, I understand and share the concern of the noble Baroness, Lady Kidron, that this seems to be the breaking of a promise. The fact is that this was supposed to be about much more data and harms to children and how we can protect our children. In some ways, we must remember the analogy about online spaces: when we were younger, before the online age, our parents were always concerned about us when we went beyond the garden gate; nowadays, we must look at the internet and the computers on our mobile devices as that garden gate. When children leave that virtual garden gate and go through into the online world, we must ask whether they are safe, in the same way that my parents worried about us when, as children, we went through our garden gate to go out and play with others.

Clauses 138 to 141, on a national underground asset register, are obviously very sensible; that proposal is probably long overdue. I have questions about the open electoral register, in particular the impact on the direct marketing industry. Once again, we want to get the balance right between commerce and ease of doing business, as my noble friend the Minister said, and the right to privacy.

I have concerns about Clauses 147 and 148 on abolishing the offices of the Biometrics Commissioner and the Surveillance Camera Commissioner. I understand that the responsibilities will be transferred but, in thinking about the legislation that we have been talking about in this place—such as the Online Safety Act—I wonder about the number of powers that we are giving to these regulators and whether they will have the bandwidth for them. Is there really a good reason for abolishing these two commissioners?

I share the concerns of the noble Lord, Lord Knight, about access to bank accounts. Surely people should have the right to know why their bank account has been accessed, and there should be some protection so that not just anyone can access it. I know that it is not just anyone, but there are concerns about this, and the rules need to be clearer for people.

I have talked to the direct marketing industry. It sees the open electoral register as a valuable resource for businesses in understanding and targeting customers. However, it tells me that a recent court case between Experian and the ICO has introduced some confusion on the use of the register for business purposes. It is concerned that the Information Commissioner’s Office’s interpretation, requiring notification to every individual for every issue, presents challenges that could cost the industry millions and make the open electoral register unusable for it, perhaps pushing businesses to rely more on large tech companies. However, I understand that, at the same time, this may well be an issue where there are clear concerns about privacy.

Where there is no harm, I would like to understand the Government’s thinking on some of that—whether it is going too far or whether some clarification is needed in this area. Companies say they will be unable to target prospective customers; some of us may like that, but we should also remember that there is Clause 116 on unlawful direct marketing. The concern for many of us is that while it is junk if we do not want it, sometimes we do respond to someone’s direct marketing. I wonder how we get that balance right; I hope we can tease some of that out. If the Government agree with the interpretation and restrictions on the direct marketing industry, I wonder whether they can explain some of the reasons behind it. There may very well be good reasons.

I also want to look at transparency and data usage, not just for AI but more generally. It is clear from the Government’s own AI White Paper that we want a pro-innovation approach to regulation, but we are also calling for transparency at a number of levels: of datasets and of algorithms. To be honest, even if we are given that transparency, do we have the ability to understand those algorithms and datasets? We still need that transparency. I am concerned about undermining that principle, and particularly about weakening subject access requests.

I am also interested in companies that, say, have used your data but have refused an application and then tell you that they do not have to tell you why they refused that application. Perhaps this is too much of a burden on companies, but I wonder whether we have a right to know which data was being accessed when that decision was made. I will give a personal example; about a year ago, I applied for an account with a very clever online bank and was rejected. It told me I would have a decision within 48 hours; I did not. Two weeks later, I got a message on the app that said I had been rejected and that under the law it did not have to tell me why. I wrote to it and said, “Okay, you don’t have to tell me why, but could you delete all the data you have on me—what I put in?”. It said, “Oh, we don’t have to delete it until a certain time”. If we really own that data, I wonder whether there should be more of an expectation on companies to explain what data and information they have to make those decisions, which can be life changing for many people. We have heard all sorts of stories about access to bank accounts and concerns about digital exclusion.

We really have to think about how much access individuals can have to the data that is used to refuse them, but also to the data when they leave a service or stop being a user. I also want to make sure that there is accountability. I want to know, on Clause 12, what a “reasonable and proportionate search” means, particularly when data is processed by law enforcement and the intelligence services. I think we need further clarification on some of this for our reassurance.

We also have to recognise that, if we look at the online environment of the last 10, 15 or 20 years, at first we were very happy to give our data away to social media companies because we thought we were getting a free service, connecting with friends across the world et cetera. Only later did we realise that the companies were using this data and monetising it for commercial purposes. There is nothing wrong with that in itself, but we have to ask whose data it is. Is it my data? Does the company own it? For those companies that think they own it, why do they think that? We need some more accountability, to make sure that we understand which data we own and which we give away. Once again, the same thing might happen—you might stop being a user or customer of a service, or you might be rejected, but it is not clear what then happens to your data.

As an academic, I recognise the need for greater access to data, particularly for online research. I welcome some of the mechanisms in the Online Safety Act that we debated. Does my noble friend the Minister believe that the Bill sufficiently addresses the requirements and incentives for large data holders to hold data for academic research with all the appropriate safeguards in place? I wonder whether the Minister has looked at some of the proposals to allow this to happen more, perhaps with the Information Commission acting as an intermediary for datasets et cetera. Once again, I am concerned about giving even more power to the Information Commission and whether it will have the bandwidth to do all this, on top of all the powers we are giving it.

On cookie consent, I understand the annoyance of cookies. I remember the debates about cookie consent when I was in the European Parliament, but at the time we supported it because we thought it was important for users to be told what was being done with their information. It has become annoying, just like those text messages when we go roaming; I supported that during the roaming debates in the European Parliament because I did not want users to say they were not warned about the cost of roaming. The problem is that they become annoying; people ignore them and tick things on terms and conditions without having read them because they are too long.

When it comes to some of the cookies, I like the idea about exemptions for prior consent—a certain opt-out where there is no real harm—but I wonder whether it could be extended, for example so that cookies to understand the performance of advertising and to help companies understand the effectiveness of advertisements are exempt from the consent requirements. I do not think this would fundamentally change the structure of the Bill, but I wonder whether we have the right balance here on harm, safety and the ability of companies to test the effectiveness of some of their direct marketing. Again, I am just interested in the Government’s thinking about the balance between privacy and commerce.

Like other noble Lords, I share concerns about the powers granted to the Secretary of State. I think they lack the necessary scrutiny and safeguards, and that there is a risk of undermining the operations of online content and service providers that rely on these technologies. We need to see some strengthening here and more assurances.

I have one or two other concerns. The Information Commissioner has powers to require people to attend interviews as part of an investigation; that seems rather Big Brother-ish to me, and I am not sure whether the Information Commissioner would want these abilities, but there might be good reasons. I just want to understand the Government’s thinking on this.

I know that on Report in the other place, both Dawn Butler MP and David Davis MP raised concerns about retaining the right to use non-digital verification systems. We all welcome verification systems, but the committee I sit on—the Communications and Digital Committee—recently wrote a report on digital exclusion. We are increasingly concerned about digital exclusion and people having a different level of service because they are digitally excluded. I wonder what additional assurances the Minister can give us on some of those issues. The Minister in the other place said:

“Individual choice is integral … digital verification services can be provided only at the request of the individual”.—[Official Report, Commons, 29/11/23; col. 913.]


I think that any further verification would be really important.

The last point I turn to is EU adequacy. Let me be quite clear: I do not believe in divergence for the sake of divergence, but at the same time I do not believe in convergence or harmonisation for the sake of convergence and harmonisation. We used to have these debates in the European Parliament all the time. There are those expressing concerns about EU data adequacy, and we have to split them into two groups: one is those who really still wish we were members of the EU; the other is those for whom that is irrelevant and for whom this really is about the privacy and security of our users. If the EU is raising these issues in its agreements, we can thank it for doing that.

I was obviously involved in the debates on Safe Harbour and the Privacy Shield. As noble Lords have said, we thought we had the right answer; the Commission thought we had the answer, but it was challenged in the courts. I think this will be challenged further. Are we diverging just for the sake of divergence, or is there a good reason to diverge here, particularly when concerns have already been raised about security and privacy?

I end by saying that I look forward to the maiden speech of the noble Lord, Lord de Clifford. I thank noble Lords for listening to me, and I look forward to working with noble Lords across the House on some of the issues I have raised.

Advanced Research and Innovation Agency

Lord Kamall Excerpts
Thursday 29th June 2023

(1 year, 5 months ago)

Lords Chamber
Viscount Camrose Portrait Viscount Camrose (Con)
- View Speech - Hansard - - - Excerpts

Indeed—again, the point is well taken. We cannot have these types of organisations existing in separate universes and not talking to each other. It is crucial that they exploit their complementarity in this way.

Lord Kamall Portrait Lord Kamall (Con)
- View Speech - Hansard - -

My Lords, we are all very supportive of ARIA, but the important issue is the innovation principle and embedding that principle across government in all departments. Defra published five environmental principles—integration, prevention, rectification at source, polluter pays and precautionary—but there was no innovation principle. It is essential that we see the innovation principle right across government.

Viscount Camrose Portrait Viscount Camrose (Con)
- View Speech - Hansard - - - Excerpts

Indeed. As set out in the ARIA Act, ARIA is required to observe three principles that come under the broad heading of innovation: contributing to the economic growth of the UK; promoting scientific innovation in the UK; and improving the quality of life of everyone in the UK.