Data Protection and Digital Information Bill Debate
Viscount Camrose (Conservative - Excepted Hereditary)
Grand Committee

My Lords, I rise to speak to Amendments 11, 12, 13, 14, 15, 16, 17 and 18 and to whether Clauses 5 and 7 should stand part of the Bill. In doing so, I thank the noble Lord, Lord Clement-Jones, and the noble Baronesses, Lady Jones and Lady Kidron, for their amendments. The amendments in the group, as we have heard, relate to Clauses 5 and 7, which make some important changes to Article 6 of the UK GDPR on the lawfulness of processing.
The first amendment in the group, Amendment 11, would create a new lawful ground, under Article 6(1) of UK GDPR, to enable the use of personal data published by public bodies with a person’s consent and to enable processing by public bodies for the benefit of the wider public. The Government do not believe it would be necessary to create additional lawful grounds for processing in these circumstances. The collection and publication of information on public databases, such as the list of company directors published by Companies House, should already be permitted by existing lawful grounds under either Article 6(1)(c), in the case of a legal requirement to publish information, or Article 6(1)(e) in the case of a power.
Personal data published by public bodies can already be processed by other non-public body controllers where their legitimate interests outweigh the rights and interests of data subjects. However, they must comply with the requirements of the data protection legislation in relation to that personal data, including requirements to process personal data fairly and transparently. I am grateful to the noble Lord, Lord Clement-Jones, for setting out where he thinks the gaps are, but I hope he will accept my reassurances that this processing should already be possible under the existing legislation and will agree to withdraw the amendment.
On Clause 5, the main objective is to introduce a new lawful ground under Article 6(1) of the UK GDPR, known as “recognised legitimate interests”. It also introduces a new annexe to the UK GDPR, in Schedule 1 to the Bill, that sets out an exhaustive list of processing activities that may be undertaken by data controllers under this new lawful ground. If an activity appears on the list, processing may take place without a person’s consent and without balancing the controller’s interests against the rights and interests of the individual: the so-called legitimate interests balancing test.
The activities in the annexe are all of a public interest nature, for example, processing of data where necessary to prevent crime, safeguarding national security, protecting children, responding to emergencies or promoting democratic engagement. They also include situations where a public body requests a non-public body to share personal data with it to help deliver a public task sanctioned by law.
The clause was introduced as a result of stakeholders’ concerns raised in response to the public consultation Data: A New Direction in 2021. Some informed us that they were worried about the legal consequences of getting the balancing test in Article 6(1)(f) wrong. Others said that undertaking the balancing test can lead to delays in some important processing activities taking place.
As noble Lords will be aware, many data controllers have important roles in supporting activities that have a public interest nature. It is vital that data is shared without delay where necessary in areas such as safeguarding, prevention of crime and responding to emergencies. Of course, controllers who share data while relying on this new lawful ground would still have to comply with wider requirements of data protection legislation where relevant, such as data protection principles which ensure that the data is used fairly, lawfully and transparently, and is collected and used for specific purposes.
In addition to creating a new lawful ground of recognised legitimate interests, Clause 5 also clarifies the types of processing activities that may be permitted under the existing legitimate interests lawful ground under Article 6(1)(f) of the UK GDPR. Even if a processing activity does not appear on the new list of recognised legitimate interests, data controllers may still have grounds for processing people’s data without consent if their interests in processing the data are not outweighed by the rights and freedoms that people have in relation to privacy. Clause 5(9) and (10) make it clear that this might be the case in relation to many common commercial activities, such as intragroup transfers.
My Lords, may I just revisit that with the Minister? I fear that he is going to move on to another subject. The Delegated Powers Committee said that it thought that the Government had not provided strong enough reasons for needing this power. The public interest list being proposed, which the Minister outlined, is quite broad, so it is hard to imagine the Government wanting something not already listed. I therefore return to what the committee said. Normally, noble Lords like to listen to recommendations from such committees. There is no strong reason for needing that extra power, so, to push back a little on the Minister, why, specifically, is it felt necessary? If it were a public safety interest, or one of the other examples he gave, it seems to me that that would come under the existing list of public interests.
Indeed. Needless to say, we take the recommendations of the DPRRC very seriously, as they deserve. However, because this is an exhaustive list, and because the technologies and practices around data are likely to evolve very rapidly in ways we are unable currently to predict, it is important to retain as a safety measure the ability to update that list. That is the position the Government are coming from. We will obviously continue to consider the DPRRC’s recommendations, but that has to come with a certain amount of adaptiveness as we go. Any addition to the list would of course be subject to parliamentary debate, via the affirmative resolution procedure, as well as the safeguards listed in the provision itself.
Clause 50 ensures that the ICO and any other interested persons should be consulted before making regulations.
Amendments 15, 16, 17 and 18 would amend the part of Clause 5 that is concerned with the types of activities that might be carried out under the current legitimate interest lawful ground, under Article 6(1)(f). Amendment 15 would prevent direct marketing organisations relying on the legitimate interest lawful ground under Article 6(1)(f) if the personal data being processed related to children. However, the age and vulnerability in general of data subjects is already an important factor for direct marketing organisations when considering whether the processing is justified. The ICO already provides specific guidance for controllers carrying out this balancing test in relation to children’s data. The fact that a data subject is a child, and the age of the child in question, will still be relevant factors to take into account in this process. For these reasons, the Government consider this amendment unnecessary.
My Lords, am I to take it from that that none of the changes currently in the Bill will expose children on a routine basis to direct marketing?
As is the case today and will be going forward, direct marketing organisations will be required to perform the balancing test; and as in the ICO guidance today and, no doubt, going forward—
I am sorry if I am a little confused—I may well be—but the balancing test that is no longer going to be there allows a certain level of processing, which was the subject of the first amendment. The suggestion now is that children will be protected by a balancing test. I would love to know where that balancing test exists.
The balancing test remains there for legitimate interests, under Article 6(1)(f).
Amendment 16 seeks to prevent organisations that undertake third-party marketing relying on the legitimate interest lawful ground under Article 6(1)(f) of the UK GDPR. As I have set out, organisations can rely on that ground for processing personal data without consent when they are satisfied that they have a legitimate interest to do so and that their commercial interests are not outweighed by the rights and interests of data subjects.
Clause 5(4) inserts in Article 6 new paragraph (9), which provides some illustrative examples of activities that may constitute legitimate interests, including direct marketing activities, but it does not mean that they will necessarily be able to process personal data for that purpose. Organisations will need to assess on a case-by-case basis where the balance of interest lies. If the impact on the individual’s privacy is too great, they will not be able to rely on the legitimate interest lawful ground. I should emphasise that this is not a new concept created by this Bill. Indeed, the provisions inserted by Clause 5(4) are drawn directly from the recitals to the UK GDPR, as incorporated from the EU GDPR.
I recognise that direct marketing can be a sensitive—indeed, disagreeable—issue for some, but direct marketing information can be very important for businesses as well as individuals and can be dealt with in a way that respects people’s privacy. The provisions in this Bill do not change the fact that direct marketing activities must be compliant with the data protection and privacy legislation and continue to respect the data subject’s absolute right to opt out of receiving direct marketing communications.
Amendment 17 would make sure that the processing of employee data for “internal administrative purposes” is subject to heightened safeguards, particularly when it relates to health. I understand that this amendment relates to representations made by the National AIDS Trust concerning the level of protection afforded to employees’ health data. We agree that the protection of people’s HIV status is vital and that it is right that it is subject to extra protection, as is the case for all health data and special category data. We have committed to further engagement and to working with the National AIDS Trust to explore solutions in order to prevent data breaches of people’s HIV status, which we feel is best achieved through non-legislative means given the continued high data protection standards afforded by our existing legislation. As such, I hope that the noble Lord, Lord Clement-Jones, will agree not to press this amendment.
Amendment 18 seeks to allow businesses more confidently to rely on the existing legitimate interest lawful ground for the transmission of personal data within a group of businesses affiliated by contract for internal administrative purposes. In Clause 5, the list of activities in proposed new paragraphs (9) and (10) are intended to be illustrative of the types of activities that may be legitimate interests for the purposes of Article 6(1)(f). They are focused on processing activities that are currently listed in the recitals to the EU GDPR but are simply examples. Many other processing activities may be legitimate interests for the purposes of Article 6(1)(f) of the UK GDPR. It is possible that the transmission of personal data for internal administrative purposes within a group affiliated by contract may constitute a legitimate interest, as may many other commercial activities. It would be for the controller to determine this on a case-by-case basis after carrying out a balancing test to assess the impact on the individual.
Finally, I turn to the clause stand part debate that seeks to remove Clause 7 from the Bill. I am grateful to the noble Lord, Lord Clement-Jones, for this amendment because it allows me to explain why this clause is important to the success of the UK-US data access agreement. As noble Lords will know, that agreement helps the law enforcement agencies in both countries tackle crime. Under the UK GDPR, data controllers can process personal data without consent on public interest grounds if the basis for the processing is set out in domestic law. Clause 7 makes it clear that the processing of personal data can also be carried out on public interest grounds if the basis for the processing is set out in a relevant international treaty such as the UK-US data access agreement.
The agreement permits telecommunications operators in the UK to disclose data about serious crimes with law enforcement agencies in the US, and vice versa. The DAA has been operational since October 2022 and disclosures made by UK organisations under it are already lawful under the UK GDPR. Recent ICO guidance confirms this, but the Government want to remove any doubt in the minds of UK data controllers that disclosures under the DAA are permitted by the UK GDPR. Clause 7 makes it absolutely clear to telecoms operators in the UK that disclosures under the DAA can be made in reliance on the UK GDPR’s public tasks processing grounds; the clause therefore contributes to the continued, effective functioning of the agreement and to keeping the public in both the UK and the US safe.
For these reasons, I hope that the noble Lord, Lord Clement-Jones, will agree to withdraw his amendment.
My first reaction is “Phew”, my Lords. We are all having to keep to time limits now. The Minister did an admirable job within his limit.
I wholeheartedly support what the noble Baronesses, Lady Kidron and Lady Harding, said about Amendments 13 and 15 and what the noble Baroness, Lady Jones, said about her Amendment 12. I do not believe that we have yet got to the bottom of children’s data protection; there is still quite some way to go. It would be really helpful if the Minister could bring together the elements of children’s data about which he is trying to reassure us and write to us saying exactly what needs to be done, particularly in terms of direct marketing directed towards children. That is a real concern.
My Lords, it is a pleasure to follow the noble Baronesses, Lady Harding and Lady Bennett, after the excellent introduction to the amendments in this group by the noble Baroness, Lady Jones. The noble Baroness, Lady Harding, used the word “trust”, and this is another example of a potential hidden agenda in the Bill. Again, it is destructive of any public trust in the way their data is curated. This is a particularly egregious example, without, fundamentally, any explanation. Sir John Whittingdale said that a future Government
“may want to encourage democratic engagement in the run up to an election by temporarily ‘switching off’ some of the direct marketing rules”.—[Official Report, Commons, 29/11/2023; col. 885.]
Nothing to see here—all very innocuous; but, as we know, in the past the ICO has been concerned about even the current rules on the use of data by political parties. It seems to me that, without being too Pollyannaish about this, we should be setting an example in the way we use the public’s data for campaigning. The ICO, understandably, is quoted as saying during the public consultation on the Bill that this is
“an area in which there are significant potential risks to people if any future policy is not implemented very carefully”.
That seems an understatement, but that is how regulators talk. It is entirely right to be concerned about these provisions.
Of course, they are hugely problematic, but they are particularly problematic given that it is envisaged that young people aged 14 and older should be able to be targeted by political parties when they cannot even vote, as we have heard. This would appear to contravene one of the basic principles of data protection law: that you should not process more personal data than you need for your purposes. If an individual cannot vote, it is hard to see how targeting them with material relating to an election is a proportionate interference with their privacy rights, particularly when they are a child. The question is, should we be soliciting support from 14 to 17 year-olds during elections when they do not have votes? Why do the rules need changing so that people can be targeted online without having consented? One of the consequences of these changes would be to allow a Government to switch off—the words used by Sir John Whittingdale—direct marketing rules in the run-up to an election, allowing candidates and parties to rely on “soft” opt-in to process data and make other changes without scrutiny.
Exactly as the noble Baroness, Lady Jones, said, respondents to the original consultation on the Bill wanted political communications to be covered by existing rules on direct marketing. Responses were very mixed on the soft opt-in, and there were worries that people might be encouraged to part with more of their personal data. More broadly, why are the Government changing the rules on democratic engagement if they say they will not use these powers? What assessment have they made of the impact of the use of the powers? Why are the powers not being overseen by the Electoral Commission? If anybody is going to have the power to introduce the ability to market directly to voters, it should be the Electoral Commission.
All this smacks of taking advantage of financial asymmetry. We talked about competition asymmetry with big tech when we debated the digital markets Bill; similarly, this seems a rather sneaky way of taking advantage of the financial resources one party might have versus others. It would allow it to do things other parties cannot, because it has granted itself permission to do that. The provisions should not be in the hands of any Secretary of State or governing party; if anything, they should be in entirely independent hands; but, even then, they are undesirable.
My Lords, I thank the noble Baroness, Lady Jones, for tabling her amendments. Amendment 19 would remove processing which is necessary for the purposes of democratic engagement from the list of recognised legitimate interests. It is essential in a healthy democracy that registered political parties, elected representatives and permitted participants in referendums can engage freely with the electorate without being impeded unnecessarily by data protection legislation.
The provisions in the Bill will mean that these individuals and organisations do not have to carry out legitimate interest assessments or look for a separate legal basis. They will, however, still need to comply with other requirements of data protection legislation, such as the data protection principles and the requirement for processing to be necessary.
On the question posed by the noble Baroness about the term “democratic engagement”, it is intended to cover a wide range of political activities inside and outside election periods. These include but are not limited to democratic representation; communicating with electors and interested parties; surveying and opinion gathering; campaigning activities; activities to increase voter turnout; supporting the work of elected representatives, prospective candidates and official candidates; and fundraising to support any of these activities. This is reflected in the drafting, which incorporates these concepts in the definition of democratic engagement and democratic engagement activities.
The ICO already has guidance on the use of personal data by political parties for campaigning purposes, which the Government anticipate it will update to reflect the changes in the Bill. We will of course work with the ICO to make sure it is familiar with our plans for commencement and that it does not benefit any party over another.
On the point made about the appropriate age for the provisions, in some parts of the UK, such as Scotland, the voting age is 16 for some elections, and children can join the electoral register as attainers at 14; the age of 14 therefore reflects the variations in voting age across the nation. An attainer is someone who is registered to vote in advance of their being able to do so, to allow them to be on the electoral roll as soon as they turn the required age. Children aged 14 and over are often politically engaged and are approaching voting age. The Government consider it important that political parties and elected representatives can engage freely with this age group—
I am interested in what the Minister says about the age of attainers. Surely it would be possible to remove attainers from those who could be subject to direct marketing. Given how young attainers could be, it would protect them from the unwarranted attentions of campaigning parties and so on. I do not see that as a great difficulty.
Indeed. It is certainly worth looking at, but I remind noble Lords that such communications have to be necessary, and the test of their being necessary for someone of that age is obviously more stringent.
But what is the test of necessity at that age?
The processor has to determine whether it is necessary to the desired democratic engagement outcome to communicate with someone at that age. But I take the point: for the vast majority of democratic engagement communications, 14 would be far too young to make that a worthwhile or necessary activity.
As I recall, the ages are on the electoral register.
I am not aware one way or the other, but I will happily look into that to see what further safeguards we can add so that we are not bombarding people who are too young with this material.
May I make a suggestion to my noble friend the Minister? It might be worth asking the legal people to get the right wording, but if there are different ages at which people can vote in different parts of the United Kingdom, surely it would be easier just to relate it to the age at which they are able to vote in those elections. That would address a lot of the concerns that many noble Lords are expressing here today.
I agree with the noble Baroness, but with one rider. We will keep coming back to the need for children to have a higher level of data protection than adults, and this is but one of many examples we will debate. However, I agree with her underlying point. The reason why I support removing both these clauses is the hubris of believing that you will engage the electorate by bombarding them with things they did not ask to receive.
A fair number of points were made there. I will look at ages under 16 and see what further steps, in addition to being necessary and proportionate, we can think about to provide some reassurance. Guidance would need to be in effect before any of this is acted on by any of the political parties. I and my fellow Ministers will continue to work with the ICO—
I am sorry to press the Minister, but does the Bill state that guidance will be in place before this comes into effect?
I am not sure whether it is written in the Bill. I will check, but the Bill would not function without the existence of the guidance.
I am sorry to drag this out but, on the guidance, can we be assured that the Minister will involve the Electoral Commission? It has a great deal of experience here; in fact, it has opined in the past on votes for younger cohorts of the population. It seems highly relevant to seek out its experience and the benefits of that.
I would of course be very happy to continue to engage with the Electoral Commission.
We will continue to work with the ICO to make sure that it is familiar with the plans for commencement and that its plans for guidance fit into that. In parts of the UK where the voting age is 18 and the age of attainment is 16, it would be more difficult for candidates and parties to show that it was necessary or proportionate to process the personal data of 14 and 15 year-olds in reliance on the new lawful ground. In this context, creating an arbitrary distinction between children at or approaching voting age and adults may not be appropriate; in particular, many teenagers approaching voting age may be more politically engaged than some adults. These measures will give parties and candidates a clear lawful ground for engaging them in the process. Accepting this amendment would remove the benefits of greater ease of identification of a lawful ground for processing by elected representatives, candidates and registered political parties, which is designed to improve engagement with the electorate. I therefore hope that the noble Baroness, Lady Jones, will withdraw her amendment.
I now come to the clause stand part notice that would remove Clause 114, which gives the Secretary of State a power to make exceptions to the direct marketing rules for communications sent for the purposes of democratic engagement. As Clause 115 defines terms for the purposes of Clause 114, the noble Baroness, Lady Jones, is also seeking for that clause to be removed. Under the current law, many of the rules applying to electronic communications sent for commercial marketing apply to messages sent by registered political parties, elected representatives and others for the purposes of democratic engagement. It is conceivable that, after considering the risks and benefits, a future Government might want to treat communications sent for the purposes of democratic engagement differently from commercial marketing. For example, in areas where voter turnout is particularly low or there is a need to increase engagement with the electoral process, a future Government might decide that the direct marketing rules should be modified. This clause stand part notice would remove that option.
We have incorporated several safeguards that must be met prior to regulations being laid under this clause. They include the Secretary of State having specific regard to the effect the exceptions could have on an individual’s privacy; a requirement to consult the Information Commissioner and other interested parties, as the Secretary of State considers appropriate; and the regulations being subject to parliamentary approval via the affirmative procedure.
For these reasons, I hope that the noble Baroness will agree to withdraw or not press her amendments.
My Lords, I am pleased that I have sparked such a lively debate. When I tabled these amendments, it was only me and the noble Lord, Lord Clement-Jones, so I thought, “This could be a bit sad, really”, but it has not been. Actually, it has been an excellent debate and we have identified some really good issues.
As a number of noble Lords said, the expression “democratic engagement” is weasel words: what is not to like about democratic engagement? We all like it. Only when you drill down into the proposals do you realise the traps that could befall us. As noble Lords and the noble Baroness, Lady Bennett, rightly said, we have to see this in the context of some of the other moves the Government are pursuing in trying to skew the electoral rules in their favour. I am not convinced that this is as saintly as the Government are trying to pretend.
The noble Baroness, Lady Harding, is absolutely right: this is about trust. It is about us setting an example. Of all the things we can do on data protection that we have control over, we could at least show the electorate how things could be done, so that they realise that we, as politicians, understand how precious their data is and that we do not want to misuse it.
I hope we have all knocked on doors, and I must say that I have never had a problem engaging with the electorate, and actually they have never had a problem engaging with us. This is not filling a gap that anybody has identified. We are all out there and finding ways of communicating that, by and large, I would say the electorate finds perfectly acceptable. People talk to us, and they get the briefings through the door. That is what they expect an election campaign to be about. They do not expect, as the noble Baroness, Lady Harding, said, to go to see their MP about one thing and then suddenly find that they are being sent information about something completely different or that assumptions are being made about them which were never the intention when they gave the information in the first place. I just feel that there is something slightly seedy about all this. I am sorry that the Minister did not pick up a little more on our concerns about all this.
There are some practical things that I think it was helpful for us to have talked about, such as the Electoral Commission. I do not think that it has been involved up to now. I would like to know in more detail what its views are on all this. It is also important that we come back to the Information Commissioner and check in more detail what his view is on all this. It would be nice to have guidance, but I do not think that that will be enough to satisfy us in terms of how we proceed with these amendments.
The Minister ultimately has not explained why this has been introduced at this late stage. He is talking about this as though conceivably, in the future, a Government might want to adopt these rules. If that is the case, I respectfully say that we should come back at that time with a proper set of proposals that go right through the democratic process that we have here in Parliament, scrutinise it properly and make a decision then, rather than being bounced into something at a very late stage.
I have to say that I am deeply unhappy at what the Minister has said. I will obviously look at Hansard, but I may well want to return to this.
My Lords, I rise to speak to a series of minor and technical, yet necessary, government amendments which, overall, improve the functionality of the Bill. I hope the Committee will be content if I address them together. Amendments 20, 42, 61 and 63 are minor technical amendments to references to special category data in Clauses 6 and 14. All are intended to clarify that references to special category data mean references to the scope of Article 9(1) of the UK GDPR. They are simply designed to improve the clarity of the drafting.
I turn now to the series of amendments that clarify how time periods within the data protection legal framework are calculated. For the record, these are Amendments 136, 139, 141, 149, 151, 152, 176, 198, 206 to 208, 212 to 214, 216, 217, 253 and 285. Noble Lords will be aware that the data protection legislation sets a number of time periods or deadlines for certain things to happen, such as responding to subject access requests; in other words, at what day, minute or hour the clock starts and stops ticking in relation to a particular procedure. The Data Protection Act 2018 expressly applies the EU-derived rules on how these time periods should be calculated, except in a few instances where it is more appropriate for the UK domestic approach to apply, for example time periods related to parliamentary procedures. I shall refer to these EU-derived rules as the time periods regulation.
In response to the Retained EU Law (Revocation and Reform) Act 2023, we are making it clear that the time periods regulation continues to apply to the UK GDPR and other regulations that form part of the UK’s data protection and privacy framework, for example, the Privacy and Electronic Communications (EC Directive) Regulations 2003. By making such express provision, our aim is to ensure consistency and continuity and to provide certainty for organisations, individuals and the regulator. We have also made some minor changes to existing clauses in the Bill to ensure that application of the time periods regulation achieves the correct effect.
Secondly, Amendment 197 clarifies that the requirement to consult before making regulations that introduce smart data schemes may be satisfied by a consultation before the Bill comes into force. The regulations must also be subject to affirmative parliamentary scrutiny to allow Members of both Houses to scrutinise legislation. This will facilitate the rapid implementation of smart data schemes, so that consumers and businesses can start benefiting as soon as possible. The Government are committed to working closely with business and wider stakeholders in the development of smart data.
Furthermore, Clause 96(3) protects data holders from the levy that may be imposed to meet the expenses of persons and bodies performing functions under smart data regulations. This levy cannot be imposed on data holders that do not appear capable of being directly affected by the exercise of those functions.
Amendment 196 extends that protection to authorised persons and third-party recipients on whom the levy may also be imposed. Customers will not have to pay to access their data, only for the innovative services offered by third parties. We expect that smart data schemes will deliver significant time and cost savings for customers.
The Government are committed to balancing the incentives for businesses to innovate and provide smart data services with ensuring that all customers are empowered through their data use and do not face undue financial barriers or digital exclusion. Any regulations providing for payment of the levy or fees will be subject to consultation and to the affirmative resolution procedure in Parliament.
Amendments 283 and 285 to Schedule 15 confer a general incidental power on the Information Commission. It will have the implied power to do things incidental to or consequential upon the exercise of its functions, for example, to hold land and enter into agreements. This amendment makes those implied powers explicit for the avoidance of doubt and in line with standard practice. It does not give the commission substantive new powers. I beg to move.
My Lords, I know that these amendments were said to be technical amendments, so I thought I would just accept them, but when I saw the wording of Amendment 283 some alarm bells started ringing. It says:
“The Commission may do anything it thinks appropriate for the purposes of, or in connection with, its functions”.
I know that the Minister said that this is stating what the commission is already able to do, but I am concerned whenever I see those words anywhere. They give a blank cheque to any authority or organisation.
Many noble Lords will know that I have previously spoken about the principal-agent theory in politics, in which certain powers are delegated to an agency or regulator, but what accountability does it have? I worry when I see that it “may do anything … appropriate” to fulfil its tasks. I would like some assurance from the Minister that there is a limit to what the information commission can do and some accountability. At a time when many of us are asking who regulates the regulators and when we are looking at some of the arm’s-length bodies—need I mention the Post Office?—there is some real concern about accountability.
I understand the reason for wanting to clarify or formalise what the Minister believes the information commission is doing already, but I worry about this form of words. I would like some reassurance that it is not wide-ranging and that there is some limit and accountability to future Governments. I have seen this sentiment across the House; people are asking who regulates the regulators and to whom are they accountable.
My Lords, I have been through this large group and, apart from my natural suspicion that there might be something dastardly hidden away in it, I am broadly content, but I have a few questions.
On Amendment 20, can the Minister confirm that the new words “further processing” have the same meaning as the reuse of personal data? Can he confirm that Article 5(1)(b) will prohibit this further processing when it is not in line with the original purpose for which the data was collected? How will the data subject know that is the case?
On Amendment 196, to my untutored eye it looks like the regulation-making power is being extended away from the data holder to include authorised persons and third-party recipients. My questions are simple enough: was this an oversight on the part of the original drafters of that clause? Is the amendment an extension of those captured by the effect of the clause? Is it designed to achieve consistency across the Bill? Finally, can I assume that an authorised person or third party would usually be someone acting on behalf of, or as an agent of, the data holder?
I presume that Amendments 198, 212 and 213 are needed because of a glitch in the drafting—similarly with Amendment 206. I can see that Amendments 208, 216 and 217 clarify when time periods begin, but why are the Government seeking to disapply time periods in Amendment 253 when surely some consistency is required?
Finally—I am sure the Minister will be happy about this—I am all in favour of flexibility, but Amendment 283 states that the Information Commissioner has the power to do things to facilitate the exercise of his functions. The noble Lord, Lord Kamall, picked up on this. We need to understand what those limits are. On the face of it, one might say that the amendment is sensible, but it seems rather general and broad in its application. As the noble Lord, Lord Kamall, rightly said, we need to see what the limits of accountability are. This is one of those occasions.
I thank the noble Lords, Lord Kamall and Lord Bassam, for their engagement with this group. On the questions from the noble Lord, Lord Kamall, these are powers that the ICO would already have at common law. As I understand it, it is now best practice with all Bills to put such powers on a statutory footing, and that is what this Bill does. It does not confer substantive new powers but clarifies the powers that the regulator already has. I can also confirm that the ICO was and remains accountable to Parliament.
I am sorry to intervene as I know that noble Lords want to move on to other groups, but the Minister said that the ICO remains accountable to Parliament. Will he clarify how it is accountable to Parliament for the record?
The Information Commissioner is directly accountable to Parliament in that he makes regular appearances in front of Select Committees that scrutinise the regulator’s work, including progress against objectives.
The noble Lord, Lord Bassam, made multiple important and interesting points. I hope he will forgive me if I undertake to write to him about those; there is quite a range of topics to cover. If there are any on which he requires answers right away, he is welcome to intervene.
I want to be helpful to the Minister. I appreciate that these questions are probably irritating but I carefully read through the amendments and aligned them with the Explanatory Notes. I just wanted some clarification to make sure that we are clear on exactly what the Government are trying to do. “Minor and technical” covers a multitude of sins; I know that from my own time as a Minister.
Indeed. I will make absolutely sure that we provide a full answer. By the way, I sincerely thank the noble Lord for taking the time to go through what is perhaps not the most rewarding of reads but is useful none the less.
On the question of the ICO being responsible to Parliament, in the then Online Safety Bill and the digital markets Bill we consistently asked for regulators to be directly responsible to Parliament. If that is something the Government believe they are, we would like to see an expression of it.
I would be happy to provide such an expression. I will be astonished if that is not the subject of a later group of amendments. I have not yet prepared for that group, I am afraid, but yes, that is the intention.
My Lords, it is a pleasure to follow the noble Lord, Lord Sikka. He raised even more questions about Clause 9 than I ever dreamed of. He has illustrated the real issues behind the clause and why it is so important to debate its standing part, because, in our view, it should certainly be removed from the Bill. It would seriously limit people’s ability to access information about how their personal data is collected and used. We are back to the dilution of data subject rights, within which the rights of data subject access are, of course, vital. This includes limiting access to information about automated decision-making processes to which people are subject.
A data subject is someone who can be identified directly or indirectly by personal data, such as a name, an ID number, location data, or information relating to their physical, economic, cultural or social identity. Under existing law, data subjects have a right to request confirmation of whether their personal data is being processed by a controller, to access that personal data and to obtain information about how it is being processed. The noble Lord, Lord Sikka, pointed out that there is ample precedent for how the controller can refuse a request from a data subject only if it is manifestly unfounded or excessive. The meaning of that phrase is well established.
There are three main ways in which Clause 9 limits people’s ability to access information about how their personal data is being collected and used. First, it would lower the threshold for refusing a request from “manifestly unfounded or excessive” to “vexatious or excessive”. This is an inappropriately low threshold, given the nature of a data subject access request—namely, a request by an individual for their own data.
Secondly, Clause 9 would insert a new mandatory list of considerations for deciding whether the request is vexatious or excessive. This includes vague considerations, such as
“the relationship between the person making the request (the ‘sender’) and the person receiving it (the ‘recipient’)”.
The very fact that the recipient holds data relating to the sender means that there is already some form of relationship between them.
Thirdly, the weakening of an individual’s right to obtain information about how their data is being collected, used or shared is particularly troubling given the simultaneous effect of the provisions in Clause 10. Those provisions mean that data subjects are less likely to be informed about how their data is being used for additional purposes other than those for which it was originally collected, in cases where the additional purposes are for scientific or historical research, archiving in the public interest or statistical purposes. Together, the two clauses mean that an individual is less likely to be proactively told how their data is being used, while it becomes harder for them to access information about their data on request.
In the Public Bill Committee in the House of Commons, the Minister, Sir John Whittingdale, claimed that:
“The new parameters are not intended to be reasons for refusal”,
but rather to give
“greater clarity than there has previously been”.—[Official Report, Commons, Data Protection and Digital Information Bill Committee, 16/5/23; cols. 113-14.]
But Dr Jeni Tennison of Connected by Data pointed out in her oral evidence to the committee that the impact assessment for the Bill indicates that a significant proportion of the predicted savings would come from lighter burdens on organisations dealing with subject access requests as a result of this clause. This suggests that, while the Government claim that this clause is a clarification, it is intended to weaken obligations on controllers and, correspondingly, the rights of data subjects. Is that where the Secretary of State’s £10 billion of benefit from this Bill comes from? On these grounds alone, Clause 9 should be removed from the Bill.
We also oppose the question that Clause 12 stand part of the Bill. Clause 12 provides that, in responding to subject access requests, controllers are required only to undertake a
“reasonable and proportionate search for the personal data and other information”.
This clause also appears designed to weaken the right of subject access and will lead to confusion for organisations about what constitutes a reasonable and proportionate search in a particular circumstance. The right of subject access is central to individuals’ fundamental rights and freedoms, because it is a gateway to exercising other rights, either within the data subject rights regime or in relation to other legal rights, such as the rights to equality and non-discrimination. Again, the lowering of rights compared with the EU creates obvious risks, and this is a continuing theme of data adequacy.
Clause 12 does not provide a definition for reasonable and proportionate searches, but when introducing the amendment, Sir John Whittingdale suggested that a search for information may become unreasonable or disproportionate
“when the information is of low importance or of low relevance to the data subject”.—[Official Report, Commons, 29/11/23; col. 873.]
Those considerations diverge from those provided in the Information Commissioner’s guidance on the rights of access, which states that when determining whether searches may be unreasonable or disproportionate, the data controller must consider the circumstances of the request, any difficulties involved in finding the information and the fundamental nature of the right of access.
We also continue to be concerned about the impact assessment for the Bill and the Government’s claims that the new provisions in relation to subject access requests are for clarification only. Again, Clause 12 appears to have the same impact as Clause 9 in the kinds of savings that the Government seem to imagine will emerge from the lowering of subject access rights. This is a clear dilution of subject access rights, and this clause should also be removed from the Bill.
We always allow for belt and braces and if our urging does not lead to the Minister agreeing to remove Clauses 9 and 12, at the very least we should have the new provisions set out either in Amendment 26, in the name of the noble Baroness, Lady Jones of Whitchurch, or in Amendment 25, which proposes that a data controller who refuses a subject access request must give reasons for their refusal and tell the subject about their right to seek a remedy. That is absolutely the bare minimum, but I would far prefer to see the deletion of Clauses 9 and 12 from the Bill.
As ever, I thank noble Lords for raising and speaking to these amendments. I start with the stand part notices on Clauses 9 and 36, introduced by the noble Lord, Lord Clement-Jones. Clauses 9 and 36 introduce the new threshold under which a controller may refuse, or charge a reasonable fee for, a request that is “vexatious or excessive”. Clause 36 also clarifies that the Information Commissioner may charge a fee for dealing with, or refuse to deal with, a vexatious or excessive request made by any person, not just data subjects, providing necessary certainty.
I apologise for intervening, but the Minister referred to resources. By that, he means the resources for the controller but, as I said earlier, there is no consideration of what the social cost may be. If this Bill had already become law, how would the victims of the Post Office scandal have been able to secure any information? Under this Bill, the threshold for providing information will be much lower than it is under the current legislation. Can the Minister say something about how the controllers will take social cost into account or how the Government have taken that into account?
First, on the point made by the noble Lord, Lord Bassam, it is not to be argumentative—I am sure that there is much discussion to be had—but the intention is absolutely not to lower the standard for a well-intended request.
Sadly, a number of requests are made that are not well intended but cynical, with an aim to disrupt. I can give a few examples. For instance, some requests are deliberately made with minimal time between them. Some are made to circumvent the process of legal disclosure in a trial. Some are made for other reasons designed to disrupt an organisation. The intent of using “vexatious” is not in any way to reduce well-founded, or even partially well-founded, attempts to secure information; it is to reduce less desirable, more cynical attempts to work in this way.
But the two terms have a different legal meaning, surely.
The actual application of the terms will be set out in guidance by the ICO but the intention is to filter out the more disruptive and cynical ones. Designing these words is never an easy thing but there has been considerable consultation on this in order to achieve that intention.
My Lords—sorry; it may be that the Minister was just about to answer my question. I will let him do so.
I will have to go back to the impact assessment but I would be astonished if that was a significant part of the savings promised. By the way, the £10.6 billion—or whatever it is—in savings was given a green rating by the body that assesses these things; its name eludes me. It is a robust calculation. I will check and write to the noble Lord, but I do not believe that a significant part of that calculation leans on the difference between “vexatious” and “manifestly unfounded”.
It would be very useful to have the Minister respond on that but, of course, as far as the impact assessment is concerned, a lot of this depends on the Government’s own estimates of what this Bill will produce—some of which are somewhat optimistic.
The noble Baroness, Lady Jones, has given me an idea: if an impact assessment has been made, clause by clause, it would be extremely interesting to know just where the Government believe the golden goose is.
I am not quite sure what is being requested because the impact assessment has been not only made but published.
I see—so noble Lords would like an analysis of the different components of the impact assessment. It has been green-rated by the independent Regulatory Policy Committee. I have just been informed by the Box that the savings from these reforms to the wording of SARs are valued at less than 1% of the benefit of more than £10 billion that this Bill will bring.
That begs the question of where on earth the rest is coming from.
Which I will be delighted to answer. With this interesting exchange, I have lost in my mind the specific questions that the noble Lord, Lord Sikka, asked but I am coming on to some of his other ones; if I do not give satisfactory answers, no doubt he will intervene and ask again.
I appreciate the further comments made by the noble Lord, Lord Sikka, about the Freedom of Information Act. I hope he will be relieved to know that this Bill does nothing to amend that Act. On his accounting questions, he will be aware that most SARs are made by private individuals to private companies. The Government are therefore not involved in that process and do not collect the kind of information that he described.
Following the DPDI Bill, the Government will work with the ICO to update guidance on subject access requests. Guidance plays an important role in clarifying what a controller should consider when relying on the new “vexatious or excessive” provision. The Government are also exploring whether a code of practice on subject access requests can best address the needs of controllers and data subjects.
On whether Clause 12 should stand part of the Bill, Clause 12 is only putting on a statutory footing what has already been established—
My apologies. The Minister just said that the Government do not collect the data. Therefore, what is the basis for changing the threshold? No data, no reasonable case.
The Government do not collect details of private interactions between those raising SARs and the companies they raise them with. The business case is based on extensive consultation—
I hope that the Government have some data about government departments and the public bodies over which they have influence. Can he provide us with a glimpse of how many requests are received, how many are rejected at the outset, how many go to the commissioners, what the cost is and how the cost is computed? At the moment, it sounds like the Government want to lower the threshold without any justification.
As I say, I do not accept that the threshold is being lowered. On the other hand, I will undertake to find out what information can reasonably be provided. Again, as I said, the independent Regulatory Policy Committee gave the business case a green rating; that is a high standard and gives credibility to the business case calculations, which I will share.
The reforms keep reasonable requests free of charge and instead seek to ensure that controllers can refuse or charge a reasonable fee for requests that are “vexatious or excessive”, which can consume a significant amount of time and resources. However, the scope of the current provision is unclear and, as I said, there are a variety of circumstances where controllers would benefit from being able confidently to refuse or charge the fee.
The Minister used the phrase “reasonable fee”. Can he provide some clues on that, especially for the people who may request information? We have around 17.8 million individuals living on less than £12,570. So, from what perspective is the fee reasonable and how is it determined?
“Reasonable” would be set out in the guidance to be created by the ICO but it would need to reflect the costs and affordability. The right of access remains of paramount importance in the data protection framework.
Lastly, as I said before on EU data adequacy, the Government maintain an ongoing dialogue with the EU and believe that our reforms are compatible with maintaining our data adequacy decisions.
For the reasons I have set out, I am not able to accept these amendments. I hope that noble Lords will therefore agree to withdraw or not press them.
My Lords, I can also be relatively brief. I thank all noble Lords who have spoken and the noble Baroness, Lady Harding, and the noble Lord, Lord Clement-Jones, for their amendments, to many of which I have added my name.
At the heart of this debate is what constitutes the disproportionate effort or impossibility exemption from providing information to individuals when the data is not collected directly from data subjects. Amendments 29 to 33 provide further clarity on how exemptions on the grounds of disproportionate effort should be interpreted—for example, by taking into account whether there would be a limited impact on individuals, whether they would be caused any distress, what the exemptions were in the first place and whether the information had been made publicly available by a public body. All these provide some helpful context, which I hope the Minister will take on board.
I have also added my name to Amendments 27 and 28 from the noble Baroness, Lady Harding. They address the particular concerns about those using the open electoral register for direct marketing purposes. As the noble Baroness explained, the need for this amendment arises from the legal ruling that companies using the OER must first notify individuals at their postal addresses whenever their data is being used. As has been said, given that individuals already have an opt-out when they register on the electoral roll, it would seem unnecessary and impractical for companies using the register to follow up with individuals each time they want to access their data. These amendments seek to close that loophole and return the arrangements back to the previous incarnation, which seemed to work well.
All the amendments provide useful forms of words but, as the noble Baroness, Lady Harding, said, if the wording is not quite right, we hope that the Minister will help us to craft something that is right and that solves the problem. I hope that he agrees that there is a useful job of work to be done on this and that he provides some guidance on how to go about it.
I thank my noble friend Lady Harding for moving this important amendment. I also thank the cosignatories—the noble Lords, Lord Clement-Jones and Lord Black, and the noble Baroness, Lady Jones. As per my noble friend’s request, I acknowledge the importance of this measure and the difficulty of judging it quite right. It is a difficult balance and I will do my best to provide some reassurance, but I welcomed hearing the wise words of all those who spoke.
I turn first to the clarifying Amendments 27 and 32. I reassure my noble friend Lady Harding that, in my view, neither is necessary. Clause 11 amends the drafting of the list of cases when the exemption under Article 14(5) applies but the list closes with “or”, which makes it clear that you need to meet only one of the criteria listed in paragraph (5) to be exempt from the transparency requirements.
I turn now to Amendments 28 to 34, which collectively aim to expand the grounds of disproportionate effort to exempt controllers from providing certain information to individuals. The Government support the use of public data sources, such as the OER, which may be helpful for innovation and may have economic benefits. Sometimes, providing this information is simply not possible or is disproportionate. Existing exemptions apply when the data subject already has the information or in cases where personal data has been obtained from someone other than the data subject and it would be impossible to provide the information or disproportionate effort would be required to do so.
We must strike the right balance between supporting the use of these datasets and ensuring transparency for data subjects. We also want to be careful about protecting the integrity of the electoral register, open or closed, to ensure that it is used within the data subject’s reasonable expectations. The exemptions that apply when the data subject already has the information or when there would be a disproportionate effort in providing the information must be assessed on a case-by-case basis, particularly if personal data from public registers is to be combined with other sources of personal data to build a profile for direct marketing.
These amendments may infringe on transparency—a key principle in the data protection framework. The right to receive information about what is happening to your data is important for exercising other rights, such as the right to object. This could be seen as going beyond what individuals might expect to happen to their data.
The Government are not currently convinced that these amendments would be sufficient to prevent negative consequences to data subject rights and confidence in the open electoral register and other public registers, given the combination of data from various sources to build a profile—that was the subject of the tribunal case being referenced. Furthermore, the Government’s view is that there is no need to amend Article 14(6) explicitly to include the “reasonable expectation of the data subjects” as the drafting already includes reference to “appropriate safeguards”. This, in conjunction with the fairness principle, means that data controllers are already required to take this into account when applying the disproportionate effort exemption.
The above notwithstanding, the Government understand that the ICO may explore this question as part of its work on guidance in the future. That seems a better way of addressing this issue in the first instance, ensuring the right balance between the use of the open electoral register and the rights of data subjects. We will continue to work closely with the relevant stakeholders involved and monitor the situation.
I wonder whether I heard my noble friend correctly. He said “may”, “could” and “not currently convinced” several times, but, for the companies concerned, there is a very real, near and present deadline. How is my noble friend the Minister suggesting that deadline should be considered?
On the first point, I used the words carefully because the Government cannot instruct the ICO specifically on how to act in any of these cases. The question about the May deadline is important. With the best will in the world, none of the provisions in the Bill are likely to be in effect by the time of that deadline in any case. That being the case, I would feel slightly uneasy about advising the ICO on how to act.
My Lords, I am not quite getting from the Minister whether he has an understanding of and sympathy with the case that is being made or whether he is standing on ceremony on its legalities. Is he saying, “No, we think that would be going too far”, or that there is a good case and that guidance or some action by the ICO would be more appropriate? I do not get the feeling that somebody has made a decision about the policy on this. It may be that conversations with the Minister between Committee and Report would be useful, and it may be early days yet until he hears the arguments made in Committee; I do not know, but it would be useful to get an indication from him.
Yes. I repeat that I very much recognise the seriousness of the case. There is a balance to be drawn here. In my view, the best way to identify the most appropriate balancing point is to continue to work closely with the ICO, because I strongly suspect that, at least at this stage, it may be very difficult to draw a legislative dividing line that balances the conflicting needs. That said, I am happy to continue to engage with noble Lords on this really important issue between Committee and Report, and I commit to doing so.
On the question of whether Clause 11 should stand part of the Bill, Clause 11 extends the existing disproportionate effort exemption to cases where the controller collected the personal data directly from the data subject and intends to carry out further processing for research purposes, subject to the research safeguards outlined in Clause 26. This exemption is important to ensure that life-saving research can continue unimpeded.
Research holds a privileged position in the data protection framework because, by its nature, it is viewed as generally being in the public interest. The framework has various exemptions in place to facilitate and encourage research in the UK. During the consultation, we were informed of various longitudinal studies, such as those into degenerative neurological conditions, where it is impossible or nearly impossible to recontact data subjects. To ensure that this vital research can continue unimpeded, Clause 11 provides a limited exemption that applies only to researchers who are complying with the safeguards set out in Clause 26.
The noble Lord, Lord Clement-Jones, raised concerns that Clause 11 would allow unfair processing. I assure him that this is not the case, as any processing that uses the disproportionate effort exemption in Article 13 must comply with the overarching data protection principles, including lawfulness, fairness and transparency, so that even if data controllers rely on this exemption they should consider other ways to make the processing they undertake as fair and transparent as possible.
Finally, returning to EU data adequacy, the Government recognise its importance and, as I said earlier, are confident that the proposals in Clause 11 are complemented by robust safeguards, which reinforces our view that they are compatible with EU adequacy. For the reasons that I have set out, I am unable to accept these amendments, and I hope that noble Lords will not press them.
My Lords, I am not quite sure that I understand where my noble friend the Minister is on this issue. The noble Lord, Lord Clement-Jones, summed it up well in his recent intervention. I will try to take at face value my noble friend’s assurances that he is happy to continue to engage with us on these issues, but I worry that he sees this as two sides of an issue—I hear from him that there may be some issues and there could be some problems—whereas we on all sides of the Committee have set out a clear black and white problem. I do not think they are the same thing.
I appreciate that the wording might create some unintended consequences, but I have not really understood what my noble friend’s real concerns are, so we will need to come back to this on Report. If anything, this debate has made it even clearer to me that it is worth pushing for clarity on this. I look forward to ongoing discussions with a cross-section of noble Lords, my noble friend and the ICO to see if we can find a way through to resolve the very real issues that we have identified today. With that, and with thanks to all who have spoken in this debate, I beg leave to withdraw my amendment.
As ever, I thank the noble Baroness, Lady Jones, and the noble Lord, Lord Clement-Jones, for their detailed consideration of Clause 14, and all other noble Lords who spoke so well. I carefully note the references to the DWP’s measure on fraud and error. For now, I reassure noble Lords that a human will always be involved in all decision-making relating to that measure, but I note that this Committee will have a further debate specifically on that measure later.
The Government recognise the importance of solely automated decision-making to the UK’s future success and productivity. These reforms ensure that it can be responsibly implemented, while ensuring that any such decisions with legal or similarly significant effects have the appropriate safeguards in place, including the rights to request a review of the decision and to have that review carried out by a human. These reforms clarify and simplify the rules related to solely automated decision-making without watering down any of the protections for data subjects or the fundamental data protection principles. In doing so, they will provide confidence to organisations looking to use these technologies in a responsible way while driving economic growth and innovation.
The Government also recognise that AI presents huge opportunities for the public sector. It is important that AI is used responsibly and transparently in the public sector; we are already taking steps to build trust and transparency. Following a successful pilot, we are making the Algorithmic Transparency Reporting Standard—the ATRS—a requirement for all government departments, with plans to expand this across the broader public sector over time. This will ensure that there is a standardised way for government departments proactively to publish information about how and why they are using algorithms in their decision-making. In addition, the Central Digital and Data Office—the CDDO—has already published guidance on the procurement and use of generative AI for the UK Government and, later this year, DSIT will launch the AI management essentials scheme, setting a minimum good practice standard for companies selling AI products and services.
My Lords, could I just interrupt the Minister? It may be that he can get an answer from the Box to my question. One intriguing aspect is that, as the Minister said, the pledge is to bring the algorithmic recording standard into each government department and there will be an obligation to use that standard. However, what compliance mechanism will there be to ensure that that is happening? Does the accountable Permanent Secretary have a duty to make sure that that is embedded in the department? Who has the responsibility for that?
That is a fair question. I must confess that I do not know the answer. There will be mechanisms in place, department by department, I imagine, but one would also need to report on it across government. Either it will magically appear in my answer or I will write to the Committee.
The CDDO has already published guidance on the procurement and use of generative AI for the Government. We will consult on introducing this as a mandatory requirement for public sector procurement, using purchasing power to drive responsible innovation in the broader economy.
I turn to the amendments in relation to meaningful involvement. I will first take together Amendments 36 and 37, which aim to clarify that the safeguards mentioned under Clause 14 are applicable to profiling operations. New Article 22A(2) already clearly sets out that, in cases where profiling activity has formed part of the decision-making process, controllers have to consider the extent to which a decision about an individual has been taken by means of profiling when establishing whether human involvement has been meaningful. Clause 14 makes clear that a solely automated significant decision is one without meaningful human involvement and that, in these cases, controllers are required to provide the safeguards in new Article 22C. As such, we do not believe that these amendments are necessary; I therefore ask the noble Baroness, Lady Jones, not to press them.
Turning to Amendment 38, the Government are confident that the existing reference to “data subject” already captures the intent of this amendment. The existing definition of “personal data” makes it clear that a data subject is a person who can be identified, directly or indirectly. As such, we do not believe that this amendment is necessary; I ask the noble Lord, Lord Clement-Jones, whether he would be willing not to press it.
Amendments 38A and 40 seek to clarify that, for human involvement to be considered meaningful, the review must be carried out by a competent person. We feel that these amendments are unnecessary as meaningful human involvement may vary depending on the use case and context. The reformed clause already introduces a power for the Secretary of State to provide legal clarity on what is or is not to be taken as meaningful human involvement. This power is subject to the affirmative procedure in Parliament and allows the provision to be future-proofed in the wake of technological advances. As such, I ask the noble Baronesses, Lady Jones and Lady Bennett, not to press their amendments.
I am not sure I agree with that characterisation. The ATRS is a relatively new development. It needs time to bed in and needs to be bedded in on an agile basis in order to ensure not only quality but speed of implementation. That said, I ask the noble Lord to withdraw his amendment.
The Minister has taken us through what Clause 14 does and rebutted the need for anything other than “solely”. He has gone through the sensitive data and the special category data aspects, and so on, but is he reiterating his view that this clause is purely for clarification; or is he saying that it allows greater use of automated decision-making, in particular in public services, so that greater efficiencies can be found and therefore it is freeing up the public sector at the expense of the rights of the individual? Where does he sit in all this?
As I said, the intent of the Government is: yes to more automated data processing to take advantage of emerging technologies, but also yes to maintaining appropriate safeguards. The safeguards in the present system consist—if I may characterise it in a slightly blunt way—of providing quite a lot of uncertainty, so that people do not take the decision to positively embrace the technology in a safe way. By bringing in this clarity, we will see an increase not only in the safety of their applications but in their use, driving up productivity in both the public and private sectors.
My Lords, I said at the outset that I thought this was the beginning of a particular debate, and I was right, looking at the amendments coming along. The theme of the debate was touched on by the noble Baroness, Lady Bennett, when she talked about these amendments, in essence, being about keeping humans in the loop and the need for them to be able to review decisions. Support for that came from the noble Baroness, Lady Kidron, who made some important points. The point the BMA made about the risk of eroding trust cut to the heart of what we have been talking about all afternoon: trust in these processes.
The noble Lord, Lord Clement-Jones, talked about this effectively being the watering down of Article 22A, and the need for some core ethical principles in AI use and for the Government to ensure a right to human review. Clause 14 reverses the presumption in favour of that human reviewing process, so that solely automated decision-making will be more widely allowed, as the Minister argued.
However, I am not satisfied by the responses, and I do not think other Members of your Lordships’ Committee will be either. We need more safeguards. We have moved from one clear position to another, which can be described as watering down or shifting the goalposts; I do not mind which, but that is how it seems to me. Of course, we accept that there are huge opportunities for AI in the delivery of public services, particularly in healthcare and the operation of the welfare system, but we need to ensure that citizens in this country have a higher level of protection than the Bill currently affords them.
At one point I thought the Minister said that a solely automated decision was a rubber-stamped decision. To me, that gave the game away. I will have to read carefully what he said in Hansard, but that is how it sounded, and it really gets our alarm bells ringing. I am happy to withdraw my amendment, but we will come back to this subject from time to time and throughout our debates on the rest of the Bill.
My Lords, I will speak to my Amendment 48. By some quirk of fate, I failed to sign up to the amendments that the noble Lord, Lord Bassam, so cogently introduced. I would have signed up if I had realised that I had not, so to speak.
It is a pleasure to follow the noble Baroness, Lady Kidron. She has a track record of being extremely persuasive, so I hope the Minister pays heed in what happens between Committee and Report. I very much hope that there will be some room for manoeuvre and that there is not just permanent push-back, with the Minister saying that everything is about clarifying and us saying that everything is about dilution. There comes a point when we have to find some accommodation on some of these areas.
Amendments 48 and 49 are very similar—I was going to say, “Great minds think alike”, but I am not sure that my brain feels like much of a great mind at the moment. “Partly” or “predominantly” rather than “solely”, if you look at it the other way round, is really the crux of what I think many of us are concerned about. It is easy to avoid the terms of Article 22 just by slipping in some sort of token human involvement. Defining “meaningful” is so difficult in these circumstances. I am concerned that we are opening the door to something that could be avoided. Even then, the terms of the new clause—we will have a clause stand part debate on Wednesday, obviously—put all the onus on the data subject, whereas that was not the case previously under Article 22. The Minister has not really explained why that change has been made.
I conclude by saying that I very much support Amendment 41. This whole suite of amendments is well drafted. The point about the Equality Act is extremely well made. The noble Lord, Lord Holmes, also has a very good amendment here. It seems to me that involving the ICO right in the middle of this will be absolutely crucial—and we are back to public trust again. If nothing else, I would like explicitly to include that under Clause 14 in relation to Article 22 by the time this Bill goes through.
I thank noble Lords and the noble Baroness for their further detailed consideration of Clause 14.
Let me take first the amendments that deal with restrictions on and safeguards for ADM and degree of ADM. Amendment 41 aims to make clear that solely automated decisions that contravene any part of the Equality Act 2010 are prohibited. We feel that this amendment is unnecessary for two reasons. First, this is already the case under the Equality Act, which is reinforced by the lawfulness principle under the present data protection framework, meaning that controllers are already required to adhere to the Equality Act 2010. Secondly, explicitly stating in the legislation that contravening one type of legislation is prohibited—in this case, the Equality Act 2010—without referring to other legislation whose contravention is equally prohibited will lead to an inconsistent approach. As such, we do not believe that this amendment is necessary; I ask the noble Baroness, Lady Jones, to withdraw it.
Amendment 44 seeks to limit the conditions for special category data processing for this type of automated decision-making. Again, we feel that this is not needed given that a set of conditions already provides enhanced levels of protection for the processing of special category data, as set out in Article 9 of the UK GDPR. In order to lawfully process special category data, you must identify both a lawful basis under Article 6 of the UK GDPR and a separate condition for processing under Article 9. Furthermore, where an organisation seeks to process special category data under solely automated decision-making on the basis that it is necessary for contract, in addition to the Articles 6 and 9 lawful bases, they would also have to demonstrate that the processing was necessary for substantial public interest.
Similarly, Amendment 45 seeks to apply safeguards when processing special category data; however, these are not needed as the safeguards in new Article 22C already apply to all forms of processing, including the processing of special category data, by providing sufficient safeguards for data subjects’ rights, freedoms and legitimate interests. As such, we do not believe that these amendments are necessary; I ask the noble Baroness, Lady Jones, not to press them.
Can the Minister give me an indication of the level at which that kicks in? For example, say there is a child in a classroom and a decision has been made about their ability in a particular subject. Is it automatic that the parent and the child get some sort of read-out on that? I would be curious to know where the Government feel that possibility starts.
In that example, where a child was subject to a solely ADM decision, the school would be required to inform the child of the decision and the reasons behind it. The child and their parent would have the right to seek a human review of the decision.
We may come on to this when we get to edtech but a lot of those decisions are happening automatically right now, without any kind of review. I am curious as to why it is on the school whereas the person actually doing the processing may well be a technology company.
It may be either the controller or the processor, but for any legal or similarly significant decision that requirement already exists right now—today—before the Bill comes into effect. That requirement is retained by the Bill.
In line with ICO guidance, children need particular protection when organisations collect and process their personal data because they may be less aware of the risks involved. If organisations process children’s personal data they should think about the need to protect them from the outset and should design their systems and processes with this in mind. This is the case for organisations processing children’s data during solely automated decision-making, just as it is for all processing of children’s data.
Building on this, the Government’s view is that automated decision-making has an important role to play in protecting children online, for example with online content moderation. The current provisions in the Bill will help online service providers understand how they can use these technologies and strike the right balance between enabling the best use of automated decision-making technology and continuing to protect the rights of data subjects, including children. As such, we do not believe that the amendment is necessary; I ask the noble Baroness if she would be willing not to press it.
Amendments 48 and 49 seek to extend the Article 22 provisions to “predominantly” and “partly” automated decision-making. These types of processing already involve meaningful human involvement. In such instances, other data protection requirements, including transparency and fairness, continue to apply and offer relevant protections. As such, we do not believe that these amendments are necessary; I ask the noble Baroness, Lady Jones, and the noble Lord, Lord Clement-Jones, if they would be willing not to press them.
Amendment 50 seeks to ensure that the Article 22C safeguards will apply alongside, rather than instead of, the transparency obligations in the UK GDPR. I assure the noble Baroness, Lady Jones, that the general transparency obligations in Articles 12 to 15 will continue to apply and thus will operate alongside the safeguards in the reformed Article 22. As such, we do not believe that this amendment is necessary; I ask the noble Baroness if she would be willing not to press it.
The changes proposed by Amendment 52A are unnecessary as Clause 50 already provides for an overarching requirement for the Secretary of State to consult the ICO and other persons that the Secretary of State considers appropriate before making regulations under the UK GDPR, including for the measures within Article 22. Also, any changes to the regulations are subject to the affirmative procedure so must be approved by both Houses of Parliament. As with other provisions of the Bill, the ICO will seek to provide organisations with timely guidance and support to assist them in interpreting and applying the legislation. As such, we do not believe that this amendment is necessary and, if he were here, I would ask my noble friend Lord Holmes if he would be willing not to press it.
Amendments 98A and 104A are related to workplace rights. Existing data protection legislation and our proposed reforms provide sufficient safeguards for automated decision-making where personal data is being processed, including in workplaces. The UK’s human rights law, and existing employment and equality laws, also ensure that employees are informed and consulted about any workplace developments, which means that surveillance of employees is regulated. As such, we do not believe that these amendments are necessary and I ask the noble Baroness not to move them.
I hear what the Minister said about the workplace algorithmic assessment. However, if the Government believe it is right to have something like an algorithmic recording standard in the public sector, why is it not appropriate to have something equivalent in the private sector?
I would not say it is not right, but if we want to make the ATRS a standard, we should make it a standard in the public sector first and then allow it to be adopted as a means for all private organisations using ADM and AI to meet the transparency principles that they are required to observe.
So would the Minister not be averse to it? It is merely so that the public sector is ahead of the game, allowing it to show the way, and then there may be a little bit of regulation for the private sector.
I am not philosophically averse to such regulation. As to implementing it in the immediate future, however, I have my doubts about that possibility.
My Lords, this has been an interesting and challenging session. I hope that we have given the Minister and his team plenty to think about—I am sure we have. A lot of questions remain unanswered, and although the Committee Room is not full this afternoon, I am sure that colleagues reading the debate will be studying the responses that we have received very carefully.
I am grateful to the noble Baroness, Lady Kidron, for her persuasive support. I am also grateful to the noble Lord, Lord Clement-Jones, for his support for our amendments. It is a shame the noble Lord, Lord Holmes, was not here this afternoon, but I am sure we will hear persuasively from him on his amendment later in Committee.
The Minister is to be congratulated for his consistency. I think I heard the phrase “not needed” or “not necessary” pretty constantly this afternoon, but particularly with this group of amendments. He probably topped the lot with his response on the Equality Act on Amendment 41.
I want to go away with my colleagues to study the responses to the amendments very carefully. That being said, however, I am happy to withdraw Amendment 41 at this stage.