Data Protection and Digital Information Bill Debate
Baroness Harding of Winscombe (Conservative - Life peer)
Grand Committee
My Lords, as is so often the case on these issues, it is daunting to follow the noble Baroness as she has addressed the issues so comprehensively. I speak in support of Amendment 57, to which I have added my name, and register my support for my noble friend Lord Holmes’s Amendment 59A, but I will begin by talking about the Clause 14 stand part notice.
Unfortunately, I was not able to stay for the end of our previous Committee session so I missed the last group on automated decision-making; I apologise if I cover ground that the Committee has already covered. It is important to start by saying clearly that I am in favour of automated decision-making and the benefits that it will bring to society in the round. I see from all the nodding heads that we are all in the same place—interestingly, my Whip is shaking his head. We are trying to make sure that automated decision-making is a force for good and to recognise that anything involving human beings—and automated decision-making does involve them, because human beings create it—has the potential for harm as well. Creating the right guard-rails is really important.
Like the noble Baroness, Lady Kidron, until I understood the Bill a bit better, I mistakenly thought that the Government’s position was not to regulate AI. But that is exactly what we are doing in the Bill, in the sense that we are loosening regulation and broadening the ability to make use of automated decision-making. While that may be the right answer, I do not think we have thought about it in enough depth or scrutinised it in enough detail. There are so few of us here; I do not think we quite realise the scale of the impact of this Bill and this clause.
I too feel that the clause should be removed from the Bill—not because it might not ultimately be the right answer but because this is something that society needs to debate fully and comprehensively, rather than it sneaking into a Bill that not enough people, either in this House or the other place, have really scrutinised.
I assume I am going to lose that argument, so I will briefly talk about Amendment 57. Even if the Government remain firm that there is “nothing to see here” in Clause 14, we know that automated decision-making can do irreparable harm to children. Those of us who have worked on child internet safety—most of us for at least a decade—regret that we failed to get greater protections in place earlier. We know of the harm done to children because there have not been the right guard-rails in the digital world. We must have debated together for hours and hours why the harms in the algorithms of social media were not expressly set out in the Online Safety Act. This is the same debate.
It is really clear to me that it should not be possible to amend the rules on automated decision-making in any way that reduces protections for children. Those protections were hard fought for and ensure a higher bar for children’s data. This is a classic example of where the Bill reduces them, unless we are absolutely explicit. If we are unable to persuade the Government to remove Clause 14, it is essential that the Bill is explicit that the Secretary of State does not have the power to reduce data protection for children.
My Lords, I speak in favour of the clause stand part notice in my name and that of the noble Lord, Lord Clement-Jones.
I reject the characterisation of Clause 14, or any part of the Bill, as loosening the safeguards. It focuses on outcomes and, by being less prescriptive and more adaptive, its goal is to heighten the levels of safety of AI, whether through privacy or anything else. That is its purpose.
On Secretary of State powers in relation to ADM, the reforms will enable the Government to further describe what is and is not to be taken as a significant effect on a data subject and what is and is not to be taken as meaningful human—
I may be tired or just not very smart, but I am not really sure that I understand how being less prescriptive and more adaptive can heighten safeguards. Can my noble friend the Minister elaborate a little more and perhaps give us an example of how that can be the case?
Certainly. Being prescriptive and applying one-size-fits-all measures to all processes covered by the Bill encourages organisations merely to follow a process, whereas focusing on outcomes encourages organisations to take better ownership of those outcomes and to pursue the privacy and safety mechanisms that are optimal for them. That message came out very strongly in the Data: A New Direction consultation. Indeed, in the debate on a later group we will discuss the use of senior responsible individuals rather than data protection officers, which is a good example of removing prescriptiveness to enhance adherence to the overall framework and enhance safety.
My Lords, I apologise for not being here on Monday, when I wanted to speak about automated decision-making. I was not sure which group to speak on today; I am thankful that my noble friend Lord Harlech intervened to ensure that I spoke on this group and made my choice much easier.
I want to speak on Amendments 74 to 77 because transparency is essential. However, one of the challenges of transparency is ensuring that you understand what you are reading. I will give noble Lords a quick example: when I was in the Department of Health and Social Care, we had a scheme called the voluntary pricing mechanism for medicines. Companies would ask whether it could be changed and whether there could be a different relationship, because they felt that they were not getting enough value from it. I said to the responsible person in the department, “I did engineering and maths, so can you send me a copy of the algorithm?” He sent it to me, and it was 100 pages long. I said, “Does anyone understand this algorithm?”, and he said, “Oh yes, the analysts do”. I was about to get a meeting, but then I was moved to another department. That shows that, even if we ask for transparency, we have to make sure that we understand what we are being given. As the noble Lord, Lord Clement-Jones, has worded this, we have to make sure that we understand the functionality and what it does at a high enough level.
My noble friend Lady Harding often illustrates her points well with short stories. I am going to do that briefly with two very short stories. I promise to keep well within the time limit.
A few years ago, I was on my way to a flight to Strasbourg because I was a Member of the European Parliament. My train got stuck, and I missed my flight. My staff booked me a new ticket and sent me the boarding pass. I got to the airport, got through the gate, which was fantastic, and was waiting for my flight in the waiting area. They called to start boarding and, when I went to board, they scanned my pass again and I was denied boarding. I asked why I was denied, having been let into the gate area in the first place, but no one could explain why. To cut a long story short, over two hours, four or five people from that company gaslighted me. Eventually, when I got back to the check-in desk—which the technology was supposed to avoid in the first place—it was explained that they had sent me an email the day before. In fact, they had not sent me an email the day before, which they admitted the day after, but no one ever explained why I was not allowed on that flight.
Imagine that in the public sector. I can accept it, although it was awful behaviour by that company, but imagine that happening for a critical operation that had been automated to cut down on paperwork. Imagine turning up for your operation when you are supposed to scan your barcode to be let into the operating theatre. What happens if there is no accountability or transparency in that case? This is why the amendments tabled by the noble Lord, Lord Clement-Jones, are essential.
Here is another quick story. A few years ago, someone asked me whether I was going to apply for an account with one of these new fintech banks. I submitted the application and the bank said that it would get back to me within 48 hours. It did not. Two weeks later, I got a message on the app saying that I had been rejected, that I would not be given an account and that “by law, we do not have to explain why”.
Can you imagine that same technology being used in the public sector, with a WYSIWYG on the fantastic NHS app that we have now? Imagine booking an appointment then suddenly getting a message back saying, “Your appointment has been denied but we do not have to explain why”. These Amendments 74 to 78 must be given due consideration by the Government because it is absolutely essential that citizens have full transparency on decisions made through automated decision-making. We should not allow the sort of technology that was used by easyJet and Monzo in this case to permeate the public sector. We need more transparency—it is absolutely essential—which is why I support the amendments in the name of the noble Lord, Lord Clement-Jones.
My Lords, I associate myself with the comments that my noble friend Lord Kamall just made. I have nothing to add on those amendments, as he eloquently set out why they are so important.
In the spirit of transparency, my intervention enables me to point out, were there any doubt, who I am, as opposed to the noble Baroness, Lady Bennett, who was not here earlier but for whom I was mistaken. Obviously, we are not graced with the presence of my noble friend Lord Maude, but I am sure that we all know what he looks like as well.
I will speak to two amendments. The first is Amendment 144, to which I have added my name. As usual, the noble Baroness, Lady Kidron, has said almost everything that can be said on this but I want to amplify two things. I have yet to meet a politician who does not get excited about the two-letter acronym that is AI. The favoured statement is that it is as big a change in the world as the discovery of electricity or the invention of the wheel. If it is that big—pretty much everyone in the world who has looked at it probably thinks it is—we need properly to think about the pluses and the minuses of the applications of AI for children.
The noble Baroness, Lady Kidron, set out really clearly why children are different. I do not want to repeat that, but children are different and need different protections; this has been established in the physical world for a very long time. With this new technology that is so much bigger than the advent of electricity and the creation of the first automated factories, it is self-evident that we need to set out how to protect children in that world. The question then is: do we need a separate code of practice on children and AI? Or, as the noble Baroness set out, is this an opportunity for my noble friend the Minister to confirm that we should write into this Bill, with clarity, an updated age-appropriate design code that recognises the existence of AI and all that it could bring? I am indifferent on those two options but I feel strongly that, as we have now said on multiple groups, we cannot just rely on the wording in a previous Act, which this Bill aims to update, without recognising that, at the same time, we need to update what an age-appropriate design code looks like in the age of AI.
The second amendment that I speak to is Amendment 252, on the open address file. I will not bore noble Lords with my endless stories about the use of the address file during Covid, but I lived through and experienced the challenges of this. I highlight an important phrase in the amendment. Proposed new subsection (1) says:
“The Secretary of State must regularly publish a list of UK addresses as open data to an approved data standard”.
One reason why it is a problem for this address data to be held by an independent private company is that the quality of the data is not good enough. That is a real problem if you are trying to deliver a national service, whether in the public sector or the private sector. If the data quality is not good enough, it leaves us substantially poorer as a country. This is a fundamental asset for the country and a fundamental building block of our geolocation data, as the noble Lord, Lord Clement-Jones, set out. Anybody who has tried to build a service that delivers things to human beings in the physical world knows that errors in the database can cause huge problems. It might not feel like a huge problem if it concerns your latest Amazon delivery but, if it concerns the urgent dispatch of an ambulance, it is life and death. Maintaining the accuracy of the data and holding it close as a national asset is therefore hugely important, which is why I lend my support to this amendment.
My Lords, the noble Lord, Lord Clement-Jones, has, as ever, ably introduced his Amendments 74, 75, 76, 77 and 78, to the first of which the Labour Benches have added our name. We broadly support all the amendments, but in particular Amendment 74. We also support Amendment 144, which was tabled by the noble Baroness, Lady Kidron, and co-signed by the noble Baroness, Lady Harding, the noble Lord, Lord Clement-Jones, and my noble friend Lady Jones.
Amendments 74 to 78 cover the use of the Government’s Algorithmic Transparency Recording Standard—the ATRS. We heard a fair bit about this in Committee on Monday, when the Minister prayed it in aid during debates on Clause 14 and Article 22A. The noble Lord, Lord Clement-Jones, outlined its valuable work, which I think everyone in the Committee wants to encourage and see writ large. These amendments seek to aid the transparency that the Minister referred to by requiring public bodies to publish reports where algorithmic tools have a significant influence on the decision-making process. The amendments also seek to oblige the Secretary of State to ensure that public bodies, government departments and contractors using public data have a compulsory transparency reporting scheme in place. Finally, the amendments would create impact assessments and root ADM processes in public services in practices that minimise harm and are fair and non-discriminatory in their effect.
The noble Lord, Lord Kamall, made some valuable points about the importance of transparency. His two stories were very telling. It is only right that we have that transparency for the public service and in privately provided services. I think the Minister would be well advised to listen to him.
The noble Lord, Lord Clement-Jones, also alighted on the need for government departments to publish reports under the ATRS in line with their position as set out in the AI regulation White Paper consultation process and response. This would put it on a legislative basis, and I think that is fairly argued. The amendments would in effect create a statutory framework for transparency in the public service use of algorithmic tools.
We see these amendments as forming part of the architecture needed to begin building a place of trust around the increased use of ADM and the introduction of AI into public services. Like the Government and everyone in this Committee, we see all the advantages, but take the view that we need to take the public with us on this journey. If we do not do that, we act at our peril. Transparency, openness and accountability are key to securing trust in what will be something of a revolution in how public services are delivered and procured in the future.
We also support Amendment 144 in the name of the noble Baroness, Lady Kidron, for the very simple reason that, in the development of AI technology, we should hardwire into practice and procedure the principle that, where the technology affects the interests of children, higher standards should apply. This has been a constant theme in our Committee deliberations and our approach to child protection. In her earlier speech, the noble Baroness, Lady Harding, passionately argued for the need to get this right. We have been found wanting over the past decade in that regard, and now is the moment to put that right and begin to move on this policy area.
The noble Baroness, Lady Kidron, has made the argument for higher standards of protection for children persuasively throughout our deliberations, and a code of practice makes good sense. As the noble Baroness, Lady Harding, said, it can be either stand-alone or integrated. In the end, it matters little, but having it there setting the standard is critical to getting this policy area in the right place. The amendment sets out with admirable clarity the detail that the commissioner must cover, so that data processors always prioritise children’s interests and fundamental rights in their thinking. I am sure that is something that the whole Committee broadly supports.
My Lords, I will speak to almost all the amendments in this group, other than those proposed by the noble Baroness, Lady Kidron. I am afraid that this is a huge group; we probably should have split it to have a better debate, but that is history.
I very much support what the noble Baroness said about her amendments, particularly Amendment 79. The mandation of ethics by design is absolutely crucial. There are standards from organisations such as the IEEE for that kind of ethics by design in AI systems. I believe that it is possible to do exactly what she suggested, and we should incorporate that into the Bill. It illustrates that process is as important as outcomes. We are getting to a kind of philosophical approach here, which illustrates the differences between how some of us and the Government are approaching these things. How you do something, the way you design it and the fact that it needs to be ethical are absolutely cardinal in any discussion—particularly about artificial intelligence. I do not think that it is good enough simply to talk about the results of what AI does without examining how it does it.
Having said that, I turn to Amendment 80 and the Clause 16 stand part notice. Under Clause 16, the Government are proposing to remove Article 27 of the UK GDPR without any replacement. By removing the legal requirement on non-UK companies to retain a UK representative, the Government would deprive individuals of a local, accessible point of contact through which they can make data protection rights requests. That decision threatens people’s capacity to exercise their rights, reducing their ability to remain in control of their personal information.
The Government say that removing Article 27 will boost trade with the UK by reducing the compliance burden on non-UK businesses. But they have produced little evidence to support the notion that this will be the case and have overlooked the benefits in operational efficiency and cost savings that the representative can bring to non-UK companies. Even more worryingly, the Government appear to have made no assessment of the impact of the change on UK individuals, in particular vulnerable groups such as children. It is an ill-considered policy decision that would see the UK take a backward step in regulation at a time when numerous other jurisdictions, such as Switzerland, Turkey, South Korea, China and Thailand, are choosing to safeguard the extraterritorial application of their data protection regimes through the implementation of the legal requirement to appoint a representative.
The UK representative ensures that anyone in the UK wishing to make a privacy-related request has a local, accessible point of contact through which to do so. The representative plays a critical role in helping people to access non-UK companies and hold them accountable for the processing of their data. The representative further provides a direct link between the ICO and non-UK companies to enable the ICO to enforce the UK data protection regime against organisations outside the UK.
On the trade issue, the Government argue that by eliminating the cost of retaining a UK representative, non-UK companies will be more inclined to offer goods and services to individuals in the UK. Although there is undeniably a cost to non-UK companies of retaining a representative, the costs are significantly lower than the rather disproportionately inflated figures that were cited in the original impact assessment, which in some cases were up to 10 times the average market rate for representative services. The Government have put forward very little evidence to support the notion that removing Article 27 will boost trade with the UK.
There is an alternative approach. Currently, the Article 27 requirement to appoint a UK representative applies to data controllers and processors. An alternative approach to the removal of Article 27 in its entirety would be to retain the requirement but limit its scope so that it applies only to controllers. Along with the existing exemption at Article 27(2), this would reduce the number of non-UK companies required to appoint a representative, while arguably still preserving a local point of contact through which individuals in the UK can exercise their rights, as it is data controllers that are obliged under Articles 15 to 22 of the UK GDPR to respond to data subject access requests. That is a middle way that the Government could adopt.
Moving to Amendment 82, at present, the roles of senior responsible individual in the Bill and data protection officer under the EU GDPR appear to be incompatible. That is because the SRI is part of the organisation’s senior management, whereas a DPO must be independent of an organisation’s senior management. This puts organisations caught by both the EU GDPR and the UK GDPR in an impossible situation. At the very least, the Government must explain how they consider that these organisations can comply with both regimes in respect of the SRI and DPO provisions.
The idea of getting rid of the DPO runs completely contrary to the way in which we need to think about accountability for AI systems. We need senior management who understand the corporate significance of the AI systems they are adopting within the business. The ideal way forward would be for the DPO to be responsible for that when AI regulation comes in, but the Government seem to be completely oblivious to that. Again, it is highly frustrating for those of us who thought we had a pretty decent data protection regime to find this kind of watering down taking place in the face of the risks from artificial intelligence that are becoming more and more apparent as the days go by. I firmly believe that it will inhibit the application and adoption of AI within businesses if we do not have public trust and business certainty.
I now come to oppose the question that Clause 18, on the duty to keep records, stand part of the Bill. This clause masquerades as an attempt to get rid of red tape. In reality, it makes organisations less likely to be compliant with the main obligations in the UK GDPR, as it will be amended by the Bill, and therefore heightens the risk both to the data subjects whose data they hold and to the organisations themselves in terms of non-compliance. It is particularly unfair on small businesses that do not have the resources to take advice on these matters. Records of processing activities are one of the main ways in which organisations can meet the requirement of Article 5(2) of the UK GDPR to demonstrate their compliance. The obligation to demonstrate compliance remains unaltered under the Bill. Therefore, dispensing with the main way of achieving compliance with Article 5(2) is impractical and unhelpful.
At this point, I should say that we support Amendment 81 in the name of the noble Baroness, Lady Jones, which concerns the assessment of high-risk processing.
Our amendments on data protection impact assessments are Amendments 87, 88 and 89. Such assessments are currently required under Article 35 of the UK GDPR and are essential to ensuring that organisations do not deploy, and individuals are not subjected to, systems that may lead to unlawful, rights-violating or discriminatory outcomes. The Government’s data consultation response noted:
“The majority of respondents agreed that data protection impact assessments requirements are helpful in identifying and mitigating risk, and disagreed with the proposal to remove the requirement to undertake data protection impact assessments”.
However, under Clause 20, the requirement to perform an impact assessment would be seriously diluted. That is all I need to say. The Government frequently pray the consultation in aid—they say, “Well, we did that because of the consultation”—so why are they flying in the face of it here? That seems an extraordinary thing to do when impact assessments are regarded as a useful tool and business has clearly adjusted to them in the years since the Data Protection Act 2018.
My Lords, I rise to speak in support of Amendments 79, 83, 85, 86, 93, 96, 97, 105 and 107, to which I have added my name. An awful lot has already been said. Given the hour of the day, I will try to be brief, but I want to speak to the child amendments to which I have put my name and to the non-child ones, and to raise things up a level.
The noble Lord, Lord Clement-Jones, talked about trust. I have spent the best part of the past 15 years running digitally enabled consumer and citizen services. The benefit that technology brings to life is clear to me but—this is a really important “but”—our customers and citizens need to trust what we do with their data, so establishing trust is really important.
One of the bedrocks of that trust is forcing—as a non-technologist, I use that word advisedly—technologists to set out what they are trying to do, what the technology they propose to build will do, and what the risks and opportunities of that technology are. My experience as a non-engineer is that, when you put engineers under pressure, they can speak English, but it is not their preferred language. They do not find it easy to articulate the risks and opportunities of the technology they are building, which is why forcing businesses that build these services to set out in advance the data protection impacts of those services is so important. It is also why you have to design with safety in mind up front, because technology is so hard to retrofit. If you do not design it up front with ethics and safety at its core, the opportunity is gone by the time you see the impact in the real world.