Grand Committee

My Lords, unusually, I rise to move an amendment, Amendment 138. For the second time in Committee, I find myself heading a group when I know that the noble Baroness, Lady Kidron, will be much better qualified to introduce the subject. Indeed, she has an amendment, Amendment 141, which is far preferable in many ways to mine.
Amendment 138 is designed to ensure that the Information Commissioner produces a code of practice specific to children—up to the age of 18 for the purposes of UK law and Convention 108—and to pupils as defined by the Education Act 1996, who in the education sector may be up to the age of 19 or, with special educational needs, up to 25. The charity Data, Tech & Black Communities put it this way in a recent letter to the noble Baroness, Lady Jones:
“We recently completed a community research project examining the use of EdTech in Birmingham schools. This project brought us into contact with over 100 people … including parents, school staff and community members. A key finding was the need to make it easier for those with stewardship responsibility for children’s data, to fulfil this duty. Even with current data protection rights, parents and guardians struggle to make inquiries (of schools, EdTech companies and even DfE) about the purpose behind the collection of some of their children’s data, clarity about how it is used (or re-used) or how long data will be retained for. ‘Opting out’ on behalf of their children can be just as challenging. All of which militates against nuanced decision-making about how best to protect children’s short and long-term interests … This is why we are in support of an ICO Code of Practice for Educational Settings that would enable school staff, parents and learners, the EdTech industry and researchers to responsibly collect, share and make use of children’s data in ways that support the latter’s agency over their ‘digital selves’ and more importantly, will support their flourishing”.
The duties of settings and data processors, and the rights appropriate to the stage of education and to children’s capacity, need clarity and consistency. Staff need confidence to access and use data appropriately within the law. As the UNCRC’s General Comment No. 16 (2013) on State Obligations Regarding the Impact of the Business Sector on Children’s Rights set out over a decade ago,
“the realization of children’s rights is not an automatic consequence of economic growth and business enterprises can also negatively impact children’s rights”.
The educational setting differs from a purely commercial interaction, and not only because the data subjects are children. It is more complex because of the disempowered environment and the imbalance of power between the authority, the parents and the child. A further complication is that parents’ and children’s rights are interlinked, as exemplified in the right to education described in UDHR Article 26(3), which states:
“Parents have a prior right to choose the kind of education that shall be given to their children.”
A code is needed because the explicit safeguards that the GDPR requires in several places are missing, having been left out of the drafting of the UK Data Protection Act 2018. Clause 80 of the Bill—“Automated decision-making”—does not address the safeguards for children required by GDPR Article 23(1). Furthermore, removing the protections of the balancing test under the recognised legitimate interest condition will create new risks. The clauses on additional further processing and on changes to purpose limitation are inappropriately wide without child-specific safeguards. The volume, sensitivity and intrusiveness of identifying personal data collected in educational settings only ever increases, while the protections are only ever reduced.
Obligations specific to children’s data, especially
“solely automated decision-making and profiling”
and exceptions, need to be consistent, with clear safeguards by design where they restrict fundamental freedoms. What does that mean in practice for children, where teachers are assumed to be the rights bearers in loco parentis? The need for compliance with human rights, security, and health and safety standards, among others, proportionate to the risks of data processing and respecting the UK Government’s accessibility requirements, should be self-evident and adopted in a code of practice, as recommended in the 5Rights Digital Futures Commission’s blueprint for educational data governance.
The Council of Europe Strategy for the Rights of the Child (2022-2027) and the UNCRC General Comment No. 25 on Children’s Rights and the Digital Environment make it clear that
“children have the right to be heard and participate in decisions affecting them”.
They recognise that
“capacity matters, in accordance with their age and maturity. In particular attention should be paid to empowering children in vulnerable situations, such as children with disabilities.”
Paragraph 75 recognises that surveillance in educational settings should not take place without the right to object and that teachers need training to keep up with technological developments.
Young people themselves were not invited to participate in the development of this Bill, and their views have not been considered. However, a small sample of parent and pupil voices was captured in the Responsible Technology Adoption Unit’s public engagement work with the DfE in 2024. The findings back those of Defend Digital Me’s Survation poll in 2018 and show that parents do not know that the DfE already holds named pupil records without their knowledge or permission, or that the data is given away to be reused by hundreds of commercial companies, the DWP, the Home Office and the police. The report stated:
“There was widespread consensus that work and data should not be used without parents’ and/or pupils’ explicit agreement. Parents, in particular, stressed the need for clear and comprehensive information about pupil work and data use and any potential risks relating to data security and privacy breaches.”
A code of practice is needed to explain the law and make it work as intended for everyone. The aim of a code of practice for educational settings would be that adherence to it creates a mechanism for controllers and processors to demonstrate compliance with the legislation or with approved certification methods. It would give providers confidence in consistent and clear standards and would be good for the edtech sector. It would allow children, parents, school staff and systems administrators to build trust in safe, fair and transparent practice so that their rights are freely met by design and default.
Further, schools give children’s personal data to many commercial companies during a child’s education—not on the basis of consent but on the assumption that it is necessary for the performance of a task carried out in the public interest. A code should clarify the boundaries of this lawful basis for commercial purposes, where there is an obligation on parents to provide the data, and what this means for the child on reaching maturity or after leaving the educational setting.
Again, a code should help companies understand “data protection by design and default” in practice, what amounts to a “significant legal effect”, the edges of “public interest” in data transfers to a third country, and how special categories of data affect children in schools. A code should also support children and families in understanding the effect of the responsibilities of controllers and processors on the exercise or limitation of their own rights. It would set out the responsibilities of software platforms that profile users’ metadata to share with third parties, or of commercial apps signed up for in schools that serve adverts in use.
I hope that I have explained exactly why we believe that a code of conduct is required in educational settings. I beg to move.
My Lords, I support and have added my name to Amendment 138 in the name of the noble Lord, Lord Clement-Jones. I will also speak to Amendment 141 in my name and those of the noble Lords, Lord Knight and Lord Russell, and the noble Baroness, Lady Harding.
Both these amendments propose a code of practice to address the use of children’s data in the context of education; indeed, they have much in common. Having heard the noble Lord, Lord Clement-Jones, I also have much in common with what he said. I associate myself entirely with his remarks and hope that mine will build on them. Both amendments point to the same problem: children’s data is scandalously treated in our schools, and educators need support. This is a persistent and known failure that both the DfE and the ICO have failed to confront over a period of some years.
Amendment 141 seeks to give a sense of exactly what an education code should cover. In doing so, it builds on the work of the aforementioned Digital Futures for Children centre at the LSE, which I chair, the work of Defend Digital Me, the excellent work of academics at UCL, and much of the work relating to education presented to the UN tech envoy in the course of drafting the UN global digital compact.
Subsection (1) of the proposed new clause would require the ICO to prepare a code of practice in connection with the provision of education. Subsection (2) sets out what the ICO would have to take into account, such as that education provision includes school management and safeguarding as well as learning; the different settings in which it takes place; the need for transparency and evidence of efficacy; and all the issues already mentioned, including profiling, transparency, safety, security, parental involvement and the provision of counselling services.
Subsection (3) would require the ICO to have regard to children’s entitlement to a higher standard of protection—which we are working so hard in Committee to protect—their rights under the UNCRC and their different ages and stages of development. Importantly, it also refers to the need and desire to support innovation in education and the need to ensure that the benefits derived from the use of UK children’s data accrue to the UK.
Subsection (4) lists those whom the commissioner would have to consult, and subsection (5) sets out when data processors and controllers would be subject to the code. Subsection (6) proposes a certification scheme for edtech services to demonstrate compliance with UK GDPR and the code. Subsection (7) would require edtech service and product providers to evidence compliance—importantly, transferring that responsibility from schools to providers. Subsection (8) simply defines the terms.
A code of practice is an enabler. It levels the playing field, sets terms for innovators, creates sandbox or research environments, protects children and supports schools. It offers a particularly attractive environment for developing the better digital world that we would all like to see, since schools are identifiable communities in which changes and outcomes could be measured.
My Lords, Amendment 138 tabled by the noble Lord, Lord Clement-Jones, and Amendment 141, tabled by the noble Baroness, Lady Kidron, and the noble Lord, Lord Knight, would both require the ICO to publish a code of practice for controllers and processors on the processing of personal data by educational technologies in schools.
I say at the outset that I welcome this debate and the contributions of noble Lords on this important issue. As various noble Lords have indicated, civil society organisations have also been contacting the Department for Science, Innovation and Technology and the Department for Education directly to highlight their concerns about this issue. It is a live issue.
I am grateful to my noble friend Lord Knight, who talked about some of the important and valuable contributions that technology can make in supporting children’s development and guiding teaching interventions. We have to get the balance right, but we understand and appreciate that schoolchildren, parents and schoolteachers must have the confidence to trust the way that services use children’s personal data. That is at the heart of this debate.
There is a great deal of work going on on this issue, some of which noble Lords have referred to. The Department for Education is already exploring ways to engage with the edtech market to reinforce the importance of evidence-based, quality products and services in education. On my noble friend Lord Knight’s comments on AI, the Department for Education is developing a framework outlining safety expectations for AI products in education and creating resources for teachers and leaders on safe AI use.
I recognise why noble Lords consider that a dedicated ICO code of practice could help ensure that schools and edtech services are complying with data protection legislation. The Government are open-minded about exploring the merits of this further with the ICO, but it would be premature to include these requirements in the Bill. As I said, there is a great deal of work going on and the findings of the recent ICO audits of edtech service providers will help to inform whether a code of practice is necessary and what services should be in scope.
I hope that we will bear that in mind and engage on it. I would be happy to continue discussions with noble Lords, the ICO and colleagues at the Department for Education, outside of the Bill’s processes, about the possibility of future work on this, particularly as the Secretary of State has powers under the Data Protection Act 2018 to require the ICO to produce new statutory codes, as noble Lords know. Considering the explanation that I have given, I hope that the noble Lord, Lord Clement-Jones, will consider withdrawing his amendment at this stage.
My Lords, I thank the Minister for her response and all speakers in this debate. On the speech from the noble Lord, Lord Knight, I entirely agree with the Minister and the noble Viscount, Lord Camrose, that it is important to remind ourselves about the benefits that can be achieved by AI in schools. The noble Lord set out a number of those. The noble Lord, Lord Russell, also reminded us that this is not a purely domestic issue; it is international across the board.
However, all noble Lords reminded us of the disbenefits and risks. In fact, the noble Lord, Lord Knight, used the word “dystopian”, which was quite interesting, although he gets very close to science fiction sometimes. He said that
“we have good reason to be concerned”,
particularly because of issues such as the national pupil database, where the original purpose may not have been fulfilled and was, in many ways, changed. He gave an example of procurement during Covid, where the choice was either Google or Microsoft—Coke or Pepsi. That is an issue across the board in competition law, as well.
There are real issues here. The noble Lord, Lord Russell, put it very well when he said that there is any number of pieces of guidance for schools but it is important to have a code of conduct. We are all, I think, on the same page in trying to find—in the words of the noble Baroness, Lady Kidron—a fairer and more equitable set of arrangements for children in schools. We need to navigate our way through this issue; of course, organisations such as Defend Digital Me and 5rights are seriously working on it.
My Lords, it is a pleasure to take part in today’s Committee proceedings. In doing so, I declare my technology interests as set out in the register, not least as an adviser to Socially Recruited, an AI business. In moving Amendment 156A, I will also speak to Amendment 156B, and I thank the noble Lord, Lord Clement-Jones, for co-signing them.
We live in extraordinarily uncertain times, domestically and internationally. In many ways, it has always been thus. However, things are different and have accelerated, not least in the last two decades, because of the online environment and the digital selves that we find ourselves interacting with in a world that is ever changing moment by moment. These amendments seek to update an important statute that governs critical elements of how cybersecurity professionals in this nation seek to keep us all safe in these extraordinarily difficult times.
The Computer Misuse Act 1990 was introduced to defend telephony exchanges at a time when 0.5% of us were online. If that was the purpose of the statute when it was passed, that alone would suggest that it needs an update. Who among us would use our smartphone if we had had it for 34 years? Well, we could not—the iPhone has been around only since 2007. This whole world has changed profoundly in the last 20 years, never mind the last 34. It is not just that the Act needs to be updated because it falls short of how society and technology have changed in the intervening years; it needs, desperately and urgently, to be updated because, for want of amendment, it is currently putting every citizen in this nation at risk. This is the purpose of Amendments 156A and 156B.
The Computer Misuse Act 1990 is not only out of date but inadvertently criminalising the cybersecurity professionals we charge with the job of keeping us all safe. They oftentimes work, understandably, under the radar, behind not just closed but locked doors, doing such important work. Yet, for want of these amendments, they are doing that work, all too often, with at least one hand tied behind their back.
Let us take just two examples: vulnerability research, and threat intelligence assessment and analysis. Both could see a cybersecurity professional fall foul of the provisions of the CMA 1990. Do not take my word for it: look to the 2024 annual report of the National Cyber Security Centre, which rightly and understandably highlights the increasing gap between the threats we face and its ability, and that of the community of cybersecurity professionals, to meet those threats.
These amendments, in essence, perform one simple but critical task: to afford a legal defence for legitimate cybersecurity activities. That is all, but it would have such a profound impact for those whom we have asked to keep us safe and for the safety they can thus deliver to every citizen in our society.
Where is the Government’s work on updating the Computer Misuse Act 1990 in this respect? Will the Government take this opportunity to accept these amendments? Do they believe that these amendments would provide a materially positive benefit to our cybersecurity professionals and thus to our nation, and, if so, why would they not take this first opportunity to enact these amendments to this data Bill?
It is not merely time; it is well past time for these amendments to become part of our law. If not now, when? If not these amendments, which amendments? If the Government do not accept these amendments, what will they say to all those people who will continue to be put in harm’s way for want of these protective provisions being passed? It is time to pass these amendments and give our cybersecurity professionals the tools they need. It is time, from the legislative perspective, to keep them safe so that they can do the self-same thing for all of us. It is time to cyber up. I beg to move.
My Lords, I was delighted to see these amendments tabled by the noble Lord, Lord Holmes. He, the noble Lord, Lord Arbuthnot, and I, along with many other parliamentarians, have long argued for changes to the Computer Misuse Act. For context, the original Act was created largely in response to a famous incident in which professional hackers and a technology journalist broke into British Telecom’s Prestel system in the mid-1980s. The Bill received Royal Assent in June 1990, barely two months after Tim Berners-Lee and CERN made the world wide web publicly available for the first time. Who remembers Prestel? Perhaps this is the wrong House in which to ask that question.
As the noble Lord, Lord Holmes, explained, there is no statutory public interest defence in the Act. This omission creates a legal risk for cybersecurity researchers and professionals conducting legitimate activities in the public interest. The Post Office Horizon scandal demonstrated how critical independent computer system investigation is for uncovering systemic problems and highlighted the need for protected legal pathways for researchers and investigators to examine potentially flawed systems.
I am delighted that the noble Lord, Lord Vallance, is here for this set of amendments. His Pro-innovation Regulation of Technologies Review explicitly recommends incorporating such a defence to provide stronger legal protections for cybersecurity researchers and professionals engaged in threat intelligence research. This recommendation was rooted in the understanding that such a defence would have, it said,
“a catalytic effect on innovation”
within the UK’s cybersecurity sector, which possesses “considerable growth potential”.
My Lords, I rise briefly but strongly to support my noble friend Lord Holmes. The CyberUp campaign has been banging this drum for a long time now. I remember taking part in the debates in another place on the Computer Misuse Act 34 years ago. It was the time of dial-up modems, fax machines and bulletin boards. This is the time to act, and it is the opportunity to do so.
My Lords, we ought to be mindful of this and congratulate the noble Lord on having been named parliamentarian of the year as a result of his campaigning activities.
Could the Minister say a few words on some of those points of discourse and non-consensus, to give the Committee some flavour of the type of issues where there is no consensus as well as the extent of the gap between some of those perspectives?
Just to follow up, have the Government formally responded to the original review from the noble Lord, Lord Vallance? That would be very helpful as well, in unpacking what were clearly extremely well-informed recommendations. It should, no doubt, be taken extremely seriously.
Yes, the Government accepted the recommendations in full.
Before the Minister sits down or stands up or whatever the appropriate phrase should be, I very much hope that, since the previous Government gave that indication, this Government will take that as a spur to non-glacial progress. I hope that at least the speed might get up to a number of miles per hour before too long.
My Lords, I thank all noble Lords who have taken part in this important debate and, indeed, the Minister for her thoughtful response. We find ourselves in a position of extraordinary good fortune when it comes to these and many other amendments, not least in the area of artificial intelligence. We had a first-class report from the then Sir Patrick Vallance as CSA. It is not often in life that in a short space of time one is afforded the opportunity in government of bringing much of that excellent work into being through statute, regulation, codes and other guidance. I await further steps in this area.
There can barely be, in many ways, a more serious and pressing issue to be addressed. For every day that we delay, harms are caused. Even if the Government were to do this only in service of their much-spoken-of growth agenda, it would have an economic benefit to the United Kingdom. It would be good to meet the Minister between Committee and Report to see if anything further can be done but, from my perspective and that of others, we will certainly be returning to this incredibly important issue. I beg leave to withdraw the amendment.
My Lords, I would like to just make one comment on this group. I entirely agree with everything that has been said and, in particular, with the amendments in the name of the noble Baroness, Lady Kidron, but the one that I want to single out—it is why I am bothering to stand up—is Amendment 197, which says that the Secretary of State “must” implement this measure.
I was heavily scarred back in 2017 by the Executive’s refusal to implement Part 3 of the Digital Economy Act in order to protect our children from pornography. Now, nearly eight years later, they are still not protected. In my opinion, it was never done properly in the then Online Safety Bill either; it still has not been implemented. I think, therefore, that we need to have a “must” there. We have an Executive who are refusing to carry out the will of Parliament as expressed in the legislation it has passed. We have a problem, but we can fix it by putting “must” in the Bill. Then we can hold the Executive to account.
My Lords, the trouble with this House is that some have long memories. The noble Earl, Lord Erroll, reminded us all to look back, with real regret, at the Digital Economy Act and the failure to implement Part 3. I think that that was a misstep by the previous Government.
Like all of us, I warmly welcome the inclusion of data access provisions for researchers studying online safety matters in Clause 123 of the Bill. As we heard from the noble Baroness, Lady Kidron, and the noble Lord, Lord Knight, this was very much unfinished business from the Online Safety Act. However, I believe that, in order for the Bill to be effective and have the desired effect, the Government need to accept the amendments in the names of the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell. In terms of timeframe, the width of research possible, enforceability, contractual elements and location, they cover the bases extremely effectively.
The point was made extremely well by the noble Lords, Lord Bethell and Lord Russell, that we should not have to rely on brave whistleblowers such as Frances Haugen. We should be able to benefit from quality researchers, whether from academia or elsewhere, in order to carry out this important work.
My Amendment 198B is intended as a probing amendment about the definition of researchers under Clause 123, which has to be drawn carefully: it must allow for legitimate non-governmental organisations, academics and so on, but not be so wide that it can be exploited by bad actors. For example, we do not want those who seek to identify potential exploits in a platform to take advantage of this route simply by describing themselves as “independent researchers”. For instance, could Tommy Robinson seek to protect himself from liabilities in this way? After all, he called himself an “independent journalist” in another context when he clearly was not. I hope that, when the Government come to draw up the regulations, they will be mindful of the need to be very clear about what constitutes an independent or accredited researcher, or whatever phrase is used in this context.
My Lords, although I have no amendments in this group, I will comment on some of them. I might jump around the order, so please forgive me for that.
Amendment 197 would change Clause 123 so that the Secretary of State must, as soon as reasonably practicable and no later than 12 months after the Act is passed, make regulations requiring regulated services to provide information for the purposes of research into online safety. This is clearly sensible. It would ensure that valuable research into online safety may commence as soon as possible, which would benefit us all, as speakers have made abundantly clear. To the same end, Amendment 198D, which would make researcher access enforceable in the same way as other requirements under the Online Safety Act, would ensure that researchers can access valuable information and carry out their beneficial research.
I am still left with some curiosity on some of these amendments, so I will indicate where I have specific questions to those who have tabled them and hope they will forgive me if I ask to have a word with them between now and Report, which would be very helpful. In that spirit, I turn to Amendment 198B, which would allow the Secretary of State to define the term “independent researcher”. I ask the noble Lord, Lord Clement-Jones, who tabled the amendment, whether he envisages the Secretary of State taking advice before making such regulations and, if so, from whom and in what mechanism. I recognise that it is a probing amendment, but I would be keen to understand more.
I am also keen to understand further from my noble friend Lord Bethell and the noble Baroness, Lady Kidron, why, under Amendment 198A, the Secretary of State would not be able to make regulations providing for independent research into the “enforcement of requirements” under these regulations. Again, I look forward to discussing that with them.
I have some concerns about Amendment 198, which would require service providers to give information pertaining to age, stage of development, gender, race, ethnicity, disability and sexuality to researchers. I understand the importance of this but my concern is that it would require the disclosure of special category data to those researchers. I express reservations, especially if the data pertains to children. Do we have the right safeguards in place to address the obviously heightened risks here?
Additionally, I have some concerns about the provisions suggested in Amendment 198E. Should we allow researchers from outside the United Kingdom to require access to information from regulated service providers? Could this result in data being transferred into jurisdictions where there are less stringent data protection laws?
My Lords, I support Amendment 203 and, in particular, Amendments 211G and 211H from the noble Baroness, Lady Owen. I have little to add to what I said on Friday. I confess to my noble friend the Minister that, in my speech on Friday, I asked whether this issue would be in scope for this Bill, so maybe I gave the noble Baroness the idea. I pay tribute to her agility in being able to act quickly to get this amendment in and include something on audio, following the speech of the noble Baroness, Lady Gohir.
I hope that the Minister has similar agility in being able to readjust the Government’s position on this. It is right that this was an urgent manifesto commitment from my party at the last election. It fits entirely with my right honourable friend the Home Secretary’s efforts around violence against women and girls. We should accept and grab this opportunity to deliver quickly by working with the noble Baroness, Lady Owen, and others between now and Report to bring forward an amendment to the Bill that the whole House will support enthusiastically.
My Lords, we have had some powerful speeches in this group, not least from the noble Baronesses, Lady Kidron and Lady Owen, who have drafted important amendments that respond to the escalating harms caused by AI-generated sexual abuse material relating to children and adults. The amendment from the noble Baroness, Lady Kidron, would make it an offence to use personal data or digital information to create digital models or files that facilitate the creation of AI or computer-generated child sexual abuse material. As she outlined and the noble Lord, Lord Bethell, confirmed, it would specifically become an offence to create, train or distribute generative AI models that enable the creation of computer-generated CSAM or priority illegal content; to train AI models on CSAM or priority illegal content; or to possess AI models that produce CSAM or priority illegal content.
This amendment responds to a growing problem, as we have heard, around computer-generated sexual abuse material and a gap in the law. There is a total lack of safeguards preventing bad actors creating sexual abuse imagery, and it is causing real harm. Sites enabling this abuse are offering tools to harm, humiliate, harass, coerce and cause reputational damage. Without robust legal frameworks, victims are left vulnerable while perpetrators operate with impunity.
The noble Lord, Lord Bethell, mentioned the Internet Watch Foundation. In its July report, One Step Ahead, it reported on the alarming rise of AI-generated CSAM. In October 2023, in How AI is Being Abused to Create Child Sexual Abuse Imagery, it made recommendations to the Government to strengthen legal frameworks to better address the evolving landscape of AI-generated CSAM and to enhance preventive measures against its creation and distribution. It specifically recommended:
“That the Government legislates to make it an offence to use personal data or digital information to create digital models or files that facilitate the creation of AI or computer-generated child sexual abuse material”.
The noble Baroness, Lady Kidron, tabled such an amendment to the previous Bill. As she said, she was successful in persuading the then Government to accept it; I very much hope that she will be as successful in persuading this Government to accept her amendment.
Amendments 211G and 211H in the name of the noble Baroness, Lady Owen, are a response to the extraordinary fact that one in 14 adults has experienced threats to share intimate images in England and Wales; that rises to one in seven among young women. Research from Internet Matters shows that 49% of young teenagers in the UK aged between 13 and 16—around 750,000 children—said that they were aware of a form of image-based abuse being perpetrated against another young person known to them.
We debated the first of the noble Baroness’s amendments, which is incorporated in her Bill, last Friday. I entirely agree with the noble Lord, Lord Knight; I did not find the Government’s response at all satisfactory. I hope that, in the short passage of time between then and now, they have had time to be at least a little agile, as he requested. UK law clearly does not effectively address non-consensual intimate images. It is currently illegal to share or threaten to share non-consensual intimate images, including deepfakes, but creating them is not yet illegal; this means that someone could create a deepfake image of another person without their consent and not face legal consequences as long as they do not share, or threaten to share, it.
This amendment is extremely welcome. It addresses the gap in the law by criminalising the creation of non-consensual intimate images, including deepfakes. It rightly targets deepfakes due to their rising prevalence and potential for harm, particularly towards women. Research shows that 98% of deepfake videos online are pornographic, with 99% featuring women and girls. This makes it an inherently sexist problem that is a new frontier of violence against women—words that I know the noble Baroness has used.
I also very much welcome the new amendment not contained in her Bill, responding to what the noble Baroness, Lady Gohir, said at its Second Reading last Friday about including audio deepfakes. The words “shut down every avenue”, which I think were used by the noble Baroness, Lady Gohir, are entirely apposite in these circumstances. Despite what the noble Lord, Lord Ponsonby, said on Friday, I hope that the Government will accept both these amendments and redeem their manifesto pledge to ban the creation of sexually explicit deepfakes, whether audio or video.
My Lords, the current law does not sufficiently protect children from AI-driven CSAM because it is simply such a fast-moving issue. It is a sobering thought that, of all the many wonderful developments of AI that many of us have been predicting and speculating on for so long, CSAM is really driving the technology forward. What a depressing reflection that is.
Overall, AI is developing at an extraordinarily rapid pace and has come with a number of concerning consequences that are not all yet fully understood. However, it is understood that child sexual abuse is completely unacceptable in any and all contexts, and it is right that our law should be updated to reflect the dangers that have increased alongside AI development.
Amendment 203 seeks to create a specific offence for using personal data or digital information to create or facilitate the creation of computer-generated child sexual abuse material. Although legislation is in place to address possessing or distributing such horrendous material, we must prioritise the safety of children in this country and take the law a step further to prevent its creation. Our children must be kept safe and, subject to one reservation, which I will come to in a second, I support the amendment from the noble Baroness, Lady Kidron, to further protect them.
That reservation comes in proposed new subsection (1)(c), which includes in the offence the act of collating files that, when combined, enable the creation of sexual abuse material. This is too broad. A great deal of the collation of such material can be conducted by innocent people using innocent materials that are then corrupted or given more poisonous aspects by further training, fine-tuning or combination with other materials by more malign actors. I hope there is a way we can refine this proposed new paragraph on that basis.
Unfortunately, adults can also be the targets of individuals who use AI to digitally generate non-consensual explicit images or audio files of an individual, using their likeness and personal data. I am really pleased that my noble friend Lady Owen tabled Amendments 211G and 211H to create offences for these unacceptable, cruel acts. I support these amendments unambiguously.
My Lords, I very much support these amendments. I declare an interest as an owner of written copyright in the Good Schools Guide and as a father of an illustrator. In both contexts, it is very important that we get intellectual property right, as I think the Government recognised in what they put out yesterday. However, I share the scepticism of those who have spoken as to whether the Government’s ideas can be made to work.
It is really important that we get this straight. For those of us operating at the small end of the scale, IP is under continual threat from established media. I write maybe 10 or a dozen letters a year to large media outfits reminding them of the boundaries, the latest to the Catholic Herald—it appears that not even the 10 commandments have force on them. But what AI can do is a huge measure more difficult to deal with. I can see absolutely, by talking to Copilot, that it has gone through my paywall and absorbed the contents of the Good Schools Guide, but who am I supposed to go after for this? Who has actually done the trespassing? Who is responsible for it? Where is the ownership? It is difficult to enforce copyright even by writing a polite letter to someone saying, “Please don’t do this”. The Government appear to propose a system of polite letters saying, “Oh dear, it looks as if you might have borrowed my copyright. Please can you give it back?”
This is not practically enforceable, and it will not result in people who care about IP locating their businesses here. Quite clearly, we do not have ownership of the big AI systems, and it is unlikely that we will—all of that will be overseas. What we can do is create IP. If we produce a system in which we do not defend the IP that we create, then fairly rapidly those IP creators who are capable of being mobile will go to places that will defend their IP. That is something that a Government interested in growth really ought to be interested in defending. I hope that we will see some real progress in the course of the Bill’s passage through the House.
My Lords, I declare my AI interests as set out in the register. I will speak in support of Amendments 204, 205 and 206, which have been spoken to so inspiringly by the noble Baroness, Lady Kidron, and so well by the noble Lords, Lord Freyberg, Lord Lucas and Lord Hampton, the noble Earl, Lord Clancarty, and the noble Viscount, Lord Colville. Each demonstrated different facets of the issue.
I co-chair the All-Party Group on AI and chaired the AI Select Committee a few years ago. I wrote a book earlier this year on AI regulation, which had a namecheck from the noble Baroness, Lady Jones, at Question Time, which I was very grateful for. Before that, I had a career as an IP lawyer, defending copyright and creativity, and in this House, I have been my party’s creative industries spokesperson. The question of IP and the training of generative AI models is a key issue for me.
This is the case not just in the UK but around the world. Getty and the New York Times are suing in the United States, as are many writers, artists and musicians. It was at the root of the Hollywood actors’ and writers’ strikes last year. It is one thing to use the tech—many of us are AI enthusiasts—but it is another to be at the mercy of it.
Close to home, the FT has pointed out, using the index published by the creator of an unlicensed dataset called Books3, published online, that it is possible to identify over 85 books written by 33 Members of the House of Lords that have been pirated to train AI models from household names such as Meta, Microsoft and Bloomberg. Although it is absolutely clear that the use of copyrighted works to train AI models is contrary to UK copyright law, the laws around the transparency of these activities have not caught up. As we have heard, as well as using pirated e-books in their training data, AI developers scrape the internet for valuable professional journalism and other media, in breach of both the terms of service of websites and copyright law, in order to train commercial AI models. At present, developers can do this without declaring their identity, or they may take IP that was scraped for inclusion in a search index and use it for the completely different commercial purpose of training AI models.
How can rights owners opt out of something that they do not know about? AI developers will often scrape websites or access other pirated material before they launch an LLM in public. This means that there is no way for IP owners to opt out of their material being taken before its inclusion in these models. Once these models have been trained, the commercial value, as we have heard, has already been extracted from IP scraped without permission, and there is no way to delete data from the models.
The next wave of AI models responds to user queries by browsing the web to extract valuable news and information from professional news websites. This is known as retrieval-augmented generation—RAG. Without payment for extracting this commercial value, AI agents built by companies such as Perplexity, Google and Meta will, in effect, free-ride on the professional hard work of journalists, authors and creators. At present, such crawlers are hard to block. There is no market failure; there are well-established licensing solutions. There is no uncertainty around the existing law; the UK is absolutely clear that commercial organisations, including gen AI developers, must license the data that they use to train their large language models.
Here, as the Government’s intentions become clearer, the political, business and creative temperature is rising. Just this week, we have seen the creation of a new campaign, the Creative Rights in AI Coalition (CRAIC), across the creative and news industries; recently, too, a statement organised by Ed Newton-Rex gathered more than 30,000 signatories from among creators and creative organisations.
The noble Lord has enormous experience in these areas and will be particularly aware of the legal difficulties in enforcing rights. Given what he said, with which I entirely agree—indeed, I agree with all the speakers in supporting these amendments—and given the extraordinary expense of litigating to enforce rights, how does he envisage there being an adequate system to allow those who have had their data scraped in the way that he describes to obtain redress or, rather, suitable remedies?
I thank the noble Lord for that. He is anticipating a paragraph in my notes, which says that, although it is not set out in the amendments, robust enforcement of these provisions will be critical to their success. This includes oversight from an expert regulator that is empowered to issue significant penalties, including fines for non-compliance. There is a little extra work to do there, and I would very much like to see the Intellectual Property Office gain some teeth.
I am going to close, as we are nearly at the witching hour, but it is clear that AI developers are seeking to use their lobbying clout—the noble Baroness, Lady Kidron, mentioned the Kool-Aid—to persuade the Government that new copyright law is required. Instead, these amendments would clarify that UK copyright law applies to generative AI developers. The creative industries, and noble Lords from across the House as their supporters, will rally around these amendments and vigorously oppose government plans for a new text and data-mining exception.
My Lords, if I may just interject, I have seen this happen elsewhere, not just in the Horizon scandal. Several years ago, the banks were saying that you could not possibly find out someone’s PIN and were therefore refusing to refund people who had had money stolen from them. It was not until the late Professor Ross Anderson, of the computer science department at Cambridge University, proved that they had been deliberately misidentifying to the courts which counters they should have been looking at and what was being read, and explained exactly how you could get the system to default back to a different set of counters, that the banks eventually had to give way. But they went on lying to the courts for a long time. I am afraid that this is something that keeps happening again and again, and an amendment like this is essential for future justice for innocent people.
My Lords, it is a pity that this debate is taking place so late. I thank the noble Lord, Lord Arbuthnot, for his kind remarks, but my work ethic feels under considerable pressure at this time of night.
All I will say is that this is a much better amendment than the one that the noble Baroness, Lady Kidron, put forward for the Data Protection and Digital Information Bill, and I very strongly support it. Not only is this horrifying in the context of the past Horizon cases, but I read a report about the Capture software, which is likely to have created shortfalls that led to sub-postmasters being prosecuted as well. This is an ongoing issue. The Criminal Cases Review Commission is reviewing five Post Office convictions in which the Capture IT system could be a factor, so we cannot say that this is about just Horizon, as there are the many other cases that the noble Baroness cited.
We need to change this common law presumption even more in the face of a world in which AI use, with all its flaws and hallucinations, is becoming ever present, and we need to do it urgently.
My Lords, I thank the noble Baroness, Lady Kidron, for tabling her amendment. We understand its great intentions, which we believe are to prevent another scandal similar to that of Horizon and to protect innocent people from having to endure what thousands of postmasters have undergone and suffered.
However, while this amendment would make it easier to challenge evidence derived from, or produced by, a computer or computer system, we are concerned that, should it become law, this amendment could be misused by defendants to challenge good evidence. Our fear is that, in determining the reliability of such evidence, we may create a battle of the expert witnesses. This will not only substantially slow down trials but result in higher costs. Litigation is already expensive, and we would aim not to introduce additional costs to an already costly process unless absolutely necessary.
From our perspective, the underlying problem in the Horizon scandal was not that computer systems were critically wrong or that people were wrong, but that the two in combination drove the terrible outcomes that we have unfortunately seen. For many industries, regulations require firms to conduct formal systems validation, with serious repercussions and penalties should companies fail to do so. It seems to us that the disciplines of systems validation, if required for other industries, would be both a powerful protection and considerably less disruptive than potentially far-reaching changes to the law.
My Lords, having a system such as this would really focus the public sector on how we can generate more datasets. As I said earlier, education is an obvious one, but so is mobile phone data. All these companies have their licences. If a condition of the licence was that the data on how people move around the UK became a public asset, that would be hugely beneficial to policy formation. If we really understood how, why and when people move, we would make much better decisions. We could save ourselves huge amounts of money. We really ought to have this as a deep focus of government policy.
My Lords, I have far too little time to do justice to this subject. We on these Benches welcome this amendment. It is entirely consistent with the sovereign health fund proposed by Future Care Capital and, indeed, with the proposals from the Tony Blair Institute for Global Change on a similar concept called the national data trust. Indeed, this concept formed part of our Liberal Democrat manifesto at the last general election, so of course I support the amendment.
It would be very useful to hear more about the national data library, including on its purpose and operation, as the noble Baroness, Lady Kidron, said. I entirely agree with her that there is a great need for a sovereign cloud service or services. Indeed, the inability to guarantee that data on the cloud is held in this country is a real issue that has not yet been properly addressed.
My Lords, I thank the noble Baroness, Lady Kidron, for moving this amendment. As she rightly identified, the UK has a number of publicly held data assets, many of which contain extremely valuable information. This data—I flag, by way of an example, NHS data specifically—could be extremely valuable to certain organisations, such as pharmaceutical companies.
We are drawn to the idea of licensing such data—indeed, we believe that we could charge an extremely good price—but we have a number of concerns. Most notably, what additional safeguards would be required, given its sensitivity? What would be the limits and extent of the licensing agreement? Would this status close off other routes to monetising the data? Would other public sector bodies be able to use the data for free? Can this not already be done without the amendment?
Although His Majesty’s Official Opposition of course recognise the wish to ensure that the UK taxpayer gets a fair return on our information assets held by public bodies and arm’s-length organisations, and we certainly agree that we need to look at licensing, we are not yet sure that this amendment is either necessary or sufficient. We once again thank the noble Baroness, Lady Kidron, for moving it. We look forward to hearing both her and the Minister’s thoughts on the matter.
My Lords, it is a pleasure to introduce this group of amendments. I have a 35-minute speech prepared. In moving Amendment 211B, I shall speak also to Amendments 211C to 211E. The reason for this group of amendments is to try to get an increased focus on the range of issues they touch on.
I turn to Amendment 211B first. It seems at least curious to have a data Bill that does not talk about data centres: their power usage, their environmental impact and the Government’s view of the current PUE (power usage effectiveness) standard. Is that standard one that they think gives the right measure of confidence to consumers and citizens across the country in terms of how data centres are being operated and their impacts?
Similarly, on Amendment 211C, not enough consideration is given to supply chains. I am not suggesting that they are the most exciting subject, but you have to go only one or two steps back in any supply chain to get into real depths of opacity. With this amendment, I am seeking more clarity on data supply chains and on the role of data across all supply chains. Through the combination of data and AI, we could potentially enable a transformation of our supply chains in real time, which would give us far more flexibility in pursuing economic and environmental benefits. I look forward to the Minister’s response.
I now move on to Amendment 211D. It is always a pleasure to bring AI into a Bill that really does not want to have AI in it. I am interested in the whole question of data input and output, not least with large language models. I am also interested in the Government’s view on how this interacts with the Copyright, Designs and Patents Act 1988. There may be some mileage in looking into standards and approaches in this area, which could potentially form conditions of market access. We have some excellent examples to look at in other sectors of our economy and society, as set out in the amendment; I would welcome the Minister’s views on that.
I am happy that this group ends with Amendment 211E on the subject of public trust. In many ways, it is the golden thread that should run through everything when we talk about data; I wanted it to be the golden thread that ran through my AI regulation Bill. I always say that Clause 6 is the most important clause in that Bill because it goes to the question of public engagement and trust. Without that level of public engagement and trust, it does not matter how good the technologies are, how good the frameworks are or how good the chat around the data is. It might be golden but, if the public do not believe in it, they are not going to come and be part of it. The most likely consequence of this is that they will not be able to avail themselves of the benefits but they will almost certainly be saddled with the burdens. What these technologies enable is nothing short of a transformation of that discourse between citizen and state, with the potential to reimagine completely the social contract for the benefit of all.
Public engagement and public trust are the golden thread and the fuel for how we gain those economic, social and psychological benefits from the data. I will be very interested in the Minister’s response on what more could be done by the Government, because previous consultations, not least around some of these technologies, have been somewhat short of what we could achieve. With that #brevity and #our data, I beg to move.
My Lords, I shall be #even shorter. Data centres and their energy consumption are important issues. I agree that at a suitable moment—probably not now—it would be very interesting to hear the Government’s views on that. Reports from UK parliamentary committees and the Government have consistently emphasised the critical importance of maintaining public trust in data use and AI, but sometimes, the actions of the Government seem to go contrary to that. I support the noble Lord, Lord Holmes, in his call for essentially realising the benefits of AI while making sure that we maintain public trust.
My Lords, I thank my noble friend Lord Holmes of Richmond for tabling these amendments. As we all appreciate, taking stock of the effects of legislation is critical, as it allows us to see what has worked and what has not. Amendment 211B would require the Secretary of State to launch a consultation into the implications of the provisions of the Bill for the power usage and energy efficiency of data centres. His Majesty’s Official Opposition have no objection to the amendment’s aims, but we wonder to what extent it is actually possible. By what means or benchmark can we identify whether a spike in energy usage is specifically due to a provision of this legislation, rather than the result of some other factor? I should be most grateful if my noble friend could provide further detail on this matter in his closing speech.
Regarding Amendment 211C, we understand that much could be learned from a review of all data regulations and standards pertaining to the supply chains for financial, trade, and legal documents and products, although we wonder if this needs to happen the moment this Bill passes. Could this review not happen at any stage? By all means, let us do it sooner rather than later, but is it necessary to set a date in statute?
Moving on to Amendment 211D, we should certainly look to regulate the AI large language model sector to ensure that there are standards for the input and output of data for LLMs. However, this must be done in a way that does not stifle growth in this emerging industry.
Finally, we have some concerns about Amendment 211E. A national consultation on the use of individuals’ data is perhaps just too broad.
My Lords, listening to the noble Lord, Lord Lucas, is often an education, and today is no exception. I had no idea what local environmental records centres were, so I shall be very interested to hear what the Minister has to say in response.
My Lords, I thank my noble friend Lord Lucas for tabling Amendment 211F and all noble Lords for their brief contributions to this group.
Amendment 211F would ensure that all the biodiversity data collected by or in connection with government is collected in local environmental records centres, so that records are as good as possible. That data is then used by or in connection with government, so it is put to the best possible use.
The importance of sufficient and high-quality record collection cannot and must not be overstated. With this in mind, His Majesty’s Official Opposition support the sentiment of the amendment in my noble friend’s name. These Benches will always champion matters related to biodiversity and nature recovery. In fact, many of my noble friends have raised concerns about biodiversity in Committee debates in your Lordships’ House on the Crown Estate Bill, the Water (Special Measures) Bill and the Great British Energy Bill. Indeed, they have tabled amendments to ensure that matters related to biodiversity appear at the forefront of draft legislation.
With that in mind, I am grateful to my noble friend Lord Lucas for introducing provisions, via Amendment 211F, which would require any planning application involving biodiversity net gain to include a data search report from the relevant local environmental records centre. I trust that the Minister has listened to the concerns raised collaboratively in the debate on this brief group. We must recognise the importance of good data collection and ensure that such data is used in the best possible way.
(1 week, 4 days ago)
Grand Committee
My Lords, in continuing on this group, I will speak to the question that Clause 78 stand part and to Amendments 107, 109, 125, 154, 155 and 156, but to start I support Amendment 87 in the name of the noble and learned Lord, Lord Thomas of Cwmgiedd. We had a masterclass from him last Tuesday and he made an extremely good case for that amendment, which is very elegant.
The previous Government deleted the EU Charter of Fundamental Rights from the statute book through the Retained EU Law (Revocation and Reform) Act 2023, and this Bill does nothing to restore it. Although references in the UK GDPR to fundamental rights and freedoms are now to be read as references to the ECHR as implemented through the Human Rights Act 1998, the Government’s ECHR memorandum states:
“Where processing is conducted by a private body, that processing will not usually engage convention rights”.
As the noble and learned Lord mentioned, this could leave a significant gap in protection for individuals whose data is processed by private organisations and will mean lower data protection rights in the UK compared with the EU, so these Benches strongly support his Amendment 87, which would apply the convention to private bodies where personal data is concerned. I am afraid we do not support Amendments 91 and 97 from the noble Viscount, Lord Camrose, which seem to hanker after the mercifully defunct DPDI.
We strongly support Amendments 139 and 140 from the noble Baroness, Lady Kidron. Data communities are one of the important omissions from the Bill. Where are the provisions that should be there to support data-sharing communities and initiatives such as Solid? We have been talking about data trusts and data communities since as long ago as the Hall-Pesenti review. Indeed, it is interesting that the Minister herself only this April said in Grand Committee:
“This seems to be an area in which the ICO could take a lead in clarifying rights and set standards”.
Indeed, she put forward an amendment:
“Our Amendment 154 would therefore set a deadline for the ICO to do that work and for those rights to be enacted. The noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, made a good case for broadening these rights in the Bill and, on that basis, I hope the Minister will agree to follow this up, and follow up his letter so that we can make further progress on this issue”.—[Official Report, 17/4/24; col. GC 322.]
I very much hope that, now the tables are turned, so to speak, the Minister will take that forward herself in government.
Amendments 154, 155 and 156 deal with the removal of the principle of the supremacy of EU law. They are designed to undo the lowering of the standard of data protection rights in the UK brought about by the REUL Act 2023. The amendments would apply the protections required in Article 23.2 of the UK GDPR to all the relevant exceptions in Schedules 2 to 4 to the Data Protection Act 2018. This is important because data adequacy will be lost if the standard of protection of personal data in the UK is no longer essentially equivalent to that in the EU.
The EU’s adequacy decision stated that it did not apply in the area of immigration and referred to the case of Open Rights Group v the Secretary of State for the Home Department in the Court of Appeal. This case was brought after the UK left the EU, but before the REULA came into effect. The case is an example of how the preservation of the principle of the supremacy of EU law continued to guarantee high data protection standards in the UK, before this principle was deleted from the statute book by the REULA. In broad terms, the Court of Appeal found that the immigration exception in Schedule 2 to the Data Protection Act 2018 conflicted with the safeguards in Article 23 of the UK GDPR. This was because the immigration exemption was drafted too broadly and failed to incorporate the safeguards prescribed for exemptions under Article 23.2 of the UK GDPR. It was therefore held to be unlawful and was disapplied.
The Home Office redrafted the exemption to make it more protective, but it took several attempts to bring forward legislation which provided sufficient safeguards for data subjects. The extent of the safeguards now set out in the immigration exemption underscores both what is needed for compatibility with Article 23.2 of the UK GDPR and the deficiencies in the rest of the Schedule 2 exemptions. It is clear when reading the judgment in the Open Rights case that the majority of the exemptions from data subject rights under Schedule 2 to the Data Protection Act fail to meet the standards set out in Article 23.2 of the UK GDPR. The deletion of the principle of the supremacy of EU law has removed the possibility of another Open Rights-style challenge to the other exemptions in Schedule 2 to the Data Protection Act 2018. I hope that, ahead of the data adequacy discussions with the Commission, the Government’s lawyers have had a good look at the amendments that I have tabled, drafted by a former MoJ lawyer.
The new clause proposed after Clause 107 by Amendment 154 applies the protections now given to the immigration exemption to the whole of Schedule 2 to the DPA 2018, with the exception of the exemptions that apply in the context of journalism or research, statistics and archiving. Unlike the other exemptions, they already contain detailed safeguards.
Amendment 155 is a new clause extending the protections which apply to the immigration exemption to Schedule 3 to the DPA 2018, and Amendment 156 is another new clause applying those protections to Schedule 4 to the DPA 2018.
As regards Amendment 107, the Government need to clarify how data processing under the recognised legitimate interests is compatible with the conditions for data processing under existing lawful bases, including the special categories of personal data under Articles 5 and 9 of the UK GDPR. The Bill lowers the standard of the protection of personal data where data controllers only have to provide personal data based on
“a reasonable and proportionate search”.
The lack of clarity on what “reasonable” and “proportionate” mean in the context of data subject requests creates legal uncertainty for data controllers and organisations, specifically regarding whether the data subject’s views on the matter need to be taken into account when responding to requests. This is a probing amendment which requires the Secretary of State to explain why the existing lawful bases for data processing are inadequate for the processing of personal data when additional recognised legitimate interests are introduced. It requires the Secretary of State to publish guidance within six months of the Act’s passing to clarify what constitutes reasonable and proportionate protections of personal data.
Amendment 109 would insert a new clause, to ensure that data controllers assess the risk of collective and societal harms,
“including to equality and the environment”,
when carrying out data protection impact assessments. It requires them to consult affected people and communities while carrying out these assessments to improve their quality, and requires data controllers to publish their assessments to facilitate informed decision-making by data subjects and to enable data controllers to be held accountable.
Turning to whether Clause 78 should stand part: on top of Clause 77, Clause 78 would reduce the scope of transparency obligations and rights. Many AI systems are designed in a way that makes it difficult to retrieve personal data once it has been ingested, or to understand how that data is being used. This is not principally due to technical limitations but to the decisions of AI developers who do not prioritise transparency and explainability.
As regards Amendment 125, it is clear that there are still further major changes proposed to the GDPR on police duties, automated decision-making and recognised legitimate interests, which make the retention of data adequacy for the purposes of digital trade with the EU of the utmost priority in considering those changes. During the passage of the Data Protection and Digital Information Bill, I tabled an amendment to require the Government to publish an assessment of the impact of the Bill on EU/UK data adequacy within six months of the Act passing; I have tabled a similar amendment, with one change, to this Bill. As the next reassessment of data adequacy is set for June 2025, a six-month timescale may come too late to affect the overall adequacy decision. We must therefore stipulate that this assessment takes place before that reassessment.
My Lords, I thank all noble Lords for their consideration of these clauses. First, I will address Amendment 87 tabled by the noble and learned Lord, Lord Thomas, and the noble and learned Lord—sorry, the noble Lord—Lord Clement-Jones.
We should take them while we can. Like the noble Lord, Lord Clement-Jones, I agree that the noble and learned Lord, Lord Thomas, made an excellent contribution. I appreciate this is a particularly technical area of legislation, but I hope I can reassure both noble Lords that the UK’s data protection law gives effect to convention rights and is designed to protect them. The Human Rights Act requires legislation to be interpreted compatibly with convention rights, whether processing is carried out by public or private bodies. ECHR rights are therefore a pervasive aspect of the rules that apply to public and private controllers alike. The noble and learned Lord is right that individuals generally cannot bring claims against private bodies for breaches of convention rights, but I reassure him that they can bring a claim for breaching the data protection laws giving effect to those rights.
I turn to Amendment 91, tabled by the noble Viscount, Lord Camrose, Amendment 107, tabled by the noble Lord, Lord Clement-Jones, and the question of whether Clause 78 should stand part, which all relate to data subject requests. The Government believe that transparency and the right of access is crucial. That is why they will not support a change to the language around the threshold for data subject requests, as this will undermine data subjects’ rights. Neither will the Bill change the current expectations placed on controllers. The Bill reflects the EU principle of proportionality, which has always underpinned this legislation, as well as existing domestic case law and current ICO guidance. I hope that reassures noble Lords.
Amendments 97 and 99, tabled by the noble Viscount, Lord Camrose, and the noble Lord, Lord Markham, relate to the notification exemption in Article 14 of the UK GDPR. I reassure noble Lords that the proportionality test provides an important safeguard for the existing exemption when data is collected from sources other than the data subject. The controller must always consider the impact on data subjects’ rights of not notifying. They cannot rely on the disproportionate effort exemption just because of how much data they are processing—even when there are many data subjects involved, such as there would be with web scraping. Moreover, a lawful basis is required to reuse personal data: a web scraper would still need to pass the balancing test to use the legitimate interest ground, as is usually the case.
The ICO’s recent outcomes report, published on 12 December, specifically referenced the process of web scraping. The report outlined:
“Web scraping for generative AI training is a high-risk, invisible processing activity. Where insufficient transparency measures contribute to people being unable to exercise their rights, generative AI developers are likely to struggle to pass the balancing test”.
The Minister said there is a power to amend, but she has not said whether she thinks that would be desirable. Is the power to be used only if we are found not to be data-adequate because the immigration exemption does not apply across the board? That is, will the power be used only if we are forced to use it?
I reassure the noble Lord that, as he knows, we are very hopeful that we will have data adequacy so that issue will not arise. I will write to him to set out in more detail when those powers would be used.
My Lords, I have co-signed Amendment 137. I do not need to repeat the arguments that have already been made by those who have spoken before me on it; they were well made, as usual. Again, it seems to expose a gap in where the Government are coming from in this area of activity, which should be at the forefront of all that they do but does not appear to be so.
As has just been said, this may be as simple as putting in an initial clause right up at the front of the Bill. Of course, that reminds me of the battle royal we had with the then Online Safety Bill in trying to get up front anything that made more sense of the Bill. It was another beast that was difficult to ingest, let alone understand, when we came to make amendments and bring forward discussions about it.
My frustration is that we are again talking about stuff that should have been well inside the thinking of those responsible for drafting the Bill. I do not understand why a lot of what has been said today has not already appeared in the planning for the Bill, and I do not think we will get very far by sending amendments back and forward that say the same thing again and again: we will only get the response that this is all dealt with and we should not be so trivial about it. Could we please have a meeting where we get around the table and try to hammer out exactly what it is that we see as deficient in the Bill, set out very clearly for Ministers where we have red lines—that will make it very easy for them to understand whether they are going to meet them or not—and do it quickly?
My Lords, the debate on this group emphasises how far behind the curve we are, whether it is by including new provisions in this Bill or by bringing forward an AI Bill—which, after all, was promised in the Government’s manifesto. It emphasises that we are not moving nearly fast enough in thinking about the implications of AI. While we are doing so, I need to declare an interest as co-chair of the All-Party Parliamentary Group on AI and a consultant to DLA Piper on AI policy and regulation.
I have followed the progress of AI since 2016 in my capacity as co-chair of the all-party group and chair of the AI Select Committee. We need to move much faster on a whole range of different issues. I very much hope that the noble Lord, Lord Vallance, will be here on Wednesday, when we discuss our crawler amendments, because although the noble Lord, Lord Holmes, has tabled Amendment 211A, which deals with personality rights, there is also extreme concern about the whole area of copyright. I was tipped off by the noble Lord, Lord Stevenson, so I was slightly surprised that he did not draw our attention to it: we are clearly due the consultation on intellectual property at any moment, but there seems to be some proposal within it for personality rights themselves. Whether that is a quid pro quo for a much-weakened situation on text and data mining, I do not know, but something appears to be moving out there which may become clear later this week. It seems a strange time to issue a consultation, but I recognise that it has been somewhat delayed.
In the meantime, we are forced to put forward amendments to this Bill trying to anticipate some of the issues that artificial intelligence is increasingly giving rise to. I strongly support Amendments 92, 93, 101 and 105 put forward by the noble Viscount, Lord Colville, to prevent misuse of Clause 77 by generative AI developers; I very much support the noble Lord, Lord Holmes, in wanting to see protection for image, likeness and personality; and I very much hope that we will get a positive response from the Minister in that respect.
We have heard from the noble Baronesses, Lady Kidron and Lady Harding, and the noble Lords, Lord Russell and Lord Stevenson, all of whom have made powerful speeches on previous Bills—the then Online Safety Bill and the Data Protection and Digital Information Bill—to say that children should have special protection in data protection law. As the noble Baroness, Lady Kidron, says, we need to move on from the AADC. That was a triumph she gained during the passage of the Data Protection Act 2018, but six years later the world looks very different and young people need protection from AI models of the kind she has set out in Amendment 137. I agree with the noble Lord, Lord Stevenson, that we need to talk these things through. If it produces an amendment to this Bill that is agreed, all well and good, but it could mean an amendment or part of a new AI Bill when that comes forward. Either way, we need to think constructively in this area because protection of children in the face of generative AI models, in particular, is extremely important.
This group, which looks forward to further harms that could be caused by AI and to how we can mitigate them in a number of different ways, is extremely important, despite the fact that its amendments appear to deal with quite a disparate set of issues.
My Lords, I too thank all noble Lords for their insightful contributions to this important group of amendments, even if some of them bemoaned the fact that they have had to repeat themselves over the course of several Bills. I am also very heartened to see how many people have joined us for Committee today. I have been involved in only two of these sittings, but this is certainly a record, and on present trends it is going to be standing room only, which is all to the good.
I have two observations before I start. First, we have to acknowledge that perhaps this area is among the most important we are going to discuss. The rights and protections of data subjects, particularly children, are in many ways the crux of all this and we have to get it right. Secondly, I absolutely take on board that there is a real appetite to get ahead on AI legislation. I have an amendment that I am very excited about later, when we come to ADM in particular, and there will be others as well, but I absolutely take on board that we need to get going on that.
Amendment 92 in the names of the noble Viscount, Lord Colville, and the noble Lord, Lord Clement-Jones, seeks to reduce the likelihood of the misuse of Clause 77 by AI model developers who may claim that they do not need to notify data subjects of reuse for scientific purposes under that clause. This relates to the way that personal data is typically collected and processed for AI development. Amendment 93 similarly seeks to reduce the possibility of misuse of Clause 77 by model developers who could claim they do not need to notify data subjects of reuse for scientific purposes. Amendments 101 and 105 likewise seek to address the potential misuse of Clause 77 by the developers. I strongly support the intent of the amendments from the noble Viscount, Lord Colville, and the noble Lord, Lord Clement-Jones, in seeking to maintain and make provision for the rights and protections of data subjects, and I look forward very much to hearing the views of the Minister.
I turn to Amendment 137 in the names of the noble Lords, Lord Russell and Lord Stevenson, and the noble Baronesses, Lady Kidron and Lady Harding. This amendment would require the commissioner to prepare and produce a code of practice which ensures that data processors prioritise the interests, rights and freedoms of children. It goes without saying that the rights and protection of children are of utmost importance. Certainly, this amendment looks to me not only practical but proportionate, and I support it.
Finally, Amendment 211A in the name of my noble friend Lord Holmes ensures the prohibition of
“the development, deployment, marketing and sale of data related to an individual’s image, likeness or personality for AI training”
without that person’s consent. Like the other amendments in this group, this makes provision to strengthen the rights and protections of data subjects against the potential misuse or sale of data and seems entirely sensible. I am sure the Minister has listened carefully to all the concerns powerfully raised from all sides of the Committee today. It is so important that we do not lose sight of the importance of the rights and protection of data subjects.
My Lords, I welcome the amendments spoken to so well by the noble Baroness, Lady Harding, regarding the open electoral register. They are intended to provide legal certainty around the use of the register, without compromising on any aspect of the data privacy of UK citizens or risking data adequacy. The amendments specify that companies are exempt from the requirement to provide individuals with information in cases where their personal data has not been obtained directly from them if that data was obtained from the open electoral register. They also provide further clarification on what constitutes “disproportionate effort” under new paragraph 5(e) of Article 14 of GDPR.
The noble Baroness covered the ground so effectively that all I need to add is that the precedent established by the current interpretation by the tribunal will affect not only the open electoral register but other public sources of data, including the register of companies, the Registry of Judgments, Orders and Fines, the Land Registry and the Food Standards Agency register. Importantly, it may even prevent the important work being done to create a national data library from achieving its objectives of public sector data sharing. It will have far-reaching implications if we do not change the Bill in the way that the noble Baroness has put forward.
I thank the noble Lord, Lord Lucas, for his support for Amendment 160. I reciprocate in supporting—or, at least, hoping that we get clarification as a result of—his Amendments 158 and 161.
Amendment 159B seeks to ban what are colloquially known as cookie paywalls. As can be seen, it is the diametric opposite to Amendment 159A, tabled by the noble Viscount, Lord Camrose. For some unaccountable reason, cookie paywalls require a person who accesses a website or app to pay a fee to refuse consent to cookies being accessed from or stored on their device. Some of these sums can be extortionate and exorbitant, so I was rather surprised by the noble Viscount’s counter amendment.
Earlier this year, the Information Commissioner launched a call for views on the regulatory approach to consent-or-pay models under data protection law. It highlighted that organisations that are looking to adopt, or have already adopted, a consent-or-pay model must consider the data protection implications.
Cookie paywalls are a scam and reduce people’s power to control their data. I wonder why someone must pay if they do not consent to cookies being stored or accessed. The PEC regulations do not currently prohibit cookie paywalls. The relevant regulation is Regulation 6, which is due to be substituted by Clause 111 and supplemented by the new Schedule A1 to the PEC regulations, as inserted by Schedule 12 to the Bill. The regulation, as substituted, still does not prohibit cookie paywalls. This comes down to the detail of the regulations, both as they currently are and as they will be if the Bill remains as drafted: they are drafted in terms that, while not preventing a person from signifying lack of consent to cookies, allow a provider to impose requirements—such as the payment of a fee—on how a person may signify that lack of consent. Cookie paywalls would therefore be completely legal, and they have certainly proliferated online.
This amendment makes it crystal clear that a provider must not require a person to pay a fee to signify lack of consent to their data being stored or accessed. This would mean that, in effect, cookie paywalls would be banned.
Amendment 160 is sought by the Advertising Association. It seeks to ensure that the technical storage of or access to information is considered necessary under paragraph 5 of the new Schedule A1 to the PEC regulations inserted by Schedule 12 if it would support measurement or verification of the performance of advertising services to allow website owners to charge for their advertising services more accurately. The Bill provides practical amendments to the PEC regulations through listing the types of cookies that no longer require consent.
This is important, as not all cookies should be treated the same and not all carry the same high-level risks to personal privacy. Some are integral to the service and the website itself and are extremely important for subscription-free content offered by publishers, which is principally funded by advertising. Introducing specific and targeted cookie exemptions has the benefit of, first, simplifying the cookie consent banner and, secondly, increasing legal and economic certainty for online publishers. As I said when we debated the DPDI Bill, audience measurement is an important function for media owners to determine the consumption of content and to be able to price advertising space for advertisers. Such metrics are crucial to assessing the effectiveness of a media channel. For sites that carry advertising, cookies are used to verify the delivery and performance of a digital advertisement—ie, confirmation that an ad has been served or presented to a user and whether it has been clicked on. This is essential information for invoicing an advertiser accurately for the number of ad impressions in a digital ad campaign.
However, my reading of the Bill suggests that audience measurement cookies would be covered by the list of exemptions from consent under Schedule 12. Can the Government confirm this? Is it the Government’s intention to use secondary legislation in future to exempt ad performance cookies?
Coming to Amendment 162 relating to the soft opt-in, I am grateful to the noble Lord, Lord Black of Brentwood, and the noble Baroness, Lady Harding of Winscombe, for their support. This amendment would enable charities to communicate to donors in the same way that businesses have been able to communicate to customers since 2003. The clause will help to facilitate greater fundraising and support the important work that charities do for society. I can do no better than quote from the letter that was sent to Secretary of State Peter Kyle on 25 November, which was co-ordinated by the DMA and involved nearly 20 major charities, seeking support for reinstating the original Clause 115 of the DPDI Bill into this Bill:
“Clause 115 of the previous DPDI Bill extended the ‘soft opt-in’ for email marketing for charities and non-commercial organisations. The DMA estimates that extending the soft opt-in to charities would increase annual donations in the UK by £290 million”,
based on analysis of 13.1 million donors by the Salocin Group. The letter continues:
“At present, the DUA Bill proposals remove this. The omission of the soft opt-in will prevent charities from being able to communicate to donors in the same way as businesses can. As representatives of both corporate entities and charitable organisations, it is unclear to the DMA why charities should be at a disadvantage in this regard”.
I hope that the Government will listen to the DMA and the charities involved.
I thank noble Lords for their comments and contributions. I shall jump to Amendments 159A and 159B, one of which is in my name and both of which are concerned with cookie paywalls. I am not sure I can have properly understood the objection to cookie paywalls. Do they not simply offer users three choices: pay money and stay private; share personal data and read for free; or walk away? So many times, we have all complained about the fact that these websites harvest our data; now, for the first time, this approach sets a clear cash value on the data that they are harvesting and offers us the choice. The other day somebody sent me a link from the Sun. I had those choices. I did not want to pay the money or share my data, so I did not read the article. I feel this is a personal decision, supported by clear data, which it is up to the individual to take, not the Government. I do not think we should take away this choice.
Let me turn to some of the other amendments in this group. Amendment 161 in the name of my noble friend Lord Lucas is, if I may say so, a thoughtful amendment. It would allow pension providers to communicate information on their product. This may mean that the person who will benefit from that pension does not miss out on useful information that would benefit their saving for retirement. Given that pension providers already hold the saver’s personal data, it seems to be merely a question of whether this information is wanted; of course, if it is not, the saver can simply opt out.
Amendment 162 makes an important point: many charities rely on donations from the public. Perhaps we should consider bringing down the barriers to contacting people regarding fundraising activities. At the very least, I am personally not convinced that members of the public have different expectations around what kinds of organisation can and cannot contact them and in what circumstances, so I support any step that simplifies the—to my mind—rather arbitrary differences in the treatment of business and charity communications.
Amendment 104 certainly seems a reasonable addition to the list of what might constitute “unreasonable effort” if the information is already public. However, I have some concerns about Amendments 98 and 100 to 103. For Amendment 98, who would judge the impact on the individual? I suspect that the individual and the data controllers may have different opinions on this. In Amendment 100, the effort and cost of compliance are thorny issues that would surely be dictated by the nature of the data itself and the reason for providing it to data subjects. In short, I am concerned that the controllers’ view may be more subjective than we would want.
On Amendment 102, again, when it comes to providing information to them,
“the damage and distress to the data subjects”
is a phrase on which the subject and the controller will almost inevitably have differing opinions. How will these be balanced? Additionally, one might presume that information that is either damaging or distressing to the data subjects should not necessarily be withheld from them as it is likely to be extremely important.
When does the Minister anticipate that the ICO will produce that report?
I do not have the detail of all that. Obviously, the call for views has only recently gone out and he will need time for consideration of the responses. I hope the noble Lord will accept that the ICO is on the case on this matter. If we can provide more information, we will.
May I ask the Minister a hypothetical question? If the ICO believes that these are not desirable, what instruments are there for changing the law? Can the ICO, under its own steam, so to speak, ban them; do we need to do it in primary legislation; or can it be done in secondary legislation? If the Minister cannot answer now, perhaps she can write to me.
Of course I will write to the noble Lord. It will be within the ICO’s normal powers to make changes where he finds that they are necessary.
I move to Amendment 160, tabled by the noble Lord, Lord Lucas, which seeks to create a new exemption for advertising performance cookies. There is a balance to strike between driving growth in the advertising, news and publishing sectors while ensuring that people retain choice and control over how their data is used. To exempt advertising measurement cookies, we would need to assess how intrusive these cookies are, including what they track and where data is sent. We have taken a delegated power so that exemptions to the prohibition can be added in future once evidence supports it, and we can devise appropriate safeguards to minimise privacy risks. In the meantime, we have been actively engaging with the advertising and publishing sectors on this issue and will continue to work with them to consider the potential use of the regulation-making power. I hope that the noble Lord will accept that this is work in progress.
Amendment 161, also from the noble Lord, Lord Lucas, aims to extend the soft opt-in rule under the privacy and electronic communications regulations to providers of auto-enrolment pension schemes. The soft opt-in rule removes the need for some commercial organisations to seek consent for direct marketing messages where there is an existing relationship between the organisation and the customer, provided the recipient did not object to receiving direct marketing messages when their contact details were collected.
The Government recognise that people auto-enrolled by their employers in workplace pension schemes may not have an existing relationship with their pension provider, so I understand the noble Lord’s motivations for this amendment. However, pension providers have opportunities to ask people to express their direct marketing preferences, such as when the customer logs on to their account online. We are taking steps to improve the support available for pension holders through the joint Government and FCA advice guidance boundary review. The FCA will be seeking feedback on any interactions of proposals with direct marketing rules through that consultation process. Again, I hope the noble Lord will accept that this issue is under active consideration.
Amendment 162, tabled by the noble Lord, Lord Clement-Jones, would create an equivalent provision to the soft opt-in but for charities. It would enable a person to send electronic marketing without permission to people who have previously expressed an interest in their charitable objectives. The noble Lord will recall, and has done so, that the DPDI Bill included a provision similar to his amendment. The Government removed it from that Bill due to the concerns that it would increase direct marketing from political parties. I think we all accepted at the time that we did not want that to happen.
As the noble Lord said, his amendment is narrower because it focuses on communications for charitable purposes, but it could still increase the number of messages received by people who have previously expressed an interest in the work of charities. We are listening carefully to arguments for change in this area and will consider the points he raises, but I ask that he withdraws his amendment while we consider its potential impact further. We are happy to have further discussions on that.
My Lords, in moving Amendment 108, I will also speak to all the other amendments in this group. They are all designed to transfer all existing provisions from the courts to the tribunals and simplify the enforcement of data rights. Is that not something to be desired? This is not just a procedural change but a necessary reform to ensure that the rights granted on paper translate into enforceable rights in reality.
The motivation for these amendments stems from recurring issues highlighted in cases such as Killock and Veale v the Information Commissioner, and Delo v the Information Commissioner. These cases revealed a troubling scenario where the commissioner presented contradictory positions across different levels of the judiciary, exacerbating the confusion and undermining the credibility of the regulatory framework governing data protection. In these cases, the courts have consistently pointed out the confusing division of jurisdiction between different courts and tribunals, which not only complicates the legal process but wastes considerable public resources. As it stands, individuals often face the daunting task of determining the correct legal venue for their claims, a challenge that has proved insurmountable for many, leading to denied justice and unenforced rights.
By transferring all data protection provisions from the courts to more specialised tribunals, which are better equipped to handle such cases, and by clarifying the right to appeal decisions made by the commissioner, these amendments seek to eliminate unnecessary legal barriers. Many individuals, often representing themselves and lacking legal expertise, face the daunting challenge of navigating complex legal landscapes, deterred by high legal costs and the intricate determination of appropriate venues for their claims. This shift will not only reduce the financial burden on individuals but enhance the efficiency and effectiveness of the judicial process concerning data protection. By simplifying the legal landscape, we can safeguard individual rights more effectively and foster a more trustworthy digital environment.
I thank the noble Lord, Lord Clement-Jones, for his Amendments 108, 146 to 153 and 157, and I am grateful for the comments by the noble Lord, Lord Holmes, and the noble Viscount, Lord Camrose.
The effect of this group of amendments would be to make the First-tier Tribunal and the Upper Tribunal responsible for all data protection cases. They would transfer ongoing as well as future cases out of the court system to the relevant tribunals and, as has been alluded to, may cause more confusion in doing so.
As the noble Lord is aware, there is currently a blend of jurisdiction under the data protection legislation for both tribunals and courts according to the nature of the proceedings in question. This is because certain types of cases are appropriate to fall under tribunal jurisdiction while others are more appropriate for court settings. For example, claims by individuals against organisations for breaches of legal requirements can result in awards of compensation for the individuals and financial and reputational damage for the organisations. It is appropriate that such cases are handled by a court in conformance with its strict procedural and evidential rules. Indeed, in the Killock and Delo examples, it was noted that going solely to one of the tribunals could create additional confusion, given the ability to move between those two routes.
On the transfer of responsibility for making tribunal procedural rules from the Tribunal Procedure Committee to the Lord Chancellor, we think that would be inappropriate. The committee is comprised of legal experts appointed or nominated by senior members of the judiciary or the Lord Chancellor. This committee is best placed to make rules to ensure that tribunals are accessible and fair and that cases are dealt with quickly and efficiently. It keeps the rules under constant review to ensure that they are fit for purpose in line with new appeal rights and the most recent legislative changes.
Amendment 151 would also introduce a statutory appeals procedure for tribunals to determine the merits of decisions made by the Information Commissioner. Data subjects and controllers alike can already challenge the merits of the Information Commissioner’s decisions by way of judicial review in a way that would preserve the discretion and independence of the Information Commissioner’s decision-making, so no statutory procedure is needed. The Government therefore believe that the current jurisdictional framework is well-balanced and equitable, and that it provides effective and practical routes of redress for data subjects and controllers as well as appropriate safeguards to ensure compliance by organisations. For these reasons, I hope the noble Lord will not press his amendments.
My Lords, I thank the Minister for his response to my amendments and welcome him to the Dispatch Box and a whole world of pain on the Data (Use and Access) Bill, as he has, no doubt, noted already after just two hours’ worth of this Committee.
I found his response disappointing, and I think both he and the noble Viscount, Lord Camrose, have misunderstood the nature of this situation. This is not a blend, which is all beautifully logical depending on the nature of the case. This is an absolute mishmash where the ordinary litigant is faced with great confusion, not knowing quite often whether to go to the court or a tribunal, where the judges themselves have criticised the confusion and where there appears to be no appetite, for some reason, in government for a review of the jurisdictions.
I felt that the noble Viscount was probably reading from his previous ministerial brief. Perhaps he looked back at Hansard for what he said on the DPDI Bill; it certainly sounded like that. The idea that the courts are peerless in their legal interpretation and that the poor old tribunals really do not know what they are doing is wrong. They are expert tribunals, you can appear before them in person and there are no fees. It is far easier to access a tribunal than a court. As far as appeals are concerned, the idea that the ordinary punter is going to take judicial review proceedings—which seems to be the implication of staying with the current system on appeals if the merits of the ICO’s decisions are to be examined—is quite breathtaking. I know from legal practice that JR is not cheap. Appearing before a tribunal and using that as an appeal mechanism would seem far preferable.
I will keep on pressing this because it seems to me that at the very least the Government need to examine the situation to have a look at what the real objections are to the jurisdictional confusion and the impact on data subjects who wish to challenge decisions. In the meantime, I beg leave to withdraw the amendment.
My Lords, I beg to move Amendment 110 and will speak to Amendments 112, 114, 120, 121, 122 and 123, and to whether Clause 80 should stand part. As we have heard, artificial intelligence and algorithmic and automated decision-making tools are increasingly being used across the public sector to make and support many of the highest-impact decisions affecting individuals, families and communities across healthcare, welfare, education, policing, immigration and many other sensitive areas of an individual’s life.
The Committee will be pleased to hear that I will not repeat the contents of my speech on my Private Member’s Bill on this subject last Friday. But the fact remains that the rapid adoption of AI in the public sector presents significant risks and challenges, including: the potential for unfairness, discrimination and misuse, as demonstrated by scandals such as the UK’s Horizon and Australia’s Robodebt cases; automated decisions that are prone to serious error; lack of transparency and accountability in automated decision-making processes; privacy and data protection concerns; algorithmic bias; and the need for human oversight.
My Lords, we have had a really profound and significant debate on these issues; it has been really helpful that they have been aired by a number of noble Lords in a compelling and articulate way. I thank everybody for their contributions.
I have to say at the outset that the Government want data protection rules fit for the age of emerging technologies. The noble Lord, Lord Holmes, asked whether we are addressing issues of the past or issues of the future. We believe that the balance we have in this Bill is exactly about addressing the issues of the future. Our reforms will reduce barriers to the responsible use of automation while clarifying that organisations must provide stringent safeguards for individuals.
I stress again how seriously we take these issues. A number of examples have been quoted as the debate has gone on. I say to those noble Lords that examples were given where there was no human involved. That is precisely what the new provisions in this Bill attempt to address, in order to make sure that there is meaningful human involvement and people’s futures are not being decided by an automated machine.
Amendment 110 tabled by the noble Lords, Lord Clement-Jones and Lord Knight, seeks to clarify that, for human involvement to be meaningful, it must be carried out by a competent person. Our reforms make clear that solely automated decisions lack meaningful human involvement. That goes beyond a tick-box exercise. The ICO guidance also clarifies that
“the human involvement has to be active and not just a token gesture”;
that right is absolutely underpinned by the wording of the regulations here.
I turn next to Amendment 111. I can assure—
My Lords, I was listening very carefully. Does “underpinned by the regulations” mean that it will be underpinned?
Yes. The provisions in this Bill cover exactly that concern.
The issue of meaningful human involvement is absolutely crucial. Is the Minister saying that regulations issued by the Secretary of State will define “meaningful human involvement”, or is she saying that it is already in the primary legislation, which is not my impression?
Sorry—it is probably my choice of language. I am saying that it is already in the Bill; it is not intended to be separate. I was talking about whether solely automated decisions lack meaningful human involvement. This provision is already set out in the Bill; that is the whole purpose of it.
On Amendment 111, I assure the noble Viscount, Lord Camrose, that controllers using solely automated processing are required to comply with the data protection principles. I know that he was anticipating this answer, but we believe that it captures the principles he proposes and achieves the same intended effect as his amendment. I agree with the noble Viscount that data protection is not the only lens through which AI should be regulated, and that we cannot address all AI risks through the data protection legislation, but the data protection principles are the right ones for solely automated decision-making, given its place in the data protection framework. I hope that that answers his concerns.
On Amendment 112, which seeks to prohibit solely automated decisions that contravene the Equality Act 2010, I assure the noble Lords, Lord Clement-Jones and Lord Knight, that the data protection framework is clear that controllers must adhere to the Equality Act.
Amendments 113 and 114 would extend solely automated decision-making safeguards to predominantly automated decision-making. I assure the noble and learned Lord, Lord Thomas, the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, that the safeguards in Clause 80 are designed to protect individuals where meaningful human involvement is lacking. Predominantly automated decision-making will already include meaningful human involvement and therefore does not require these additional safeguards.
On Amendments 114A and 115A, tabled by the noble Viscount, Lord Camrose, many noble Lords have spoken in our debates about the importance of future-proofing the legislation. These powers are an example of that: without them, the Government will not have the ability to act quickly to update protections for individuals in the light of rapid technology developments.
I assure noble Lords that the regulation powers are subject to a number of safeguards. The Secretary of State must consult the Information Commissioner and have regard to other relevant factors, which can include the impact on individuals’ rights and freedoms as well as the specific needs and rights of children. As with all regulations, the exercise of these powers must be rational; they cannot be used irrationally or arbitrarily. Furthermore, the regulations will be subject to the affirmative procedure and so must be approved by both Houses of Parliament.
I assure the noble Lord, Lord Clement-Jones, that one of the powers means that his Amendment 123 is not necessary, as it can be used to describe specifically what is or is not meaningful human involvement.
Amendment 115A, tabled by the noble Viscount, Lord Camrose, would remove the reforms to Parts 3 and 4 of the Data Protection Act, thereby putting them out of alignment with the UK GDPR. That would cause confusion and ambiguity for data subjects.
I am sorry to interrupt again as we go along but, a sentence or so ago, the Minister said that the definition in Amendment 123 of meaningful human involvement in automated decision-making was unnecessary. The amendment is designed to change matters. It would not be the Secretary of State who determined the meaning of meaningful human involvement; in essence, it would be initiated by the Information Commissioner, in consultation with the Secretary of State. So I do not quite understand why the Minister used “unnecessary”. It may be an alternative that is undesirable, but I do not understand why she has come to the conclusion that it is unnecessary. I thought it was easier to challenge the points as we go along rather than at the very end.
My Lords, we would say that a definition in the Bill is not necessary because it is dealt with case by case and is supplemented by these powers. The Secretary of State does not define meaningful human involvement; it is best done case by case, supported by the ICO guidance. I hope that that addresses the noble Lord’s point.
That is slightly splitting hairs. The noble Viscount, Lord Camrose, might want to comment because he wanted to delete the wording that says:
“The Secretary of State may by regulations provide that … there is, or is not, to be taken to be meaningful human involvement”.
He certainly will determine—or is able to determine, at least—whether or not there is human involvement. Surely, as part of that, there will need to be consideration of what human involvement is.
Will the Minister reflect on the issues around a case-by-case basis? If I were running an organisation of any sort and decided I wanted to use ADM, how would I make a judgment about what is meaningful human involvement on a case-by-case basis? It implies that I would have to hope that my judgment was okay because I have not had clarity from anywhere else and in retrospect, someone might come after me if I got that judgment wrong. I am not sure that works, so will she reflect on that at some point?
My Lords, I thank the Minister for her very detailed and careful response to all the amendments. Clearly, from the number of speakers in this debate, this is one of the most important areas of the Bill and one that has given one of the greatest degrees of concern, both inside and outside the Committee. I think the general feeling is that there is still concern. The Minister is quite clear that the Government are taking these issues seriously, in terms of ADM itself and the impact in the workplace, but there are missing parts here. If you add all the amendments together—no doubt we will read Hansard and, in a sense, tick off the areas where we have been given an assurance about the interpretation of the Bill—there are still great gaps.
It was very interesting to hear what the noble Lord, Lord Kamall, had to say about how the computer said “no” as he reached the gate. A lot of this is about communications. I would be very interested if any letter to the noble Lord, Lord Lucas, was copied more broadly, because that is clearly one of the key issues. It was reassuring to hear that the ICO will be on top of this in terms of definitions, guidance, audit and so on, and that we are imminently to get the publication of the records of algorithmic systems in use under the terms of the algorithmic transparency recording standard.
We have had some extremely well-made points from the noble Viscounts, Lord Colville and Lord Camrose, the noble Lords, Lord Lucas, Lord Knight and Lord Holmes, and the noble Baroness, Lady Kidron. I am not going to unpack all of them, but we clearly need to take this further and chew it over before we get to Report. I very much hope that the Minister will regard a “will write” letter on stilts as required before we go very much further, because I do not think we will be fully satisfied by this debate.
The one area where I would disagree is on treating solely automated decision-making as the pure subject of the Clause 80 rights. Looking at it in the converse, it is perfectly proper to regard something that does not have meaningful human involvement as predominantly automated decision-making. I do not think, in the words of the noble Viscount, Lord Camrose, that this does muddy the waters. We need to be clearer about what we regard as being automated decision-making for the purpose of this clause.
There is still quite a lot of work to do in chewing over the Minister’s words. In the meantime, I beg leave to withdraw my amendment.
My Lords, a key aspect of data protection rests in how it restricts the use of personal data once it has been collected. The public need confidence that their data will be used for the reasons for which they shared it and not further used in ways that breach their legitimate expectations, or they will become wary of providing their data at all. The underlying theme that we heard on the previous group was the danger of losing public trust, which very much applies in the area of law enforcement and national security.
However, Schedules 4 and 5 would remove the requirement to consider the legitimate expectations of the individuals whose data is being processed, or the impact that this would have on their rights, for the purposes of national security, crime detection and prevention, safeguarding or answering to a request by a public authority. Data used for the purposes listed in these schedules would not need to undergo either a balancing test under Article 6.1(f) or a compatibility test under Article 6.4 of the UK GDPR. The combined effect of these provisions would be to authorise almost unconditional data sharing for law enforcement and other public security purposes while, at the same time, reducing accountability and traceability over how the police use the information being shared with them.
As with the previous DPDI Bill, Clauses 87 to 89 of this Bill grant the Home Secretary and police powers to view and use people’s personal data through the use of national security certificates and designation notices, which are substantially the same as Clauses 28 to 30 of that Bill. This risks further eroding trust in law enforcement authorities. Accountability for access to data for law enforcement purposes should not be lowered, and data sharing should be underpinned by a robust test to ensure that individuals’ rights and expectations are not disproportionately impacted. It is baffling why the Government are so slavishly following their predecessor in believing that these new and unaccountable powers are necessary.
By opposing that Clause 81 stand part, I seek to retain the requirement for police forces to record the reason they are accessing data from a police database. The public need more, not less, transparency and accountability over how, why and when police staff and officers access and use records about them. Just recently, the Met Police admitted that they investigated more than 100 staff over the inappropriate accessing of information in relation to Sarah Everard. This shows that the police can and do act to access information inappropriately, and there may well be less prominent cases where police abuse their power by accessing information without worry for the consequences.
Regarding Amendments 126, 128 and 129, Rights and Security International has repeatedly argued that the Bill would violate the UK’s obligations under the European Convention on Human Rights. On Amendment 126, the requirements in the EU law enforcement directive for logging are, principally, to capture in all cases the justification for personal data being examined, copied, amended or disclosed when it is processed for a law enforcement purpose—the objective is clearly to ensure that data is processed only for a legitimate purpose—and, secondarily, to identify when, how and by whom the data has been accessed or disclosed. This ensures that individual accountability is captured and recorded.
Law enforcement systems in use in the UK typically capture some of the latter information in logs, but very rarely do they capture the former. Nor, I am informed, do many commodity IT solutions on the market capture by default why data was accessed or amended. For this reason, a long period of time was allowed under the law enforcement directive to modify legacy systems installed before May 2016, which, in the UK, included services such as the police national computer and the police national database, along with many others at a force level. This transitional relief extended to 6 May 2023, but UK law enforcement did not, in general, make the required changes. Nor, it seems, did it ensure that all IT systems procured after 6 May 2016 included a strict requirement for LED-aligned logging. By adopting and using commodity and hyperscaler cloud services, it has exacerbated this problem.
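To make the two categories of logging concrete, here is a minimal sketch (illustrative only: the field and function names are my own invention, not anything drawn from the directive’s text or any actual police system) of a log record that captures both the accountability fields and the justification that Clause 81 would remove:

```python
# Illustrative sketch only: field names are hypothetical, reflecting my
# description above rather than the directive or any real police system.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AccessLogEntry:
    record_id: str       # which record was examined, copied, amended or disclosed
    operation: str       # how: e.g. "examined", "copied", "amended", "disclosed"
    officer_id: str      # by whom: the individual accountability the LED requires
    justification: str   # why: the field Clause 81 would stop requiring
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)  # when
    )

def log_access(record_id: str, operation: str, officer_id: str,
               justification: str) -> AccessLogEntry:
    """Refuse to record an access without a stated justification."""
    if not justification.strip():
        raise ValueError("a justification is required for law enforcement access")
    return AccessLogEntry(record_id, operation, officer_id, justification)
```

The design point is simply that a system which refuses to write a record without a justification creates both the evidence trail and the deterrent effect discussed above.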
In early April 2023, the Data Protection Act 2018 (Transitional Provision) Regulations 2023 were laid before Parliament. These regulations had the effect of unilaterally extending the transitional relief period under the law enforcement directive for the UK from May 2023 to May 2026. The Government now wish to strike out entirely the requirement to capture the justification for any access to data, on the basis that this would free up some 1.5 million hours a year of valuable police time so that officers can focus on tackling crime on our streets rather than being bogged down by administration, and that it would save approximately £42.8 million a year in taxpayers’ money.
This is a serious legislative issue on two counts: it removes important evidence that may show whether a person was acting with malicious intent when accessing data, and with it any deterrent effect of having to record a justification; and it directly deviates from a core part of the law enforcement directive and will clearly have an impact on UK data adequacy. The application of effective control over access to data is very much a live issue in policing, and changing the logging requirement in this way does nothing to improve police data management. Rather, it excuses and perpetuates bad practice. Nor does it increase public confidence.
Clause 87(7) introduces new Section 78A into the Act. This lays down a number of exemptions and exclusions from Part 3 of that Act when the processing is deemed to be in the interests of national security. These exemptions are wide-ranging and include the ability to suspend or ignore principles 2 to 6 in Part 3, and thus run directly contrary to the provisions and expectations of the EU law enforcement directive. Ignoring those principles also negates many of the controls and clauses across Part 3 as a whole. As a result, these provisions will almost certainly lead to the immediate loss of EU law-enforcement adequacy.
I welcome the ministerial letter from the noble Lord, Lord Hanson of Flint, to the noble Lord, Lord Anderson, of 6 November, but was he really saying that all the national security exemption clause does is bring the 2018 Act into conformity with the GDPR? I very much hope that the Minister will set out for the record whether that is really the case and whether it is really necessary to safeguard national security. Although it is, of course, appropriate and necessary for the UK to protect its national security interests, it is imperative that a balance remains to protect the rights of data subjects. These proposals do not, as far as we can see, strike that balance.
Clause 88 introduces the ability of law enforcement, competent authorities and intelligence agencies to act as joint controllers in some circumstances. If Clause 88 and associated clauses go forward to become law, they will almost certainly result, once again, in the withdrawal of UK law-enforcement adequacy and will quite likely have an impact on the TCA itself.
Amendment 127 is designed to draw attention to the fact that there are systemic issues with UK law enforcement’s new use of hyperscaler cloud service providers to process personal data. These issues stem from the fact that service providers’ standard contracts and terms of service fail to meet the requirements of Part 3 of the UK’s Data Protection Act 2018 and the EU law enforcement directive. UK law enforcement agencies are subject to stringent data protection laws, including Part 3 of the DPA and the GDPR. These laws dictate how personal data, including that of victims, witnesses, suspects and offenders, can be processed. Part 3 specifically addresses data transfers to third countries, with a presumption against such transfers unless strictly necessary. This contrasts with the UK GDPR, which allows routine overseas data transfer with appropriate safeguards.
Cloud service providers routinely process data outside the UK and lack the necessary contractual guarantees and legal undertakings required by Part 3 of the DPA. As a result, their use for law enforcement data processing is, on the face of it, not lawful. This non-compliance creates significant financial exposure for the UK, including potential compensation claims from data subjects for distress or loss. The sheer volume of data processed by law enforcement, particularly body-worn video footage, exacerbates the financial risk. If only a small percentage of cases result in claims, the compensation burden could reach hundreds of millions of pounds annually. The Government’s attempts to change the law highlight the issue and suggest that past processing on cloud service providers has not been in conformity with the UK GDPR and the DPA.
The current effect of Section 73(4)(b) of the Data Protection Act is to prevent competent authorities, which may have a legitimate operating need and should possess the internal capability to assess that need, from making transfers to recipients that are not relevant authorities or international organisations, such as cloud service providers. This amendment is designed to probe what impact removing this restriction would have and whether it would enable such transfers where they are justified and necessary. I beg to move.
My Lords, I will speak to Amendment 124. I am sorry that I was not able to speak on this issue at Second Reading. I am grateful to the noble and learned Lord, Lord Thomas of Cwmgiedd, for his support, and I am sorry that he has not been able to stay, due to a prior engagement.
Eagle-eyed Ministers and the Opposition Front Bench will recognise that this was originally tabled as an amendment to the Data Protection and Digital Information (No. 2) Bill. It is still supported by the Police Federation. I am grateful to the former Member of Parliament for Loughborough for originally raising this with me, and I thank the Police Federation for its assistance in briefing us in preparing this draft clause. The Police Federation understands that the Home Secretary is supportive of the objective of this amendment, so I shall listen with great interest to what the Minister has to say.
This is a discrete amendment designed to address an extremely burdensome and potentially unnecessary redaction exercise that arises when the police are preparing a case file for submission to the Crown Prosecution Service for a charging decision. Given that this issue was debated during the previous Bill, I do not intend to go into huge amounts of detail, because we rehearsed the arguments there, but I hope very much that, with the new Government, there might be a willingness to entertain this change in the law.
My Lords, none of us can be under any illusion about the growing threats of cyberattacks, whether from state actors, state-affiliated actors or criminal gangs. It is pretty unusual nowadays to find someone who has not received a phishing email, had hackers target an account or been promised untold riches by a prince from a faraway country. But, while technology has empowered these criminals, it is also the most powerful tool we have against them. To that end, we must do all we can to assist the police, the NCA, the CPS, the SIS and their overseas counterparts in countries much like our own. That said, we must also balance this assistance with the right of individuals to privacy.
Regarding the Clause 81 stand part notice from the noble Lord, Lord Clement-Jones, I respectfully disagree with this suggestion. If someone within the police were to access police records in an unauthorised capacity or for malign reasons, I simply doubt that they would be foolish enough to enter their true intentions into an access log. They would lie, of course, rendering the log pointless, so I struggle to see—we had this debate on the DPDI Bill—how this logging system would help the police to identify unauthorised access to sensitive data. It would simply eat up hours of valuable police time. I remember from our time working on the DPDI Bill that the police supported this view.
As for Amendment 124, which allows for greater collaboration between the police and the CPS when making charging decisions, there is certainly something to be said for this principle. If being able to share more detailed information would help the police and the CPS come to the best decision for victims, society and justice, then I absolutely support it.
Amendments 126, 128 and 129 seek to keep the UK in close alignment with the EU regarding data sharing. EU alignment or non-alignment is surely a decision for the Government of the day alone. We should not look to bind a future Administration to the EU.
I understand that Amendment 127 looks to allow data transfers to competent authorities—that is, law enforcement bodies in other countries—that may have a legitimate operating need. Is this not already the case? Are there existing provisions in the Bill to facilitate such transfers and, if so, does this not therefore duplicate them? I would very much welcome the thoughts of both the Minister and the noble Lord, Lord Clement-Jones, when he sums up at the end.
Amendment 156A would add to the definition of “unauthorised access” so that it includes instances where a person accesses data in the reasonable knowledge that the controller would not consent if they knew about the access or the reason for the access, and the person is not empowered to access it by an enactment. Given the amount of valuable personal data held by controllers as our lives continue to move online, there is real merit to this idea from my noble friend Lord Holmes, and I look forward to hearing the views of the Minister.
Finally, I feel that Amendment 210 from my noble friend Lady Owen—ably supported in her unfortunate absence by the noble Baroness, Lady Kidron—is an excellent amendment, as it prevents a person convicted of a sexual offence from retaining the images that breached the law. This will prevent them from continuing to use the images for their own ends and from sharing them further. It would help the victims of these crimes regain control of these images, which, I hope, would be of great value to those affected. I hope that the Minister will give this serious consideration, particularly in light of noble Lords’ very positive response to my noble friend’s Private Member’s Bill at the end of last week.
I think the noble Viscount, Lord Camrose, referred to Amendment 156A from the noble Lord, Lord Holmes—I think he will find that is in a future group. I saw the Minister looking askance because I doubt whether she has a note on it at this stage.
I thank the noble Lord, Lord Clement-Jones; let me consider it a marker for future discussion.
I thank the noble Lord, Lord Clement-Jones, for coming to my rescue there.
I turn to the Clause 81 stand part notice tabled by the noble Lord, Lord Clement-Jones, which would remove Clause 81 from the Bill. Section 62 of the Data Protection Act requires law enforcement agencies to record their processing activities, including their reasons for accessing and disclosing personal information. Entering a justification manually was intended to help detect unauthorised access. The noble Lord was right that the police do sometimes abuse their power; however, I agree with the noble Viscount, Lord Camrose, that the reality is that anyone accessing the system unlawfully is highly unlikely to record that, making this an ineffective safeguard.
Meanwhile, the position of the National Police Chiefs’ Council is that this change will not impede any investigation concerning the unlawful processing of personal data. Clause 81 does not remove the strong safeguards that ensure accountability for data use by law enforcement, including the requirement to record the time and date of access and, where possible, who has accessed the data, which are far more effective in monitoring potential data misuse. We would argue that the requirement to manually record a justification every time case information is accessed places a considerable burden on policing. I think the noble Lord himself said that we estimate that this clause may save approximately 1.5 million policing hours, equivalent to a saving in the region of £42.8 million a year.
Yes, we could not see the noble Lord’s raised eyebrows.
Turning to Amendment 124, I thank the noble Baroness, Lady Morgan, for raising this important issue. While I obviously understand and welcome the intent, I do not think that legislative change is what is required here. The Information Commissioner’s Office agrees that the Data Protection Act is not a barrier to the sharing of personal data between the police and the CPS. What is needed is a change in the operational processes between the police and the CPS that are causing the redaction burden the noble Baroness spelled out so coherently.
We are very much aware that this is an issue and, as I think the noble Baroness knows, the Government are committed to reducing the burden on the police and the Home Office and to exploring with partners across the criminal justice system how this can best be achieved. We absolutely understand the point that the noble Baroness has raised, but I hope that she will agree to give space to the Home Office and the CPS to try to find a resolution, so that the burden of redaction does not fall where it is not necessary. This is an ongoing discussion—as I know the noble Baroness is aware—and I hope that she will not pursue the amendment on that basis.
I will address Amendments 126 to 129 together. These amendments seek to remove parts of Schedule 8 to avoid divergence from EU legislation. The noble Lord, Lord Clement-Jones, proposes instead to remove existing parts of Section 73 of the Data Protection Act 2018. New Section 73(4)(aa), introduced by this Bill, with its bespoke path for personal data transfers from UK controllers to international processors, is crucial. In the modern age, where the use of such capabilities and the benefits they provide are increasing, we need to ensure that law enforcement can make effective use of them to tackle crime and keep citizens safe.
My Lords, I thank the Minister for her response on this group, which was, again, very detailed. There is a lot to consider in what she had to say, particularly about the clauses beyond Clause 81. I am rather surprised that the current Government are still going down the same track on Clause 81. It is as if, because the risk of abuse is so high, this Government, like the previous one, have decided that it is not worth having the safeguard of recording the justification in the first place. Yet we have heard about the police officers investigated in relation to Sarah Everard. It seems to me perverse not to require justification. I will read further what the Minister had to say, but it seems quite extraordinary to be taking away a safeguard at this time, especially when the Minister says that, at the same time, the police will still need to produce logs of the time at which data is shared and so on. I cannot see what is to be gained—I certainly cannot see £42 million being saved. It is a very precise figure: £42.8 million. I wonder where the £800,000 comes from. It seems almost too precise to be credible.
I emphasise that we believe the safeguards are there. This is not a watering down of provisions. We are just making sure that the safeguards are more appropriate for the sort of abuse that we think might happen in future from police misusing their records. I do not want it left on the record that we do not think that is important.
No. As I was saying, it seems that the Minister is saying that there will still be the necessity to log the fact that data has been shared. However, it seems extraordinary that, at the same time, it is not possible to say what the justification is. The justification could be all kinds of things, but it makes somebody think before they simply share the data. It seems to me that, given the clear evidence of abuse of data by police officers—data of the deceased, for heaven’s sake—we need to keep all the safeguards we currently have. That is a clear bone of contention.
I will read what else the Minister had to say about the other clauses in the group, which are rather more sensitive from the point of view of national security, data sharing abroad and so on.
My Lords, in moving Amendment 134—it is the lead amendment in this group—I shall speak to the others in my name and my Clause 92 stand part notice. Many of the amendments in this group stem from concerns that the new structure for the ICO will diminish its independence. The ICO is abolished in favour of the commission.
My Lords, I thank noble Lords for their consideration of the issues before us in this group. I begin with Amendment 134 from the noble Lord, Lord Clement-Jones. I can confirm that the primary duty of the commissioner will be to uphold the principal objective: securing an appropriate level of data protection, carrying out the crucial balancing test between the interests of data subjects, controllers and wider public interests, and promoting public trust and confidence in the use of personal data.
The other duties sit below this objective and do not compete with it—they do not come at the expense of upholding data protection standards. The commissioner will have to consider these duties in his work but will have discretion as to their application. Moreover, the new objectives inserted by the amendment concerning monitoring, enforcement and complaints are already covered by legislation.
I thank the noble Lord, Lord Lucas, for Amendment 135A. A similar provision featured in the DPDI Bill, but the Government have decided that a statement of strategic priorities for the ICO is not necessary in this Bill. The Government will of course continue to set out their priorities in relation to data protection and other related areas and discuss them with the Information Commissioner as appropriate.
Amendment 142 from the noble Viscount, Lord Camrose, would remove the ICO’s ability to serve notices by email. We would argue that email is a fast, accessible and inexpensive method for issuing notices. I can reassure noble Lords that the ICO can serve a notice via email only if it is sent to an email address published by the recipient or where the ICO has reasonable grounds to believe that the notice will come to the attention of the person, significantly reducing the risk that emails may be missed or sent to the wrong address.
Regarding the noble Viscount’s Amendment 143, the assumption that an email notice will be received within 48 hours is reasonable and consistent with the equivalent legislation for other regulators, such as the CMA and Ofcom.
I thank the noble Lord, Lord Clement-Jones, for Amendment 144 concerning the ICO’s use of reprimands. The regulator does not commonly issue multiple reprimands to the same organisation, but it is important that the ICO, as an independent regulator, has the discretion and flexibility to issue multiple reprimands within a particular period where there is a legitimate need to do so, without arbitrary limits being placed on that.
Turning to Amendment 144A, the new requirements in Clause 101 will already lead to the publication of an annual report, which will include the regulator’s investigation and enforcement activity. Reporting will be categorised to ensure that where the detail of cases is not public, commercially sensitive investigations are not inadvertently shared. Splitting out reporting by country or locality would make it more difficult to protect sensitive data.
Turning to Amendment 145, with thanks to the noble Baroness, Lady Kidron, I agree with the importance of ensuring that the regulator can be held to account on this issue effectively. The new annual report in Clause 101 will cover all the ICO’s regulatory activity, including that taken to uphold the rights of children. Clause 90 also requires the ICO to publish a strategy and report on how it has complied with its new statutory duties. Both of these will cover the new duty relating to children’s awareness and rights, and this should include the ICO’s activity to support and uphold its important age-appropriate design code.
I thank the noble Lord, Lord Clement-Jones, for Amendments 163 to 192 to Schedule 14, which establishes the governance structure of the information commission. The approach at the core of the provisions these amendments address, including the responsibilities conferred on the Secretary of State, follows standard corporate governance best practice and reflects the Government’s commitment to safeguarding the independence of the regulator. This includes requiring the Secretary of State to consult the chair of the information commission before making appointments of non-executive members.
Amendments 165 and 167A would require members of the commission to be appointed to oversee specific tasks and to be from prescribed fields of expertise. Due to the commission’s broad regulatory remit, the Government consider that it would not be appropriate or helpful for the legislation to set out specific areas that should receive prominence over others. The Government are confident that the Bill will ensure that the commission has the right expertise on its board. Our approach safeguards the integrity and independence of the regulator, draws clearly on established precedent and provides appropriate oversight of its activities.
Finally, Clauses 91 and 92 were designed to ensure that the ICO’s statutory codes are consistent in their development, are informed by relevant expertise and take account of their impact on those likely to be affected by them. They also ensure that codes required by the Secretary of State have the same legal effect as pre-existing codes published under the Data Protection Act.
Considering the explanations I have offered, I hope that the noble Lords, Lord Clement-Jones and Lord Lucas, the noble Viscount, Lord Camrose, and the noble Baroness, Lady Kidron, will agree not to press their amendments.
My Lords, I thank the Minister for that response. If I speak for four minutes, that will just about fill the gap, but I hope to speak for less than that.
The Minister’s response was very helpful, particularly the way in which she clarified the objectives. Of course, this is shared with other regulators, where this new growth duty needs to be set in the context of the regulator’s key priorities. My earlier amendment reflected a nervousness about adding innovation and growth duties to a regulator, which may be seen to unbalance its key objectives in the first place, but I will read carefully what the Minister said. I welcome the fact that, unlike in the DPDI Bill, there is no requirement for a statement of strategic priorities. That is why I did not support Amendment 135A.
It is somewhat ironic that, in discussing a digital Bill, the noble Viscount, Lord Camrose, decided to go completely analogue, but that is life. Maybe that is what happens to you after four and a half hours of the Committee.
I do not think the Minister covered the ground on the reprimands front. I will read carefully what she said about the annual report and the need for the ICO—or the commission, as it will be—to report on its actions. I hope, just by putting down these kinds of amendments on reprimands, that the ICO will take notice. I have been in correspondence with the ICO myself, as have a number of organisations. There is some dissatisfaction, particularly with companies such as Clearview, where it is felt that the ICO has not taken adequate action on scraping and building databases from the internet. We will see whether the ICO becomes more proactive in that respect. I was reassured, however, by what the Minister said about NED qualifications and the general objective on the independence of the regulator.
There is much to chew on in what the Minister said. In the meantime, I beg leave to withdraw my amendment.
(2 weeks ago)
Lords ChamberMy Lords, I declare my AI interests as set out in the register. I thank Big Brother Watch, the Public Law Project and the Ada Lovelace Institute, which, each in their own way, have provided the evidence and underpinned my resolve to ensure that we regulate the adoption of algorithmic and AI tools in the public sector, where they are increasingly being used to make and support many of the highest-impact decisions affecting individuals, families and communities in healthcare, welfare, education, policing, immigration and many other sensitive areas of life. I also thank the Public Bill Office, the Library and other members of staff for all their assistance in bringing this Bill forward and communicating its intent and contents, and I thank all noble Lords who have taken the trouble to come to take part in this debate this afternoon.
The speed and volume of decision-making that new technologies will deliver are unprecedented. They have the potential to offer significant benefits, including improved efficiency and cost effectiveness in government operations, enhanced service delivery and resource allocation, better prediction and support for vulnerable people and increased transparency in public engagement. However, the rapid adoption of AI in the public sector also presents significant risks and challenges: the potential for unfairness, discrimination and misuse through algorithmic bias; the need for human oversight; a lack of transparency and accountability in automated decision-making processes; and privacy and data protection concerns.
Incidents such as the 2020 A-level and GCSE grading fiasco, where an algorithm used to estimate grades for exams cancelled because of Covid-19 saw students, particularly those from lower-income areas, unfairly miss out on university places, have starkly illustrated how unchecked algorithmic systems in public administration can disproportionately affect those from lower-income backgrounds. That episode led to widespread public outcry and a loss of trust in government use of technology.
Big Brother Watch’s investigations have revealed that councils across the UK are conducting mass profiling and citizen scoring of welfare and social care recipients. Its report, Poverty Panopticon: The Hidden Algorithms Shaping Britain’s Welfare State, uncovered alarming statistics. Some 540,000 benefits applicants are secretly assigned fraud risk scores by councils’ algorithms before accessing housing benefit or council tax support. Personal data from 1.6 million people living in social housing is processed by commercial algorithms to predict rent non-payers. Over 250,000 people’s data is processed by secretive automated tools to predict the likelihood of abuse, homelessness or unemployment.
Big Brother Watch criticises the nature of these algorithms, stating that most are secretive, unevidenced, incredibly invasive and likely discriminatory. It argues that these tools are being used without residents’ knowledge, effectively creating tools of automated suspicion. The organisation rightly expressed deep concern that these risk-scoring algorithms could be disadvantaging and discriminating against Britain’s poor. It warns of potential violations of privacy and equality rights, drawing parallels to controversial systems like the Metropolitan Police’s gangs matrix database, which was found to be operating unlawfully. From a series of freedom of information requests last June, Big Brother Watch found that a flawed DWP algorithm wrongly flagged 200,000 housing benefit claimants for possible fraud and error, which meant that thousands of UK households every month had their housing benefit claims unnecessarily investigated.
In August 2020, the Home Office agreed to stop using an algorithm to help sort visa applications after it was discovered that the algorithm contained entrenched racism and bias, following a challenge from the Joint Council for the Welfare of Immigrants and the digital rights group Foxglove. The algorithm essentially created a three-tier system for immigration, with a speedy boarding lane for white people from the countries most favoured by the system. Privacy International has raised concerns about the Home Office’s current use of a tool called Identify and Prioritise Immigration Cases—IPIC—which uses personal data, including biometric and criminal records, to prioritise deportation cases, arguing that it lacks transparency and may encourage officials to accept recommended decisions without proper scrutiny.
Automated decision-making has been proven to lead to harms in privacy and equality contexts, such as in the Harm Assessment Risk Tool, which was used by Durham Police until 2021, and which predicted reoffending risks partly based on an individual’s postcode in order to inform charging decisions. All these cases illustrate how ADM can perpetuate discrimination. The Horizon saga illustrates how difficult it is to secure proper redress once the computer says no.
There is no doubt that our new Government are enthusiastic about the adoption of AI in the public sector. Both the DSIT Secretary of State and Feryal Clark, the AI Minister, are on the record about the adoption of AI in public services. They have ambitious plans to use AI and other technologies to transform public service delivery. Peter Kyle has said:
“We’re putting AI at the heart of the government’s agenda to boost growth and improve our public services”,
and
“bringing together digital, data and technology experts from across Government under one roof, my Department will drive forward the transformation of the state”.—[Official Report, Commons, 2/9/24; col. 89.]
Feryal Clark has emphasised the Administration’s desire to “completely transform digital Government” with DSIT. As the Government continue to adopt AI technologies, it is crucial to balance the potential benefits with the need for responsible and ethical implementation to ensure fairness, transparency and public trust.
The Ada Lovelace Institute warns of the unintended consequences of AI in the public sector, including the risk of entrenching existing practices instead of fostering innovation and systemic solutions. As it says, the safeguards around automated decision-making, which exist only in data protection law, are therefore more critical than ever in ensuring that people understand when a significant decision about them is being automated and why that decision is made, and that they have routes to challenge it or to ask for it to be decided by a human.
Our citizens need greater, not less, protection but, rather than accepting the need for these protections, we see the Government following in the footsteps of their predecessor by watering down such rights as there are under GDPR Article 22 not to be subject to automated decision-making. We will, of course, be discussing these aspects of the Data (Use and Access) Bill in Committee next week.
ADM safeguards are critical to public trust in AI, but progress has been glacial. Take the Algorithmic Transparency Recording Standard, which was created in 2022 and is intended to offer a consistent framework for public bodies to publish details of the algorithms used in making these decisions. Six records were published at launch, and only three more seem to have been published since then. The previous Government announced earlier this year that implementation of the Algorithmic Transparency Recording Standard would be mandatory for departments. Minister Clark in the new Government has said,
“multiple records are expected to be published soon”,
but when will this be consistent across government departments? What teeth do the Central Digital and Data Office and the Responsible Technology Adoption Unit, now both within DSIT, have to ensure the adoption of the standard, especially in view of the planned watering down of the Article 22 GDPR safeguards? Where is the promised repository for ATRS records? And what about the other public services, in local government too?
The Public Law Project, which maintains a register called Tracking Automated Government, believes that in October last year there were more than 55 examples of public-sector ADM systems in use. Where is the transparency on those? The fact is that the Government’s Algorithmic Transparency Recording Standard, while a step in the right direction, remains voluntary and lacks comprehensive adoption and, indeed, any compliance mechanism or opportunity for redress. The current regulatory landscape is clearly inadequate to address these challenges. Despite the existing guidance and framework, there is no legally enforceable obligation on public authorities to be transparent about their use of ADM and algorithmic systems, or to rigorously assess their impact.
To address these challenges, several measures are needed. We need to see the creation of, and adherence to, ethical guidelines and accountability mechanisms for AI implementation; a clear regulatory framework and standards for use in the public sector; increased transparency and explainability in the adoption and use of AI systems; and investment in AI education and workforce development for public sector employees. We also need a right of redress, with a strengthened right for individuals to challenge automated decisions.
My Bill aims to establish a clear mandatory framework for the responsible use of algorithmic and automated decision-making systems in the public sector. It will help to prevent the embedding of bias and discrimination in administrative decision-making, protect individual rights and foster public trust in government use of new technologies.
I will not adumbrate all the elements of the Bill. In an era when AI and algorithmic systems are becoming increasingly central to government ambitions for greater productivity and public service delivery, this Bill, I hope noble Lords agree, is crucial to ensuring that the benefits of these technologies are realised while safeguarding democratic values and individual rights. By ensuring that ADM systems are used responsibly and ethically, the Bill facilitates their role in improving public service delivery, making government operations more efficient and responsive.
The Bill is not merely a response to past failures but a proactive measure to guide the future use of technology within government and empower our citizens in the face of these powerful new technologies. I hope that the House and the Government will agree that this is the way forward. I beg to move.
My Lords, I thank the Minister for her response and all noble Lords who have taken part in this debate, which I thought was perfectly formed and very expert. I was interested in the fact that the noble Baroness, Lady Lane-Fox, has a role in the digital centre for government and in what she had to say about what might be desirable going forward, particularly in the areas of skills and procurement. The noble Baroness, Lady Freeman, said much the same, which indicates something to me.
By the way, I think the Minister has given new meaning to the word “reservations”. That was the most tactful speech I have heard for a long time. It is dangerously overconfident if the Government really think that the ATRS, combined with the watered-down ADM provisions in the GDPR, is going to be enough. They will reap the whirlwind if they are not careful, with public trust being eroded. We have seen what happened in the NHS: unless you are absolutely on the case, you will see 3.3 million people opt out of sharing their data, as they have there. This is a live issue; it erupts without due warning.
The examples I gave show a pretty dangerous use of ADM systems. Big Brother Watch has gone into some detail on the particular models that I illustrated. If the Government think that the ATRS is adequate, alongside their watered-down GDPR provisions, then, as I said, they are heading for considerable problems.
As the noble Lord, Lord Knight, can see, if the Government have reservations about my limited Bill, they will have even more reservations about anything broader.
I do not want to tread on the toes of the noble Lord, Lord Holmes, who I am sure will come back with another Bill at some stage, but I am very sympathetic to the need for algorithmic impact assessment, particularly in the workplace, as advocated by the Institute for the Future of Work. We may be inflicting more amendments on the Minister when the time comes in the ADM Bill.
This Bill is, as the noble Baroness, Lady Lane-Fox, mentioned, based on the Canadian experience. It is based on a Canadian directive that is now well under way and is perfectly practical.
The warning of the noble Lord, Lord Tarassenko, about the use of large language models, with their unpredictability and inability to reproduce the same result, was an object lesson in the need for proper understanding and training within the Civil Service in the future, and for the development of open-source LLMs, built on the back of the existing large language models out there, to make sure that they are properly trained and tested as a sovereign capability.
It is clear that I am not going to get a great deal further. I am worried that we are going to see a continuation, in the phrase used by my noble friend Lady Hamwee, of the culture of deference: the machine is going to continue saying no and our citizens will continue to be unable to challenge decisions in an effective way. That will lead to further trouble.
I thank the noble Viscount, Lord Camrose, for his in-principle support. If the Bill is to have a Committee stage, I look forward to debating some of the definitions. In the meantime, I commend the Bill to the House.
(2 weeks, 3 days ago)
Grand CommitteeMy Lords, I support the amendments from the noble Viscount, Lord Colville, which I have signed, and will put forward my Amendments 64, 68, 69, 130 and 132 and my Clause 85 stand part debate.
This part of the GDPR is a core component of how data protection law functions. It makes sure that organisations use personal data only for the reason that it was collected. One of the exceptions is for scientific research. Focus on the definitions and uses of data in research increased in the wake of the Covid-19 pandemic, when some came to the view that legal uncertainty and related risk aversion were a barrier to clinical research.
There is a legitimate government desire to ensure that valuable research does not have to be discarded because of a lack of clarity around reuse or very narrow distinctions between the original and new purpose. The Government’s position seems to be that the Bill will only clarify the law, incorporating recitals to the original GDPR in the legislation. While this may be the policy intention, the Bill must be read in the context of recent developments in artificial intelligence and the practice of AI developers.
The Government need to provide reassurance that the intention and impact of the research provisions are not to enable the reuse of personal data, as the noble Viscount said, scraped from the internet or collected by tech companies under legitimate interest for training AI. Large tech companies could abuse the provisions to legitimise mass scraping of personal data from the internet, or the reuse of data collected under legitimate interest—for example, by a social media platform about its users. Such data could be legally reused for training AI systems under the new provisions if developers can claim that this constitutes scientific research. That is why we very much support what the noble Viscount said.
In our view, the definition of scientific research adopted in the Bill is too broad and will permit abuse by commercial interests outside the policy intention. The Bill must recognise the reality that companies will likely position any AI development as “reasonably described as scientific”. Combined with the inclusion of commercial activities in the Bill, that opens the door to data reuse for any data-driven product development under the guise of scientific research, even where the relationship to real scientific progress is unclear or tenuous. That is not excluded in these provisions.
I turn to Amendments 64, 68, 69, 130 and 132 and the Clause 85 stand part debate. The definition of scientific research in proposed new paragraph 2 under Clause 67(1)(b) is drawn so broadly that most commercial development of digital products and services, particularly those involving machine learning, could ostensibly be claimed by controllers to be “reasonably described as scientific”. Amendment 64, taken together with those tabled by the noble Viscount that I have signed, would radically reduce the scope for misuse of data reuse provisions by ensuring that controllers cannot mix their commercial purposes with scientific research and that such research must be in the public interest and conducted in line with established academic practice for genuine scientific research, such as ethics approval.
Since the Data Protection Act was introduced in 2018, based on the 2016 GDPR, the education sector has seen an enormous expansion of state and commercial data collection, partly normalised in the pandemic, of increasing volume, sensitivity, intrusiveness and risk. Children need particular care in view of the special environment of educational settings, where pupils and families are disempowered and have no choice over the products procured, which they are obliged to use for school administrative purposes, for learning in the classroom, for homework and for digital behavioural monitoring.
Broadening the definition of research activities conducted within the state education sector raises questions about the appropriateness of applying the same rules where children are in a compulsory environment, without agency or any routine practice of research ethics oversight, particularly if the definition is expanded to cover commercial activity.
Parental and family personal data is often inextricably linked to the data of a child in education, such as home address, heritable health conditions or young carer status. The Responsible Technology Adoption Unit within DSIT commissioned research in the Department for Education to understand how parents and pupils feel about the use of AI tools in education and found that, while parents and pupils did not expect to make specific decisions about AI optimisation, they did expect to be consulted on whether and by whom pupil work and data can be used. There was widespread consensus that work and data should not be used without parents’ and/or pupils’ explicit agreement.
Many thanks to the noble Lords who have spoken in this debate and to the noble Lord, Lord Freyberg, for his Amendment 60. Before I start, let me endorse and add my name to the request for something of a briefing about the AI Bill. I am concerned that we will put a lot of weight of expectation on that Bill. When it comes, if I understand this right, it will focus on the very largest AI labs and may not necessarily get to all the risks that we are talking about here.
Amendment 60 seeks to ensure that the Bill does not allow privately funded or commercial activities to be considered scientific research in order
“to avert the possibility that such ventures might benefit from exemptions in copyright law relating to data mining”.
This is a sensible, proportionate measure to achieve an important end, but I have some concerns about the underlying assumption as it strikes me: the filtering criterion is whether or not the research is taxpayer funded, and that feels like a slightly crude means of predicting the propensity to infringe copyright. I do not know where to take that, so I shall leave it there for the moment.
Amendment 61 in my name would ensure that data companies cannot justify data scraping for AI training as scientific research. As many of us said in our debate on the previous group, as well as in our debate on this group, the definition of “scientific research” in the Bill is extremely broad. I very much take on board the Minister’s helpful response on that but, I must say, I continue to have some concerns about the breadth of the definition. The development of AI programs, funded privately and as part of a commercial enterprise, could be considered scientific, so I believe that this definition is far too broad, given that Article 8A(3), to be inserted by Clause 71(5), states:
“Processing of personal data for a new purpose is to be treated as processing in a manner compatible with the original purpose where … the processing is carried out … for the purposes of scientific research”.
By tightening up the definition of “scientific research” to exclude activities that are primarily commercial, my amendment would prevent companies from creating a scientific pretence for research that is wholly driven by commercial gain rather than by furthering our collective knowledge. I would argue that, if we wish to allow these companies to build and train AI—we must, or others will—we must put in proper safeguards for people’s data. Data subjects should have the right to consent to their data being used in such a manner.
Amendment 65A in the name of my noble friend Lord Holmes would also take steps to remedy this concern. I believe that this amendment would work well in tandem with Amendment 61. It makes it absolutely clear that we expect AI developers to obtain consent from data subjects before they use or reuse their data for training purposes. For now, though, I shall not press my amendment.
My Lords, I share the confusion of the noble Baroness, Lady Kidron, about the groupings. If we are not careful, we are going to keep returning to this issue again and again over four or five groups.
With the possible exception of the noble Lord, Lord Lucas, I think that we are all very much on the same page here. On the suggestion from the noble Viscount, Lord Colville, that we meet to discuss the precise issue of the definition of “scientific research”, this would be extremely helpful; the noble Baroness and I do not need to repeat the concerns.
I should declare an interest in two respects: first, my interests as regards AI, which are set out on the register; and, secondly—I very much took account of what the noble Viscount, Lord Camrose, and the noble Lord, Lord Markham, had to say—I chair the council of a university that has a strong health faculty. It does a great deal of health research and a lot of that research relies on NHS datasets.
This is not some sort of Luddism we are displaying here. This is caution about the expansion of the definition of scientific research, so that it does not turn into something else: that it does not deprive copyright holders of compensation, and that it does not allow personal data to be scraped off the internet without consent. There are very legitimate issues being addressed here, despite the fact that many of us believe that this valuable data should of course be used for the public benefit.
One of the key themes—this is perhaps where we come back on to the same page as the noble Lord, Lord Lucas—may be public benefit, which we need to reintroduce so that we really understand that scientific research for public benefit is the purpose we want this data used for.
I do not think I need to say much more: this issue is already permeating our discussions. It is interesting that we did not get on to it in a major way during the DPDI Bill, yet this time we have focused much more heavily on it. Clearly, in opposition, the noble Viscount has seen the light. What is not to like about that? Further discussion, not least of the amendment of the noble Baroness, Lady Kidron, further down the track will be extremely useful.
My Lords, I feel we are getting slightly repetitive, but before I, too, repeat myself, I should like to say something that I did not get the chance to say to the noble Viscount, Lord Colville, the noble Baroness, Lady Kidron, and others: I will write, we will meet—all the things that you have asked for, you can take it for granted that they will happen, because we want to get this right.
I say briefly to the noble Baroness: we are in danger of thinking that the only good research is health research. If you go to any university up and down the country, you find that the most fantastic research is taking place in the most obscure subjects, be it physics, mechanical engineering, fabrics or, as I mentioned earlier, quantum. A lot of great research is going on. We are in danger of thinking that life sciences are the only thing that we do well. We need to open our minds a bit to create the space for those original thinkers in other sectors.
Can the Minister say whether this will be a Bill, a draft Bill or a consultation?
We will announce this in the usual way—in due course. I refer the noble Lord to the King’s Speech on that issue. I feel that noble Lords want more information, but they will just have to go with what I am able to say at the moment.
Perhaps another aspect the Minister could speak to is whether this will be coming very shortly, shortly or imminently.
Let me put it this way: other things may be coming before it. I think I promised at the last debate that we would have something on copyright in the very, very, very near future. This may not be as very, very, very near future as that. We will tie ourselves in knots if we carry on pursuing this discussion.
On that basis, I hope that this provides noble Lords with sufficient reassurance not to press their amendments.
My Lords, it seems very strange indeed that Amendment 66 is in a different group from group 1, which we have already discussed. Of course, I support Amendment 66 from the noble Viscount, Lord Camrose, but, in response to my suggestion for a similar ethical threshold, the Minister said she was concerned that scientific research would find this too bureaucratic a hurdle. She and many of us here sat through debates on the Online Safety Bill, now an Act. I was also on the Communications Committee when it looked at digital regulation and came forward with one of the original reports on this. What drove us to worry about this was the lack of ethics within the tech companies and social media. Why on earth would we want to unleash some of the most powerful companies in the world on reusing people’s data for scientific purposes if we were not going to have an ethical threshold involved in such an Act? It is important that we consider that extremely seriously.
My Lords, I welcome the noble Viscount to the sceptics’ club because he has clearly had a Damascene conversion. It may be that this goes too far. I am slightly concerned, like him, about the bureaucracy involved in this, which slightly gives the game away. It could be seen as a way of legitimising commercial research, whereas we want to make it absolutely certain that such research is for the public benefit, rather than imposing an ethics board on every single aspect of research that has any commercial content.
We keep coming back to this, but we seem to be degrouping all over the place. Even the Government Whips Office seems to have given up trying to give titles for each of the groups; they are just called “degrouped” nowadays, which I think is a sign of deep depression in that office. It does not tell us anything about what the different groups contain, for some reason. Anyway, it is good to see the noble Viscount, Lord Camrose, kicking the tyres on the definition of the research aspect.
I am not quite sure about the groupings, either, but let us go with what we have. I thank noble Lords who have spoken, and the noble Viscount, Lord Camrose, for his amendments. I hope I am able to provide some reassurance for him on the points he raised.
As I said when considering the previous group, the Bill does not expand the definition of scientific research. The reasonableness test, along with clarifying the requirement for researchers to have a lawful basis, will significantly reduce the misuse of the existing definition. The amendment seeks to reduce the potential for misuse of the definition of scientific research by commercial companies using AI by requiring scientific researchers for a commercial company to submit their research to an ethics committee. As I said on the previous group, making this a mandatory requirement for all research may impede studies in areas that might have their own bespoke ethical procedures. This may well be the case in a whole range of different research areas, particularly in the university sector and in sectors more widely. Some of this research may be very small to begin with but might grow in size. The idea that a small piece of start-up research has to be cleared by an ethics committee at an early stage is expecting too much and will put off a lot of the new innovations that might otherwise come forward.
Amendment 80 relates to Clause 71 and the reuse of personal data. It would put at risk valuable research that relies on data originally generated in diverse contexts, since the purposes may not always be compatible.
Turning to Amendment 67, I can reassure noble Lords that the concept of broad consent is not new. Clause 68 reproduces the text from the current UK GDPR recitals because the precise purpose of scientific research may become clear only during later analysis of the data. Obtaining broad consent for an area of research from the outset allows scientists to focus on potentially life-saving research. Clause 68 has important limitations. It cannot be used if the researcher already knows the specific purpose—an important safeguard that should not be removed. It also includes a requirement to give the data subject the choice to consent to only part of the research processing, if possible. Most importantly, the data subject can revoke their consent at any point. I hope this reassures the noble Viscount, Lord Camrose, and that he feels content to withdraw his amendment on this basis.
My Lords, I rise to move the amendment standing in my name and to speak to my other amendments in this group. I am grateful to the noble Baroness, Lady Kidron, and the noble Lord, Lord Clement-Jones, for signing a number of those amendments, and I am also very grateful to Foxglove Legal and the other bodies that have briefed me in preparation for this.
My amendments are in a separate group, and I make no apology for that because, although some of these points have indeed been covered in other amendments, my focus is entirely on NHS patient data, partly because it is the subject of a wider debate going on elsewhere about whether value can be obtained for it to help finance the National Health Service and our health in future years. This changes the nature of the relationship between research and the data it uses, and I think it is important that we focus hard on this and get some of the points that have already been made into a form in which we can get reasonable answers to the questions they leave.
If my amendments were accepted—a faint hope—they would make it clear beyond peradventure that the consent protections in the Bill apply to the processing of data for scientific research, that a consistent definition of consent is applied and that that consistent definition is the one with which researchers and the public are already familiar and can trust going forward.
The Minister said at the end of Second Reading, in response to concerns I and others raised about research data in general and NHS data in particular, that the provisions in this Bill
“do not alter the legal obligations that apply in relation to decisions about whether to share data”.—[Official Report, 19/11/24; col. 196.]
I accept that that may be the intention, and I have discussed this with officials, who make the same point very strongly. However, Clause 68 introduces a novel and, I suggest, significantly watered-down definition of consent in the case of scientific research. Clause 71 deploys this watered-down definition of consent to winnow down the “purpose limitation” where the processing is for the purposes of scientific research in the public interest. Taken together, this means that there has been a change in the legal obligations that apply to the need to obtain consent before data is shared.
Clause 68 amends the pivotal definition of consent in Article 4(11). Instead of consent requiring something express—freely given, specific, informed, and unambiguous through clear affirmative action—consent can now be imputed. A data subject’s consent is deemed to meet these strict requirements even when it does not, as long as the consent is given to the processing of personal data for the purposes of an area of scientific research; at the time the consent is sought, it is not possible to identify fully the purposes for which the personal data is to be processed; seeking consent in relation to the area of scientific research is consistent with generally recognised ethical standards relevant to the area of research; and, so far as the intended purposes of the processing allow, the data subject is given the opportunity to consent to processing for only part of the research. These all sound very laudable, but I believe they cut down the very strict existing standards of consent.
Proposed new paragraph 7, in Clause 68, then extends the application of this definition across the regulation:
“References in this Regulation to consent given for a specific purpose (however expressed) include consent described in paragraph 6.”
Thus, wherever you read “consent” in the regulation you can also have imputed consent as set out in proposed new paragraph 6 of Article 4. This means that “consent” within the meaning of Article 6(1)(a)—that is, the basis for lawful processing—can be imputed consent in the new way introduced by the Bill, so there is a new type of lawful basis for processing.
The Minister is entitled to disagree, of course; I expect him to say that when he comes to respond. I hope that, when he does, he will agree that we share a concern on the importance of giving researchers a clear framework, as it is this uncertainty about the legal framework that could inadvertently act as a barrier to the good research we all need. So my first argument today is that, as drafted, the Bill leaves too much room for different interpretations, which will lead to exactly the kind of uncertainty that the Minister—indeed, all of us—wish to avoid.
As we have heard already, as well as the risk of uncertainty among researchers, there is also the risk of distrust among the general public. The public rightly want and expect to have a say in what uses their data is put to. Past efforts to modernise how the NHS uses data, such as care.data, have been expensive failures, in part because they have failed to win the public’s trust. More than 3.3 million people have already opted out of NHS data sharing under the national data opt-out; that is nearly 8% of the adults whose data could otherwise have contributed to research. We have talked about the value of our data and about being the gold standard or gold attractor for researchers but, if we do not have all the people who could contribute, we are definitely devaluing and debasing that research. Although we want to respect people’s choice as to whether to participate, of course, this enormous vote against research reflects a pretty spectacular failure to win public trust—one that undermines the value and quality of the data, as I said.
So my second point is that watering down the rights of those whose data is held by the NHS will not put that data for research purposes on a sustainable, long-term footing. Surely, we want a different outcome this time. We cannot afford more opt-outs; we want people opting back in. I argue that this requires a different approach—one that wins the public’s trust and gains public consent. The Secretary of State for Health is correct to say that most of the public want to see the better use of health data to help the NHS and to improve the health of the nation. I agree, but he must accept that the figures show that the general public also have concerns about privacy and about private companies exploiting their data without them having a say in the matter. The way forward must be to build trust by genuinely addressing those concerns. There must not be even a whiff of watering down legal protections, so that those concerns can instead be turned into support.
This is also important because NHS healthcare includes some of the most intimate personal data. It cannot make sense for that data to have a lower standard of consent protection going forward if it is being used for research. Having a different definition of consent and a lower standard of consent will inevitably lead to confusion, uncertainty and mistrust. Taken together, these amendments seek to avoid uncertainty and distrust, as well as the risk of backlash, by making it abundantly clear that Article 4 GDPR consent protections apply despite the new wording introduced by this Bill. Further, these are the same protections that apply to other uses of data; they are identical to the protections already understood by researchers and by the public.
I turn now to a couple of the amendments in this group. Amendment 71 seeks to address the question of consent, but in a rather narrow way. I have argued that Clause 68 introduces a novel and significantly watered-down definition of consent in the case of scientific research; proposed new paragraph 7 deploys this watered-down definition to winnow down the purpose limitation. There are broader questions about the wisdom of this, which Amendments 70, 79 and 81 seek to address, but Amendment 71 focuses on the important case of NHS health data.
If the public are worried that their health data might be shared with private companies without their consent, we need an answer to that. We see from the large number of opt-outs that there is already a problem; we have also seen it recently in NHS England’s research on public attitudes to health data. This amendment would ensure that the Bill does not increase uncertainty or fuel patient distrust of plans for NHS data. It would help to build the trust that data-enabled transformation of the NHS requires.
The Government may well retort that they are not planning to share NHS patient data with commercial bodies without patient consent. That is fine, but it would be helpful if, when he comes to respond, the Minister could say that clearly and unambiguously at the Dispatch Box. However, I put it to him that, if he could accept these amendments, the law would in fact reflect that assurance and ensure that any future Government would need to come back to Parliament if they wanted to take a different approach.
It is becoming obvious that whether research is in the public interest will be the key issue that we need to resolve in this Bill, and Amendment 72 provides a proposal. The Bill makes welcome references to health research being in the public interest, but it does not explain how on earth we decide or how that requirement would actually bite. Who makes the assessment? Do we trust a rogue operator to make its own assessment of how its research is in the public interest? What would be examples of the kind of research that the Government expect this requirement to prevent? I look forward to hearing the answer to that, but perhaps it would be more helpful if the Minister responded in a letter. In the interim, this amendment seeks to introduce some procedural clarity about how research will be certified as being in the public interest. This would provide clarity and reassurance, and I commend it to the Minister.
Finally, Amendment 131 seeks to improve the appropriate safeguards that would apply to processing for research, archiving and scientific purposes, including a requirement that the data subject has given consent. This has already been touched on in another amendment, but it is a way of seeking to address the issues that Amendments 70, 79 and 81 are also trying to address. Perhaps the Government will continue to insist that this is addressing a non-existent problem because nothing in Clauses 69 or 71 waters down the consent or purpose limitation protections and therefore the safeguards themselves add nothing. However, as I have said, informed readers of the Bill are interpreting it differently, so spelling out this safeguard would add clarity and avoid uncertainty. Surely such clarity on such an important matter is worth a couple of lines of additional length in a 250-page Bill. If the Government are going to argue that our Amendment 131 adds something objectionable, let them explain what is objectionable about consent protections applying to data processing for these purposes. I beg to move.
My Lords, I support Amendments 70 to 72, which I signed, in the name of the noble Lord, Lord Stevenson of Balmacara. I absolutely share his view about the impact of Clause 68 on the definition of consent and the potential and actual mistrust among the public about sharing of their data, particularly in the health service. It is highly significant that 3.3 million people have opted out of sharing their patient data.
I also very much share the noble Lord’s views about the need for a public interest test. In a sense, this takes us back to the discussion that we had on previous groups about whether we should apply that more broadly—not purely to health data but to scientific research more widely, as he specifies. I very much support what he had to say.
Broadly speaking, the common factor between my clause stand part notice and what he said is health data. Data subjects cannot make use of their data rights if they do not even know that their data is being processed. Clause 77 allows a controller reusing data under the auspices of scientific research not to notify a data subject in accordance with their Article 13 and 14 rights if doing so
“is impossible or would involve a disproportionate effort”.
We on these Benches believe that Clause 77 should be removed from the Bill. The safeguards are easily circumvented. The newly articulated compatibility test in new Article 8A, inserted by Clause 71, which specifies how closely related the new and existing purposes for data use must be to permit reuse, is essentially passed automatically if the processing is conducted
“for the purposes of scientific research or historical research”.
This makes it even more necessary for the definition of scientific research to be tightened to prevent abuse.
Currently, data controllers must provide individuals with information about the collection and use of their personal data. These transparency obligations generally do not require the controller to contact each data subject. Such obligations can usually be satisfied by providing privacy information using different techniques that can reach large numbers of individuals, such as relevant websites, social media, local newspapers and so on.
My Lords, I rise briefly to support the amendments in the name of the noble Lord, Lord Stevenson of Balmacara. I must say that the noble Lord, Lord Clement-Jones, made a very persuasive speech; I shall be rereading it and thinking about it more carefully.
In many ways, purpose limitation is the jewel in the crown of GDPR. It does what it says on the tin: data should be used for the original purpose, and if the purpose is then extended, we should go back to the person and ask whether it can be used again. While I agree with and associate myself with the technical arguments made by the noble Lord, Lord Stevenson, that is the fundamental point.
The issue here is, what are the Government trying to do? What are we clearing a pathway for? In a later group, we will speak to a proposal to create a UK data sovereign fund to make sure that the value of UK publicly held data is realised. The value is not simply economic or financial, but societal. There are ways of arranging all this that would satisfy everyone.
I have been sitting here wondering whether to say it, but here I go: I am one of the 3.3 million.
So is the noble Lord, Lord Clement-Jones. I withdrew my consent because I did not trust the system. I think that what both noble Lords have said about trust could be spread across the Bill as a whole.
We want to use our data well. We want it to benefit our public services. We want it to benefit UK plc and we want to make the world a better place, but not at the cost of individual data subjects and not at too great a cost. I add my voice to that. On the whole, I prefer systems that offer protections by design and default, as consent is a somewhat difficult concept. But, in as much as consent is a fundamental part of the current regulatory system and nothing in the Bill gets rid of it wholesale for some better system, it must be applied meaningfully. Amendments 79, 81 and 131 make clear what we mean by the term, ensure that the definition is consistent and clarify that it is not the intention of the Government to lessen the opportunity for meaningful consent. I, too, ask the Minister to confirm that it is not the Government’s intention to downgrade the concept of meaningful consent in the way that the noble Lord, Lord Stevenson, has set out.
My Lords, I start with an apology, because almost every amendment in this group is one of mine and I am afraid I have quite a long speech to make about the different amendments, which include Amendments 73, 75, 76, 77, 78, 78A, 83, 84, 85, 86, 89 and 90, and stand part debates on Schedules 4, 5 and 7 and Clause 74. But I know that the Members of this Committee are made of strong stuff.
Clause 70 and Schedule 4 introduce a new ground of recognised legitimate interest, which in essence counts as a lawful basis for processing if it meets any of the descriptions in the new Annexe 1 to the UK GDPR, which is at Schedule 4 to the Bill—for example, processing necessary for the purposes of responding to an emergency or detecting crime. These have been taken from the previous Government’s Data Protection and Digital Information Bill. This is supposed to reduce the burden on data controllers and the cost of legal advice when they have to assess whether it is acceptable to use or share data. Crucially, while the new ground shares its name with “legitimate interest”, it does not require the controller to carry out any balancing test taking the data subject’s interests into account. It just needs to meet the grounds in the list. The Bill gives the Secretary of State powers to define additional recognised legitimate interests beyond those in Annexe 1—a power heavily criticised by the Delegated Powers and Regulatory Reform Committee in its report on the Bill.
Currently where a private body shares personal data with a public body in reliance on Article 6(1)(e) of the GDPR, it can rely on the condition that the processing is
“necessary for the performance of a task carried out in the public interest”.
New conditions in Annexe 1, as inserted by Schedule 4, would enable data sharing between the private and public sectors to occur without any reference to a public interest test. In the list of recognised legitimate interests, the most important is the ability of any public body to ask another controller, usually in the private sector, for the disclosure of personal data it needs to deliver its functions. This applies to all public bodies. The new recognised legitimate interest legal basis in Clause 70 and Schedule 4 should be dropped.
Stephen Cragg KC, giving his legal opinion on the DPDI Bill, which, as I mentioned, has the same provision, stated that this list of recognised legitimate interests
“has been elevated to a position where the fundamental rights of data subjects (including children) can effectively be ignored where the processing of personal data is concerned”.
The ICO has also flagged concerns about recognised legitimate interests. In its technical drafting comments on the Bill, it said:
“We think it would be helpful if the explanatory notes could explicitly state that, in all the proposed new recognised legitimate interests, an assessment of necessity involves consideration of the proportionality of the processing activity”.
An assessment of proportionality is precisely what the balancing test is there to achieve. Recognised legitimate interests undermine the fundamental rights and interests of individuals, including children, in specific circumstances.
When companies are processing data without consent, it is essential that they do the work to balance the interests of the people who are affected by that processing against their own interests. Removing recognised legitimate interests from the Bill will not stop organisations from sharing data with the public sector or using data to advance national security, detect crime or safeguard children and vulnerable people. The existing legitimate interest lawful basis is more than flexible enough for these purposes. It just requires controllers to consider and respect people’s rights as they do so.
During the scrutiny of recognised legitimate interests in the DPDI Bill—I am afraid to have to mention this—the noble Baroness, Lady Jones of Whitchurch, who is now leading on this Bill as the Minister, raised concerns about the broad nature of the objectives. She rightly said:
“There is no strong reason for needing that extra power, so, to push back a little on the Minister, why, specifically, is it felt necessary? If it were a public safety interest, or one of the other examples he gave, it seems to me that that would come under the existing list of public interests”.—[Official Report, 25/3/24; col. GC 106.]
She never spoke a truer word.
However, this Government have reintroduced the same extra power with no new articulation of any strong reason for needing it. The constraints placed on the Secretary of State are slightly tighter in this Bill than they were in the DPDI Bill, as new paragraph (9), inserted by Clause 70(4), means that they are able to add new recognised legitimate interests only if they consider the processing to be necessary to safeguard an objective listed in UK GDPR Article 23(1)(c) to (j). However, this list includes catch-alls, such as
“other important objectives of general public interest”.
To give an example of what this power would allow, the DPDI Bill included a recognised legitimate interest relating to the ability of political parties to use data about citizens during election campaigns on the basis that democratic participation is an objective of general public interest. I am glad to say that this is no longer included. Another example is that a future Secretary of State could designate workplace productivity as a recognised legitimate interest—which, without a balancing test, would open the floodgates to intrusive workplace surveillance and unsustainable data-driven work intensification. That does not seem to be in line with the Government’s objectives.
Amendment 74 is rather more limited. Alongside the BMA, we are unclear about the extent of the impact of Clause 70 on the processing of health data. It is noted that the recognised legitimate interest avenue appears to be available only to data controllers that are not public authorities. Therefore, NHS organisations appear to be excluded. We would welcome confirmation that health data held by an NHS data controller is excluded from the scope of Clause 70 now and in the future, regardless of the lawful basis that is being relied on to process health data.
My Lords, when the noble Lord, Lord Clement-Jones, opened his speech he said that he hoped that noble Lords would be made of strong stuff while he worked his way through it. I have a similar request regarding my response: please bear with me. I will address these amendments slightly out of order to ensure that related issues are grouped together.
The Schedule 4 stand part notice, and Amendments 73 and 75, tabled by the noble Lord, Lord Clement-Jones, and supported by the noble Baroness, Lady Kidron, would remove the new lawful ground of “recognised legitimate interests” created by Clause 70 and Schedule 4 to the Bill. The aim of these provisions is to give data controllers greater confidence about processing personal data for specified and limited public interest objectives. Processing that is necessary and proportionate to achieve one of these objectives can take place without a person’s consent and without undertaking the legitimate interests balancing test. However, they would still have to comply with the wider requirements of data protection legislation, where relevant, ensuring that the data is processed in compliance with the other data protection principles.
I say in response to the point raised by the noble Lord, Lord Cameron, that the new lawful ground of recognised legitimate interest will apply from the date of commencement and will not apply retrospectively.
The activities listed include the processing of data where necessary to prevent crime, safeguard national security, protect children or respond to emergencies. They also include situations where a public body requests that a non-public body share personal data with it to help deliver a public task that is sanctioned by law. In these circumstances, it is very important that data is shared without delay, and removal of these provisions from the Bill, as proposed by the amendment, could make that harder.
Amendment 74, tabled by the noble Lord, Lord Scriven, would prevent health data being processed under this new lawful ground, but this could have some unwelcome effects. For example, the new lawful ground is designed to give controllers greater confidence about reporting safeguarding concerns, but if these concerns relate to a vulnerable person’s health, they would not be able to rely on the new lawful ground to process the data and would have to identify an alternative lawful ground.
On the point made by the noble Lord, Lord Clement-Jones, about which data controllers can rely on the new lawful ground, it would not be available to public bodies such as the NHS; it is aimed at non-public bodies.
I reassure noble Lords that there are still sufficient safeguards in the wider framework. Any processing that involves special category data, such as health data, would also need to comply with the conditions and safeguards in Article 9 of the UK GDPR and Schedule 1 to the Data Protection Act 2018.
Amendment 78A, tabled by the noble Lord, Lord Clement-Jones, would remove the new lawful ground for non-public bodies or individuals to disclose personal data at the request of public bodies, where necessary, to help those bodies deliver their public interest tasks without carrying out a legitimate interests balancing test. We would argue that, without it, controllers may lack certainty about the correct lawful ground to rely on when responding to such requests.
Amendment 76, also tabled by the noble Lord, Lord Clement-Jones, would remove the regulation-making powers in Clause 70 that allow the Secretary of State to keep the list of recognised legitimate interests up to date. Alternatively, the noble Lord’s Amendment 78 would require the Secretary of State to publish a statement every time he added a new processing activity to the list, setting out its purpose, which controllers it was aimed at and for how long they could use it. I reassure the noble Lord that the Government have already taken steps to tighten up these powers since the previous Bill was considered by this House.
Any new processing activities added would now also have to serve
“important objectives of … public interest”
as described in Article 23(1) of the UK GDPR and, as before, new activities could be added to the list only following consultation with the ICO and other interested parties. The Secretary of State would also have to consider the impact of any changes on people’s rights and have regard to the specific needs of children. Although these powers are likely to be used sparingly, the Government think it important that they be retained. I reassure the Committee that we will be responding to the report from the Delegated Powers Committee within the usual timeframes and we welcome its scrutiny of the Bill.
The noble Lord’s Amendment 77 seeks to make it clear that organisations should also be able to rely on Article 6(1)(f) to make transfers between separate businesses affiliated by contract. The list of activities mentioned in Clause 70 is intended to be illustrative only and is drawn from the recitals to the UK GDPR. This avoids providing a very lengthy list that might be viewed as prescriptive. Article 6(1)(f) of the UK GDPR is flexible. The transmission of personal data between businesses affiliated by contract may constitute a legitimate interest, like many other commercial interests. It is for the controller to determine this on a case-by-case basis.
I will now address the group of amendments tabled by the noble Lord, Lord Clement-Jones, concerning the purpose limitation principle, specifically Amendments 83 to 86. This principle limits the ways that personal data collected for one purpose can be used for another, but Clause 71 aims to provide more clarity and certainty around how it operates, including how certain exemptions apply.
Amendment 84 seeks to clarify whether the first exemption in proposed new Annexe 2 to the UK GDPR would allow personal data to be reused for commercial purposes. The conditions for using this exemption are that the requesting controller has a public task or official authority laid down in law that meets a public interest objective in Article 23(1) of the UK GDPR. As a result, I and the Government are satisfied that these situations would be for limited public interest objectives only, as set out in law.
Amendments 85 and 86 seek to introduce greater transparency around the use of safeguarding exemptions in paragraph 8 of new Annexe 2. These conditions are drawn from the Care Act 2014 and replicated in the existing condition for sensitive data processing for safeguarding purposes in the Data Protection Act 2018. I can reassure the Committee that processing cannot occur if it does not meet these conditions, including if the vulnerability of the individual no longer exists. In addition, requiring that an assessment be made and given to the data subject before the processing begins could result in safeguarding delays and would defeat the purpose of this exemption.
Amendment 83 would remove the regulation-making powers associated with this clause so that new exceptions could not be added in future. I remind noble Lords that there is already a power to create exemptions from the purpose limitation principle in the DPA 2018. This Bill simply moves the existing exemptions to a new annexe to the UK GDPR. The power is strictly limited to the public objectives listed in Article 23(1) of the UK GDPR.
I now turn to the noble Lord’s Amendment 89, which seeks to set conditions under which pseudonymised data should be treated as personal data. This is not necessary, as pseudonymised data already falls within the definition of personal data under Article 4(1) of the UK GDPR. This amendment also seeks to ensure that a determination by the ICO that data is personal data applies
“at all points in that processing”.
However, the moment at which data is or becomes personal should be a determination of fact based on its identifiability to a living individual.
I turn now to Clause 74 stand part, together with Amendment 90. Noble Lords are aware that special categories of data require additional protection. Article 9 of the UK GDPR sets out an exhaustive list of what constitutes sensitive data and outlines processing conditions. Currently, this list cannot be amended without primary legislation, for which parliamentary time may not always be available. This leaves the Government unable to respond swiftly when new types of sensitive data are identified, including as a result of emerging technologies. The powers in Clause 74 enable the Government to respond more quickly, add new special categories of data, tailor the conditions applicable to their use and add new definitions if necessary.
Finally, I turn to the amendment tabled by the noble Lord, Lord Clement-Jones, that would remove Schedule 7 from the Bill. This schedule contains measures to create a clearer and more outcomes-focused UK international data transfers regime. As part of these reforms, this schedule includes a power for the Secretary of State to recognise new transfer mechanisms for protecting international personal data transfers. Without this, the UK would be unable to respond swiftly to emerging developments and global trends in personal data transfers. In addition, the ICO will be consulted on any new mechanisms, and they will be subject to debate in Parliament under the affirmative resolution procedure.
I hope this helps explain the Government’s intention with these clauses and that the noble Lord will feel able to withdraw his amendment.
My Lords, I thank the Minister. She covered quite a lot of ground and all of us will have to read Hansard quite carefully. However, it is somewhat horrifying that, for a Bill of this size, we had about 30 seconds from the Minister on Schedule 7, which could have such a huge influence on our data adequacy when that is assessed next year. I do not think anybody has talked about international transfers at this point, least of all me in introducing these amendments. Even though it may appear that we are taking our time over this Bill, we are not fundamentally covering all its points. The importance of this Bill, which obviously escapes most Members of this House—there are just a few aficionados—is considerable and could have a far-reaching impact.
I still get Viscount Camrose vibes coming from the Minister.
Perhaps I should say that this kind of enthusiasm clearly conquers all. I should thank a former Minister, the noble Lord, Lord Kamall, and I thank the noble Baroness, Lady Kidron, for her thoughtful speech, particularly in questioning the whole recognised legitimate interest issue, especially in relation to vulnerable individuals.
It all seems to be a need for speed, whether it is the Secretary of State who has to make snappy decisions or a data controller. We are going to conquer uncertainty. We have to keep bustling along. In a way, to hell with individual data rights; needs must. I feel somewhat Canute-like, trying to hold back the tide of data that will be flowing over us. I feel quite uncomfortable with that. I think the DPRRC is likewise going to feel pretty cheesed off.
My Lords, I thought I had no speech; that would have been terrible. In moving my amendment, I thank the noble Baronesses, Lady Kidron and Lady Harding of Winscombe, and the noble Lord, Lord Russell of Liverpool, for their support. I shall speak also to Amendments 94, 135 and 196.
Additional safeguards are required for the protection of children’s data. This amendment
“seeks to exclude children from the new provisions on purpose limitation for further processing under Article 8A”.
The change to the purpose limitation in Clause 71 raises questions about the lifelong implications of the proposed change for children, given the expectation that they are less aware of the risks of data processing and may not have made their own preferences or choices known at the time of data collection.
For most children’s data processing, adults give permission on their behalf. The extension of this to additional purposes may be incompatible with what a data subject later wishes as an adult. The only protection they may have is purpose limitation, ensuring that they are asked to consent again or are informed of changes to processing. Data reuse and access must not mean abandoning the first principles of data protection. Purpose limitation rests on the essential principles of “specified” and “explicit” at the time of collection, which this change does away with.
There are some questions that I would like to put to the Minister. If further reuses, such as more research, are compatible, they are already permitted under current law. If further reuses are not permitted under current law, why should the rights that data subjects currently hold be undermined while they are children and, through this change, never be capable of being reclaimed at any time in the future? How does the new provision align with the principle of acting in the best interests of the child, as outlined in the UK GDPR, the UNCRC in Scotland and the Rights of Children and Young Persons (Wales) Measure 2011? What are the specific risks to children’s data privacy and security under the revised rules for purpose limitation that may have an unforeseeable lifelong effect? In summary, a blanket exclusion for children’s data processing conforms better with the status quo of data protection principles. Children should be asked again about data processing once they reach maturity and should not find that their data rights were given away by their parents on their behalf.
Amendment 196 is more of a probing amendment. Ofcom has set out its approach to the categorisation of category 1 services under the Online Safety Act. Ofcom’s advice and research, submitted to the Secretary of State, outlines the criteria for determining whether a service falls into category 1. These services are characterised by having the highest reach and risk functionalities among user-to-user services. The categorisation is based on certain threshold conditions, which include user numbers and functionalities such as content recommender systems and the ability for users to forward or reshare content. Ofcom has recommended that category 1 services should meet either of two sets of conditions: having more than 34 million UK users with a content recommender system or having more than 7 million UK users with a content recommender system and the ability for users to forward or reshare user-generated content. The categorisation process is part of Ofcom’s phased approach to implementing codes and guidance for online safety, with additional obligations for category 1 services due to their potential as sources of harm.
The Secretary of State recently issued the Draft Statement of Strategic Priorities for Online Safety, under Section 172 of the Online Safety Act. It says:
“Large technology companies have a key role in helping the UK to achieve this potential, but any company afforded the privilege of access to the UK’s vibrant technology and skills ecosystem must also accept their responsibility to keep people safe on their platforms and foster a safer online world … The government appreciates that Ofcom has set out to government its approach to tackling small but risky services. The government would like to see Ofcom keep this approach under continual review and to keep abreast of new and emerging small but risky services, which are posing harm to users online.
As the online safety regulator, we expect Ofcom to continue focusing its efforts on safety improvements among services that pose the highest risk of harm to users, including small but risky services. All search services in scope of the Act have duties to minimise the presentation of search results which include or lead directly to illegal content or content that is harmful to children. This should lead to a significant reduction in these services being accessible via search results”.
During the parliamentary debates on the Online Safety Bill and in Joint Committee, there was significant concern about the categorisation of services, particularly about the emphasis on size over risk. Initially, the categorisation was based largely on user numbers and functionalities, which led to concerns that smaller platforms with high-risk content might not be adequately addressed. In the Commons, Labour’s Alex Davies-Jones MP, now a Minister in the Ministry of Justice, argued that focusing on size rather than risk could fail to address extreme harms present on smaller sites.
The debates also revealed a push for a more risk-based approach to categorisation. The then Government eventually accepted an amendment allowing the Secretary of State discretion in setting thresholds based on user numbers, functionalities or both. This change aimed to provide flexibility in addressing high-risk smaller platforms. However, concerns remain, despite the strategy statement and the amendment to the original Online Safety Bill, that smaller platforms with significant potential for harm might not be sufficiently covered under the category 1 designation. Overall, while the final approach allows some flexibility, there is considerable debate about whether Ofcom’s categorisation will place enough emphasis on the risks posed by smaller players. My colleagues on these Benches and in the Commons have emphasised to me that we should be addressing these issues rigorously. I beg to move.
My Lords, I shall speak to all the amendments in this group, and I thank noble Lords who have added their names to Amendments 88 and 135 in my name.
Amendment 88 creates a duty for data controllers and processors to consider children’s needs and rights. Proposed new subsection (1) simply sets out children’s existing rights and acknowledges that children of different ages have different capacities and therefore may require different responses. Proposed new subsection (2) addresses the concern expressed during the passage of the Bill and its predecessor that children should be shielded from the reduction in privacy protections that adults will experience under the proposals. Proposed new subsection (3) simply confirms that a child is anyone under the age of 18.
This amendment leans on a bit of history. Section 123 of the Data Protection Act 2018 enshrined the age-appropriate design code into our data regime. The AADC’s journey from amendment to fully articulated code, since mirrored and copied around the world, has provided two useful lessons.
First, if the intent of Parliament is clear in the Bill, it is fixed. After Royal Assent to the Data Protection Act 2018, the tech lobby came calling to both the Government and the regulator arguing that the proposed age of adulthood in the AADC be reduced from 18 to 13, where it had been for more than two decades. Both the department and the regulator held up their hands and pointed at the text, which cited the UNCRC that defines a child as a person under 18. That age remains, not only in the UK but in all the other jurisdictions that have since copied the legislation.
In contrast, on several other issues both in the AADC and, more recently, in the Online Safety Act, the intentions of Parliament were not spelled out and have been reinterpreted. Happily, the promised coroner provisions are now enshrined in this Bill, but promises from the Dispatch Box about the scope and form of the coroner provisions were initially diluted and had to be fought for a second time by bereaved parents. Other examples—promises of a mixed economy, age-assurance requirements and a focus on contact harm, features and functionalities as well as content—are some of the ministerial promises that reflected Parliament’s intention but do not form part of the final regulatory standards, in large part because they were not sufficiently spelled out in the Bill. What is in the Bill really matters.
Secondly, our legislation over the past decade is guilty of solving the problems of yesterday. There is departmental resistance to having outcomes rather than processes enshrined in legislation. Overarching principles, such as a duty of care, or rights, such as children’s rights to privacy, are abandoned in favour of process measures, tools that even the tech companies admit are seldom used, and narrow definitions of what must and may not be taken down.
Tech is various, its contexts infinite, its rate of change giddy and the skills of government and regulator are necessarily limited. At some point we are going to have to start saying what the outcome should be, what the principles are, and not what the process is. My argument for this amendment is that we need to fix our intention that in the Bill children have an established set of needs according to their evolving capacity. Similarly, they have a right to a higher bar of privacy, so that both these principles become unavoidable.
I thank the Minister for her response. I should say at the outset that, although I may have led the group, it is clear that the noble Baroness, Lady Kidron, leads the pack as far as this is concerned. She has asked me to pass on that the noble Baroness, Lady Harding, was extremely sorry not to be able to attend, as she wanted to associate herself wholeheartedly with these amendments. She said, “It’s so disappointing still to be fighting for children’s data to have higher protection but it seems that that’s our lot!” I think she anticipated the response, sadly. I very much thank the noble Baroness, Lady Kidron, the noble Lords, Lord Russell and Lord Stevenson, and the noble Viscount, Lord Camrose, in particular for his thoughtful response to Amendment 196.
I was very interested in the intervention from the noble Lord, Lord Stevenson, and wrote down “Not invented here” to sum up the Government’s response to some of these amendments, which has been consistently underwhelming throughout the debates on the DPDI Bill and this Bill. They have brought out such things as “the unintended effects” and said, “We don’t want to interfere with the ICO”, and so on. This campaign will continue; it is really important. Obviously, we will read carefully what the Minister said but, given the troops behind me, I think the campaign will only get stronger.
The Minister did not really deal with the substance of Amendment 196, which was not just a cunning ploy to connect the Bill with the Online Safety Act; it was about current intentions on categorisation. There is considerable concern that the current category 1 thresholds are over-conservative and that we are not covering the smaller, unsafe social media platforms. When we discussed the Online Safety Bill, both in the Joint Committee and in the debates on subsequent stages of the Bill, it was clear that this was about risk, not just size, and we wanted to cover those risky, smaller platforms as well. While I appreciate the Government’s strategic statement, which made it pretty clear, and without wishing to overly terrorise Ofcom, we should make our view on categorisation pretty clear, and the Government should do likewise.
This argument and debate will no doubt continue. In the meantime, I beg leave to withdraw my amendment.
I start by speaking to two amendments tabled in my name.
Amendment 91 seeks to change
“the definition of request by data subjects to data controllers”
that can be declined or
“for which a fee can be charged from ‘manifestly unfounded or excessive’ to ‘vexatious or excessive’”.
I am sure that many of us will remember, without a great deal of fondness, our debates on these terms in the DPDI Bill. When we debated this issue at that time, it was, rather to my regret, often presented as a way to reduce protections and make it easier to decline or charge a fee for a subject access request. In fact, the purpose was to try to filter out cynical or time-wasting requests, such as attempts to bypass legal due process or to bombard organisations with vast quantities of essentially meaningless access requests. Such requests are not unfounded but they are harmful; by reducing them, we would give organisations more time and capacity to respond to well-founded requests. I realise that I am probably on a loser on this one but let me encourage noble Lords one last time to reconsider their objections and take a walk on the vexatious side.
Amendment 97 would ensure that
“AI companies who process data not directly obtained from data subjects are required to provide information to data subjects where possible. Without this amendment, data subjects may not know their data is being held”.
If a subject does not even know that their data is being held, they cannot enforce their data rights.
Amendment 99 follows on from that point, seeking to ensure that AI companies using large datasets cannot avoid providing information to data subjects on the basis that their datasets are too large. Again, if a subject does not know that their data is being held, they cannot enforce their rights. Therefore, it is really important that companies cannot avoid telling individuals about their personal data and the way in which it is being used because of sheer weight of information. These organisations are specialists in such processing of huge volumes of data, of course, so I struggle to accept that this would be too technically demanding for them.
Let me make just a few comments on other amendments tabled by noble Lords. Under Amendment 107, the Secretary of State would have
“to publish guidance within six months of the Act’s passing to clarify what constitutes ‘reasonable and proportionate’ in protection of personal data”.
I feel that this information should be published at the same time as this Bill comes into effect. It serves no purpose to have six months of uncertainty.
I do not believe that Amendment 125 is necessary. The degree to which the Government wish to align—or not—with the EU is surely a matter for the Government and their priorities.
Finally, I was struck by the interesting point that the noble and learned Lord, Lord Thomas, made when he deplored the Bill’s incomprehensibility. I have extremely high levels of personal sympathy with that view. To me, the Bill is the source code. There is a challenge in making it comprehensible and communicating it in a much more accessible way once it goes live. Perhaps the Minister can give some thought to how that implementation phase could include strong elements of communication. While that does not make the Bill any easier to understand for us, it might help the public at large.
My Lords, the problem is that I have a 10-minute speech and there are five minutes left before Hansard leaves us, so is it sensible to draw stumps at this point? I have not counted how many amendments I have, but I also wish to speak to the amendment by the noble and learned Lord, Lord Thomas. I would have thought it sensible to break at this point.
(3 weeks, 3 days ago)
Grand Committee
Just to follow on from that, I very much support my noble friend’s words. The only reason I can see why you would introduce new definitions is that there are new responsibilities that are different, and you would want people to be aware of the new rules that have been placed on them. I will be interested to hear the Minister’s answer. If that is the case, we can set that out and understand whether the differences are so big that you need a whole new category, as my noble friend said.
Having run lots of small businesses myself, I am aware that, with every new definition that you add, you add a whole new set of rules and complications. As a business owner, how am I going to find out what applies to me and how I am to be responsible? The terms trader, controller, data holder and processor all sound fairly similar, so how will I understand what applies to me and what does not? To the other point that my noble friend made, the more confusing it gets, the less likelihood there is that people will understand the process.
My Lords, I am not sure whether I should open by saying that it is a pleasure to take part in the passage of the third iteration of this Bill, but, as I said at Second Reading, this is an improvement. Nevertheless, there are aspects of the Bill that need close scrutiny.
The noble Viscount, Lord Camrose, explained his approach to this Bill. Our approach is that we very much support the use of data for public benefit but, at the same time, we want to make sure that this Bill does not water down individual data rights and that they are, where necessary, strengthened. In that spirit, I wish to ask the Minister about the general nature of Clause 1, rather than following up on the amendments tabled by the noble Viscount.
The definition of “business data” seems quite general. A report that came out yesterday, Data On Our Minds: Affective Computing At Work, highlighted the kinds of data that are now being collected in the workplace. It is a piece of work sponsored by the Joseph Rowntree Charitable Trust, the Trust for London and the Institute for the Future of Work. They are concerned about the definition of “business data”. The Minister probably will not have an answer on this matter at this stage, but it would be useful if she could write in due course to say whether the definition of “business data” excludes emotional data and neurosurveillance data collected from employees.
This is very much a workplace question rather than a question about the customer; I could ask the same question about the customer, I suppose, except the report is about workplace data collection. I thought I would opportunistically take advantage of the rather heavy de-grouping that has taken place and ask the Minister a question.
First, let me say what a pleasure it is to be back on this old ground again, although with slightly different functions this time round. I very much support what the noble Viscount, Lord Camrose, said. We want to get the wording of this Bill right and to have a robust Bill; that is absolutely in our interests. We are on the same territory here. I thank the noble Viscount and other noble Lords for expressing their interest.
On Amendments 1 and 2, the Government consider the terms used in Part 1, as outlined in Clause 1, necessary to frame the persons and the data to which a scheme will apply. The noble Lord, Lord Clement-Jones, mentioned the powers. I assure him that the powers in Part 1 sit on top of the Data Protection Act. They are not there instead of it; they are another layer on top of it, and they provide additional rights over and above what already exists.
In relation to the specific questions from the noble Viscount, Lord Camrose, and the noble Lord, Lord Markham, smart data schemes require suppliers or providers of goods, services or digital content to provide data. They are referred to as “traders” in accordance with recent consumer legislation, including the Consumer Rights Act 2015. The term “data holder” ensures that the requirements may also be imposed on any third party that might hold the data on the trader’s behalf. That is why these additional terminologies have been included: it is based on existing good legislation. I hope noble Lords will recognise why this is necessary and that this explains the rationale for these terms. These terms are independent of terms in data protection legislation; they have a different scope and that is why separate terms are necessary. I hope that, on that basis, the noble Viscount will withdraw his amendment.
My Lords, I would like to say a few things about this. The first is that Amendment 5, in the name of the noble Lord, Lord Lucas, is very sensible; sometimes the GDPR has gone too far in trying to block what you can use things for. It was originally thought of when so much spamming was going on, with people gathering data from adverts and all sorts of other things and then misusing it for other purposes. People got fed up with the level of spam. This is not about that sort of thing; it is about having useful data that would help people in the future, and which they would not mind being used for other purposes. As long as it is done properly and seriously, and not for marketing, advertising and all those other things, and for something which is useful to people, I cannot see what the problem is. An overzealous use of GDPR, which has happened from time to time, has made it very difficult to use something perfectly sensible, which people would not mind having other people know about when it is being useful.
The next matter is sex, which is an interesting issue. The noble Lord is absolutely correct that biological or genetic sex is vital when applying medicines and various other things. You have to know that you are administering certain drugs properly. As we get more and more new drugs coming on, it will matter how a person’s body will react to them, which will depend on the genetic material, effectively. Therefore, it is essential to know what the biological sex is. The answer is that we need another category—probably “current gender”—alongside “sex at birth”. Someone can then decide to use “current gender” for certain purposes, including for such things as passports and driving licences, where people do not want to be asked questions—“Oh, do you mean you’re not?”—because they look completely different.
I remember meeting April Ashley in her restaurant. I would not, in my innocence—I was quite young—have guessed that she was not a woman, except that someone said that her hands were very big. It never worried us in those days. I am not worried about people using a different gender, but the basic underlying truth is essential. It comes into the issue of sport. If you have grown up and developed physically as a biological male, your bone structure and strength are likely to be different from that of a female. There are huge issues with that, and we need to know both; people can decide which to use at certain points. Having both would give you the flexibility to do that.
That also applies to Amendment 200, from the noble Lord, Lord Lucas, which is exactly the same concept. I thoroughly agree with those amendments and think we should push them forward.
My Lords, I too am delighted that the noble Lord, Lord Lucas, came in to move his amendment. He is the expert in that whole area of education data; like the noble Lord, Lord Arbuthnot, I found what he said extremely persuasive.
I need to declare an interest as chair of the council of Queen Mary, University of London, in the context of Amendment 5 in the name of the noble Lord, Lord Lucas. I must say, if use were made of that data, it would benefit not only students but universities. I am sure that the Minister will take that seriously but, on the face of it, like the noble Earl, Lord Erroll, I cannot see any reason why this amendment should not be adopted.
I very much support Amendments 34 and 48 in the name of the noble Lord, Lord Arbuthnot. I too have read the briefing from Sex Matters. The noble Lord’s pursuit of accuracy for the records that will be part of the wallet, if you like, to be created for these digital verification services is a matter of considerable importance. In reading the Sex Matters briefing, I was quite surprised. I had not realised that it is possible to change your stated sex on your passport in the way that has taken place. The noble Lord referred to the more than 3,000 cases of this; for driving licences, there have been more than 15,000.
I agree with Sex Matters when it says that this could lead to a loss of trust in the system. However, I also agree with the noble Earl, Lord Erroll, that this is not an either/or. It could be both. It is perfectly feasible to have both on your passport, if you so choose. I do not see this as a great divide as long as the statement about sex is accurate because, for a great many reasons—not least in healthcare—it is of considerable importance that the statement about one’s sex is accurate.
I looked back at what the Minister said at Second Reading. I admit that I did not find it too clear but I hope that, even if she cannot accept these amendments, she will be able to give an assurance that, under this scheme—after all, it is pretty skeletal; we will come on to some amendments that try to flesh it out somewhat—the information on which it will be based is accurate. That must be a fundamental underlying principle. We should thank the noble Lord, Lord Arbuthnot, for tabling these two important amendments in that respect.
My Lords, I want to come in on Amendment 5. Although I am very much in favour of the intent of what we are trying to do—making more use of the sharing of data—I have to put on my old Health Minister’s hat in talking about all the different terms and speaking to the different angles that we are all coming from.
Noble Lords have heard me speak many a time about the value of our health data and the tremendous possibilities that it offers for drug discovery and all the associated benefits. At the same time, I was very aware of loads of companies purporting to own it. There are GP data companies, which do the systems for GPs and, naturally, hold all the patient data in them. In terms of their business plans, some have been bought for vast sums of money because of the data that they hold. My concern is that, although it is well intended to say that the use of health data should be allowed for the general good, at the same time, I do not believe that GP companies own that data. We have been quite clear on that. I want to make it clear that it is actually the NHS that will benefit from the pulling together of all this, if that happens in those sorts of formats.
Similarly with student loans data—I shall not pretend that this is a subject I know a lot about—I can see a lot of good uses for it, but I can also see that it would be very useful to financial services companies seeking to understand customers’ creditworthiness. In all these cases, although the intent is right, we need to find a way to be clear about what they can and cannot use the data for, and therein lies a lot of complexity.
My Lords, Amendment 7, the first in this group, is a probing amendment and I am extremely grateful to ISACA, an international professional association focused on IT governance, for drafting it. This amendment
“would give the Secretary of State or the Treasury scope to introduce requirements on third party recipients of customer data to publish regular statements on their cyber resilience against specified standards and outcomes”.
Third parties play a vital role in the modern digital ecosystem, providing businesses with advanced technology, specialised expertise and a wide range of services, but integrating third parties into business operations comes with cyber risks. Their access to critical networks and all the rest of it can create vulnerabilities that cybercriminals exploit. Third parties are often seen as easier targets, with weaker security measures or indirect connections serving as gateways to larger organisations.
Further consideration is to be given to the most effective means of driving the required improvements in cyber risk management, including, as I suggest, making certain guidance statutory. This is not about regulating and imposing additional cost burdens, but rather about creating the environment for digital trust and growth in the UK economy, as well as creating the right conditions for the sustainable use of emerging technologies that will benefit us all. This is something that leading associations and groups such as ISACA have been arguing for.
The Cyber Governance Code of Practice, which the previous Administration introduced, marks an important step towards improving how organisations approach cybersecurity. Its primary goal is to ensure that boards of directors take proper responsibility for mitigating cyber risks.
While that code is a positive development, compliance is not legally required, which leaves organisations free to put their priorities elsewhere. As a result, the code’s effectiveness in driving widespread improvements in cyber resilience will largely depend on organisations’ willingness to recognise its importance. The amendment would require businesses regularly to review and update their cybersecurity strategies and controls, and to stay responsive to evolving threats and technologies, thereby fostering a culture of continuous improvement. In addition, mandating ongoing assessments of internal controls and risk-management processes will help organisations to anticipate emerging threats and enhance their ability to detect, prevent and respond to cyber incidents. I beg to move.
My Lords, this is a fairly disparate group of amendments. I am speaking to Amendments 8, 9, 10, 24, 30, 31 and 32. In the first instance, Amendments 8, 9, 10 and 30 relate to the question that I asked at Second Reading: where is the ambition to use the Bill to encourage data sharing to support net zero?
The clean heat market mechanism, designed to create a market incentive to grow the number of heat pumps installed in existing premises each year, is set to be introduced after being delayed a year due to backlash from the boiler industry. If government departments and partners had access to sales data of heating appliances, there would be a more transparent and open process for setting effective and realistic targets.
I have been briefed by Ambient, a not-for-profit organisation in this field. It says that low visibility of high power-consuming assets makes it challenging to maintain grid stability in a clean-power world. Low visibility and influence over future installations of high power-consuming assets make it difficult to plan for grid updates. Inability to shift peak electricity demand leads to higher capacity requirements with associated time and cost implications. Giving the Government and associated bodies access to utility-flexible tariff data would enable the Government and utilities to work together to increase availability and uptake of tariffs, leading to lower peak electricity demand requirements.
Knowing which homes have the oldest and least efficient boilers, and giving the public sector and its partners access to Gas Safe Register and CORGI data on boiler age at household level, would mean that they could identify and target households and regions, ensuring that available funds go to those most in need. Lack of clarity on future clean heating demand makes it challenging for the industry to scale and create jobs, and to assess workforce needs for growing electricity demand. Better demand forecasting through access to sales data on low-carbon heating appliances would signal when and where electrification was creating a need for workforce expansion in grid management and upgrades, as well as identify regional demand for installers and technicians.
The provisions of Part 1 of the Bill contain powers for the Secretary of State to require the sharing of business data with customers and other persons of specified description. It does not indicate, however, that persons of specified description could include actors such as government departments, public bodies such as NESO and GB Energy, and Ministers. An expanded list of suggested recipients, as set out in Amendment 9 in my name, could overcome this issue. The Bill also makes no provision for the format of information sharing—hence my Amendments 8 and 10.
In summary, my questions to the Minister are therefore: whether consideration has been given to how the powers outlined in the Bill could be exercised to accelerate progress towards clean power by 2030; whether climate missions such as clean power by 2030 or achieving net zero are purposes “of a public nature” in relation to the outline provisions for public bodies; and whether specifying the format of shared business data would enable more efficient and collaborative use of data for research and planning purposes.
Coming on to Amendments 24, 31 and 32, the Bill expands the potential use of smart data to additional public and private sector entities, but it lacks safeguards for sensitive information regularly used in court. The Bill makes specific provision for legal privilege earlier on, but this is not extended to the provisions relating to smart data. I very much hope that the Government will commit to consulting the legal professions before extending smart data to courts.
Many of us support open banking, but open banking is being used, as designed, by landlords to continue monitoring tenants’ bank accounts for months after approving their tenancy. Open banking was set up to enhance interoperability between finance providers, the most obvious example being the recent ability of the iPhone wallet app to display balances and recent transactions from various bank accounts.
Open banking approval normally lasts six months. While individual landlords may not choose this access if given a free choice, the service industry providing tenant-checking services to landlords is strongly incentivised to maximise such access; otherwise, their competitors have a selling point. If open banking is to be added to the statute book, the Bill should mandate that the default access duration be reduced to no more than 24 hours in the first instance and reconfirmed much more often. For most one-off approval processes, access times may be as short as minutes, and the regulations should account for that.
Coming on to Amendment 31, consumers have mixed feelings about the potential benefits to them of smart data schemes, as shown in polling such as that carried out a couple of years ago by Deltapoll with the CDEI, now the Responsible Technology Adoption Unit, as regards the perceived potential risks versus the benefits. Approximately one-quarter of respondents in each case were unsure about this trade-off. Perhaps unsurprisingly, individuals who said that they trusted banks and financial institutions or telecommunications providers were more likely to support open finance and open communications, and customers who had previous experience of switching services more frequently reported believing that the benefits of smart data outweighed the risks.
Is it therefore the Government’s expectation that people should be compelled to use these services? Open banking and imitators can do a great deal of good but can also give easy access to highly sensitive data for long periods. The new clause introduced by Amendment 31 would make it the same criminal offence to compel unnecessary access under these new provisions as it already is to compel data provision via subject access requests under the existing Data Protection Act.
Amendment 32 is a probing amendment as to the Government’s intentions regarding these new smart data provisions. In the Minister’s letter of 27 November, she said:
“The Government is working closely to identify areas where smart data schemes might be able to bring benefits. We want to build on the lessons learned from open banking and establish smart data schemes in other markets for goods and services.”
I very much hope that the Minister will be able to give us a little taste of what she thinks these powers are going to be used for, and in what sectors the Government believe that business can take advantage of these provisions.
My Lords, I support Amendment 7 introduced by my noble friend Lord Arbuthnot, for the reasons that he gave. The amendment is designed to improve the reliability and handling of information inside any system. If, as I would certainly support, we want to see information and data in digital form circulated more readily, more freely and more often, it is very important that people should trust the system within which that happens. That is where the need to assure the cybersecurity of the system becomes very important as a companion to this Bill.
Does the Minister have any thoughts about where smart data schemes might be introduced? I am sure that they are being introduced for a purpose. Is there a plan to issue a policy document or is it purely about consulting different sectors? Perhaps the Minister can give us a glimpse of the future.
The noble Lord is tempting me. What I would say is that, once this legislation is passed, it will encourage departments to look in detail at where they think smart data schemes can be applied and provide a useful service for customers and businesses alike. I know that one issue that has been talked about is providing citizens with greater information about their energy supplies—the way that is being used and whether they can use their energy differently or find a different supplier—but that is only one example, and I do not want people to get fixated on it.
The potential is enormous; I feel that we need to encourage people to think creatively about how some of these provisions can be used when the Bill is finally agreed. There is a lot of cross-government thinking at the moment and a lot of considering how we can empower citizens more. I could say a lot off the top of my head but putting it on the record in Hansard would probably be a mistake, so I will not be tempted any more by the noble Lord. I am sure that he can write to me with some suggestions, if he has any.
My Lords, I almost have a full house in this group, apart from Amendment 35, so I will not read out the numbers of all the amendments in this group. I should just say that I very much support what the noble Viscount, Lord Colville, has put forward in his Amendment 35.
Many noble Lords will have read the ninth report of the Delegated Powers and Regulatory Reform Committee. I am sad to say that it holds exactly the same view about this Bill as it did about the previous Bill’s provisions regarding digital verification services. It said that
“we remain of the view that the power conferred by clause 28 should be subject to parliamentary scrutiny, with the affirmative procedure providing the appropriate level of scrutiny”.
It is against that backdrop that I put forward a number of these amendments. I am concerned that, although the Secretary of State is made responsible for this framework, in reality, they cannot be accountable for delivering effective governance in any meaningful way. I have tried, through these amendments, to introduce at least some form of appropriate governance.
Of course, these digital verification provisions are long-awaited—the Age Verification Providers Association is pleased to see them introduced—but we need much greater clarity. How is the Home Office compliant with Part 2 of the Bill as it is currently written? How will these digital verification services be managed by DSIT? How will they interoperate with the digital identity verification services being offered by DSIT in the UK Government’s One Login programme?
Governance, accountability and effective, independent regulation are also missing. There is no mechanism for monitoring compliance, investigating malicious actors or taking enforcement action regarding these services. The Bill has no mechanism for ongoing monitoring or the investigation of compliance failures. The Government propose to rely on periodic certification being sufficient but I understand that, when pressed, DSIT officials say that they are talking to certification bodies and regulators about how they can do so. This is not really sufficient. I very much share the intention of both this Government and the previous one to create a market in digital verification services, but the many good players in this marketplace believe that high levels of trust in the sector depend on a high level of assurance and focus from the governance point of view. That is missing in this part of the Bill.
Amendment 33 recognises the fact that the Bill has no mechanism for ongoing monitoring or the investigation of compliance failures. As we have seen from the Grenfell public inquiry, a failure of governance caused by not proactively monitoring, checking and challenging compliance has real, harmful consequences. Digital verification services rely on the trustworthiness of the governance model; what is proposed is not trustworthy but creates material risk for UK citizens and parties who rely on the system.
There are perfectly decent examples of regulatory frameworks. PhonepayPlus provides one such example, with a panel of three experts supported by a secretariat; the panel can meet once a quarter to give its opinion. That has been dismissed as being too expensive, but I do not believe that any costings have been produced or that it has been considered how such a cost would weigh against the consequences of a failure in governance of the kind identified in recent public inquiries.
Again, as regards Amendment 36, there is no mechanism in the Bill whereby accountability is clearly established in a meaningful way. Accountability is critical if relying parties and end-users are to have confidence that their interests are safeguarded.
Amendment 38 is linked to Amendment 36. The review under Clause 31 must be meaningful in improving accountability and effective governance. The amendment proposes that the review must include performance, specifically against the five-year strategy and of the compliance, monitoring and investigating mechanisms. We would also like to see the Secretary of State held accountable by the Science and Technology Select Committee for the performance captured in the review.
On Amendment 41, the Bill is silent on how the Secretary of State will determine that there is a compliance failure. It is critical to have some independence and professional rigour included here; the independent appeals process is really crucial.
As regards Amendments 42 and 43, recent public inquiries serve to illustrate the importance of effective governance. Good practice for effective governance would require the involvement of an independent body in the determination of compliance decisions. There does not appear to be an investigatory resource or expertise within DSIT, and the Bill currently fails to include requirements for investigatory processes or appeals. In effect, there is no check on the authority of the Secretary of State in that context, as well as no requirement for the Secretary of State proactively to monitor and challenge stakeholders on compliance.
As regards Amendment 44, there needs to be a process or procedure for that; fairness requires that there should be a due process of investigation, a review of evidence and a right of appeal to an independent body.
I turn to Amendment 45 on effective governance. A decision by the appeals body that a compliance failure is so severe that removal from the register is a proportionate measure must be binding on the Secretary of State; otherwise, there is a risk that investment in compliance and service improvement will be relegated below investment in lobbying. Malicious actors view weaknesses in enforcement as a green light and so adopt behaviours that both put at risk the safety and security of UK citizens and undermine the potential of trustworthy digital verification to drive economic growth.
Amendment 39 would exclude the powers in this Part from being used by government as part of GOV.UK’s One Login.
I come on to something rather different in Amendment 46, which is very much supported by Big Brother Watch, the Digital Poverty Alliance and Age UK. Its theme was raised at Second Reading. A significant proportion of the UK’s population lacks internet access, with this issue disproportionately affecting older adults, children and those from low-income backgrounds. This form of digital exclusion presents challenges in an increasingly digital world, particularly concerning identity verification.
Although digital identity verification can be beneficial, it poses difficulty for individuals who cannot or choose not to engage digitally. Mandating online identity verification can create barriers for digitally excluded groups. For example, the National Audit Office found that only 20% of universal credit applicants could verify their identity online, highlighting concerns for those with limited digital skills. The Lords Communications and Digital Select Committee emphasised the need for accessible, offline alternatives to ensure inclusivity in a connected world. The proponents of this amendment advocate the availability of offline options for essential public and private services, particularly those requiring identity verification. This is crucial as forcing digital engagement can negatively impact the well-being and societal participation of older people.
This is the first time that I have prayed in aid what the Minister said during the passage of the Data Protection and Digital Information Bill; this could be the first of a few such occasions. When we debated the DPDI Bill, she stressed the importance of a legal right to choose between digital and non-digital identity verification methods. I entirely agreed with her at the time. She said that this right is vital for individual liberty, equality and building trust in digital identity systems and that, ultimately, such systems should empower individuals with choices rather than enforce digital compliance. That is a fair summary of what she said at the time.
I turn to Amendment 50. In the context of Clause 45 and the power of public authorities to disclose information, some of which may be the most sensitive information, it is important for the Secretary of State to be able to require the public authority to provide information on what data is being disclosed and where the data is going, as well as why the data is going there. This amendment will ensure that data is being disclosed for the right reasons, to the right places and in the right proportion. I beg to move.
My Lords, I tabled Amendment 35 because I want to make the DVS trust framework as useful as possible. I support Amendment 33 in the name of the noble Lord, Lord Clement-Jones, and Amendment 37 in the name of the noble Viscount, Lord Camrose.
The framework’s mandate is to define a set of rules and standards designed to establish trust in digital identity products in the UK. It is what I would hope for as a provision in this Bill. As the Minister told us at Second Reading, the establishment of digital ID services with a trust mark will increase faith in the digital market and reduce physical checks—not to mention reducing the time spent on a range of activities, from hiring new workers to moving house. I and many other noble Lords surely welcome the consequent reduction in red tape, which so often impedes the effectiveness of our public services.
Clause 28(3) asks the Secretary of State to consult the Information Commissioner and such persons as they consider appropriate. However, in order to ensure that these digital ID services are used and recognised as widely as possible—and, more importantly, that they can be used by organisations beyond our borders—I suggest Amendment 35, which would include putting consultation with an international digital standards body in the Bill. This amendment is supported by the Open Data Institute.
I am sure that the Minister will tell me that that amendment is unnecessary, as we can leave it to the common sense of Ministers and civil servants in DSIT to consult such a body, but, in my view, it is helpful to remind them that Parliament thinks the consultation of an international standards body is important. The international acceptance of DVS is crucial to its success. Just like an email address, somebody’s digital identity should not be tied to a company or a sector. Imagine how frustrating it would be if we could only get Gmail in the UK and Outlook in the EU. Imagine if, in a world of national borders and jurisdictions, you could not send emails between the UK and the EU as a result. Although the DVS will work brilliantly to break down digital identity barriers in the UK, there is a risk that no international standards body will be consulted in the development of the DVS scheme. This amendment would be a reminder to the Secretary of State that there must be collaboration between this country, the EU and other nations, such as Commonwealth countries, that are in the process of developing similar schemes.
I will, of course, write to the noble Baroness.
Was the Minister saying that, in view of the current duties of the ICO, Amendment 50 is not needed because public authorities will have the duty to inform the ICO of the information that they have been passing across to these identity services?
Again, I will have to write to the noble Lord on that. I think we were saying that it is outside the current obligations of the ICO, but we will clarify the responsibility.
My Lords, I am not quite sure whether to be reassured or not because this is terra incognita. I am really struggling, given the Minister’s response. This is kind of saying, “Hands off, Parliament, we want the lightest touch on all of this, and the Secretary of State will decide”.
I should first thank the noble Baroness, Lady Kidron, for her support. I thought that the noble Viscount, Lord Colville, made an extremely good case for Amendment 35 because all of us want to make sure that we have that interoperability. One of the few areas where I was reassured by the Minister was on the consultations taking place.
I am sure that the noble Viscount, Lord Camrose, was right to ask what the consultations are. We need to be swimming in the right pool for our digital services to be interoperable. It is not as if we do not have contact with quite a number of these digital service providers. Some of them are extremely good and want a level of mandation for these international services. There is a worrying lack of detail here. We are between the devil and the deep blue sea. On the one hand, we have these rules on GOV.UK, which are far too complicated for mere parliamentarians to comprehend; they are so detailed that we are going to get bogged down.
On the other hand, we do not know what the Secretary of State is doing. This is the detailed trust framework, but what is the governance around it? At the beginning of her speech, the Minister said that governance is different from certification and the conformity assessment service. I would have thought that governance was all part of the same warp and weft. I do not really understand the argument that, because the Secretary of State has the power to refuse accreditation, we do not need an independent appeals body. It would be much more straightforward if we knew that there was a regulator and that it was going to be transparent in terms of how the system worked. I just feel that this is all rather half-baked at the moment. We need a lot more information than we are getting. To that extent, that is the case for all the amendments in this group.
The crucial amendment is Amendment 37, tabled by the noble Viscount, Lord Camrose, because we absolutely need to bring all this into the light of day through parliamentary approval, whether or not it is a complicated document. Perhaps we could put it through an AI model and simplify it somewhat before we debate it. We have to get to grips with this. I have a feeling that we are going to want to return to this aspect on Report because no good reason has been given—to us or to the DPRRC—why we are not debating the scheme itself in Parliament. It is a bit sad to have to say this because we all support the digital verification progress, if you like. Yet we are all in a bit of a fog about how it is all going to work.
I very much hope that the Minister can come back to us, perhaps with a must-write letter that sets it all out to a much more satisfactory extent. I hope she understands why we have had this fairly protracted debate on this group of amendments because this is an important aspect that the Bill is skeletal about. I beg leave to withdraw the amendment.
My Lords, in moving Amendment 51, I will also speak to Amendments 52, 53, 54 and 209 in my name, which seek to create new criminal offences under the Bill. The first is the offence of using a trust mark without permission; the second is providing false information to the Secretary of State in response to an information notice; and the third is using a false digital identity document, which is really an alternative to digital identity theft.
Clause 50 currently contains no real consequence for a person using a trust mark without permission. A trust mark, which has no specific definition in the Bill, may be used only by those who are in the DVS register. Clause 50(3) says:
“A mark designated under this section may not be used by a person in the course of providing, or offering to provide, digital verification services unless the person is registered in the DVS register in respect of those digital verification services”.
Clause 50(4) then says:
“The Secretary of State may enforce subsection (3)”
by civil injunction or interdict. This has no real teeth in circumstances where there are persistent and flagrant offenders, regardless of whether it is on a personal or commercial scale.
Amendment 51 would provide appropriate penalties: a fine on summary conviction and, on conviction on indictment, imprisonment for up to two years or a fine. Amendment 52 would provide that a prosecution may not be brought except by or with the consent of the appropriate chief prosecutor. Amendment 54 relates to providing false information to the Secretary of State; it is advanced on a similar basis, since the Bill contains a power for the Secretary of State to require information. Of course, many regulators have this power.
On the issue of false digital identities—identity theft—Amendment 53 is a refinement of Amendment 289, which I tabled to the late, unlamented DPDI Bill in Committee. That amendment was entitled “Digital identity theft”. I have also retabled the original amendment but, in many ways, Amendment 53 is preferable because it is much more closely aligned to the Identity Documents Act, which contains several offences relating to the use of a person’s identity document. Currently, an identity document includes an immigration document—a passport or similar document—or a driving licence.
My Lords, I thank the Minister. I was quite amused in listening to the noble Viscount, Lord Camrose. I thought about the halcyon days of listening to the language that he used when he was a Minister, with words like “premature”, “unintended consequences”, “disproportionate” and “ambiguity”. I thought, “Come back, Viscount Camrose”—but I appreciate that he took the trouble at least to analyse, from his point of view, where he saw the problems with some of the amendments.
I go back to the physical identity verification aspect. I should have said that I very much hope that the Minister and I can discuss how the Equality Act 2010 has an impact. I am not entirely sure about the protected characteristics playing into this because, obviously, the Equality Act only references those. I think that there could be a greater cohort of people who may be disadvantaged by commercial operators insisting on digital verification, as opposed to physical verification, for instance; I may need to discuss that with the Minister.
I am grateful to the Minister for having gone through where she thinks there are safeguards and sanctions against the false use of trust marks; that was a very helpful run-through, so I shall not go back over what she said. The really important area is this whole offline/online criminal aspect. I understand that it may not be perfect because the scheme is not in place—it may not need to be on all fours exactly with the 2010 Act—but I think that the Minister’s brief was incorrect in this respect. If the Bill team look back at the report of the committee that the noble Baroness, Lady Morgan, chaired back in 2022, Fighting Fraud: Breaking the Chain, they will see that it clearly said:
“Identity theft is often a predicate action to the criminal offence of fraud, as well as other offences including organised crime and terrorism, but it is not a criminal offence”.
That is pretty clear. The committee went into this in considerable detail and said:
“The Government should consult on the introduction of legislation to create a specific criminal offence of identity theft. Alternatively, the Sentencing Council should consider including identity theft as a serious aggravating factor in cases of fraud”.
First, I am going to set the noble Baroness, Lady Morgan, on the noble Viscount, Lord Camrose, to persuade him of the wisdom of creating a new offence. I urge the Minister to think about the consequences of not having any criminal sanction for the misuse of digital identity—identity theft, or whatever you might call it. There must be some way to protect people in these circumstances if we are going to have public trust in the digital verification framework that we are setting up under this Bill. This will be rolled out—if only I had read GOV.UK, I would be far wiser.
It was very interesting to hear the Minister start to unpack quite a lot of detail. We heard about the new regulator, the Office for Digital Identities and Attributes. That was the first reference to the new regulator, but what are its powers going to be? We need a parliamentary debate on this, clearly. Is this an office delegated by the Secretary of State? Presumably, it is non-statutory, in a sense, and will have powers that are at the will of the Secretary of State. It will be within DSIT, I assume—and so on.
I am afraid that we are going round in circles here. We need to know a great deal more. I hope that we get much more protection for those who have the benefit of the service; otherwise, we will find ourselves in a situation that we are often in as regards the digital world, whereby there is a lack of trust and the public react against what they perceive as somebody taking something away from them. In the health service, for example, 3 million people have opted out from sharing their GP personal health data. I am only saying that we need to be careful in this area and to make sure that we have all the right protections in place. In the meantime, I beg leave to withdraw my amendment.
My Lords, successive Governments have demonstrated their enthusiasm for NUAR. It was quite interesting to hear the Minister’s enthusiasm for the digitisation of the map of the underground, so to speak; she was no less enthusiastic than her predecessor. However, as the Minister knows, there are tensions between the new, bright, shiny NUAR and LSBUD, or LinesearchbeforeUdig, which in some respects is the incumbent.
I thank the noble Lord, Lord Clement-Jones, for these amendments. Amendment 56 concerns NUAR and a requirement to consult first. I am not convinced that is necessary because there is already a requirement to consult under Clause 60 and, perhaps more pertinently, NUAR is an industry-led initiative. It came out of an industry meeting and has been industry-led throughout. I am therefore not sure, even in spite of the requirement to consult, that much would come out of that consultation exercise.
In respect of other providers out there, LSBUD among them, when we were going through this exact debate in DPDI days, the offer I made—and I ask the Minister if she would consider doing the same—was to arrange a demonstration of NUAR to anyone who had not seen it. I have absolutely unshakeable confidence that anybody who sees NUAR in action will not want anything else. I am not a betting man, but—
For the record, the noble Viscount is getting a vigorous nod from the Minister.
I am grateful to the noble Viscount for joining me in my enthusiasm for NUAR. He is right: having seen it in practice, I am a great enthusiast for it. If it is possible to demonstrate it to other people, I would be very happy to do so, because it is quite a compelling story when you see it in practice.
Amendment 56, in the name of the noble Lord, Lord Clement-Jones, would place a duty on the Secretary of State to consult relevant private sector organisations before implementing the NUAR provisions under the Bill. I want to make clear that the Geospatial Commission, which oversees NUAR, has been engaging with stakeholders on NUAR since 2018. Since then, there have been extensive reviews of existing processes and data exchange services, including a call for evidence, a pilot project, a public consultation and numerous workshops. A series of in-person focus groups was completed last week, and officials have visited commercial companies with specific concerns, including LinesearchbeforeUdig, so there has been extensive consultation with them.
I suppose one can understand why they feel slightly put out about NUAR appearing on the scene, but NUAR is a huge public asset that we should celebrate. We can potentially use it in other ways for other services in the future, once it is established, and we should celebrate the fact that we have managed to create it as a public asset. I say to the noble Lord, Lord Clement-Jones, that a further consultation on that basis would provide no additional benefit but would delay the realisation of the significant benefits that NUAR could deliver.
Moving on to the noble Lord’s other amendments, Amendments 193, 194 and 195, he is absolutely right about the need for data interoperability in the health service. We can all think of examples of where that would benefit patients and citizens. It is also true that we absolutely need to ensure that our health and care system is supported by robust information standards. Again, we go back to the issue of trust: people need to know that those protections are there.
This is why we would ensure, through Clause 119 and Schedule 15, that suppliers of IT products and services used in the provision of health or adult social care in England are required to meet relevant information standards. In doing so, we can ensure that IT suppliers are held to account where information standards are not implemented. The application of information standards is independent of commercial organisations, and we would hold IT companies to them. Furthermore, the definition of healthcare as set out in the Health and Social Care Act 2012, as amended by the Health and Care Act 2022, already ensures that all forms of healthcare are within scope of information standards, which would include primary care. That was one of the other points that the noble Lord made.
As an add-on to this whole discussion, the noble Lord will know that the Government are developing the idea of a national data library, which would encourage further interoperability between government departments to make sure that we use data to improve services. Health and social care is the obvious example, but members of the Committee can think of all sorts of other ways in which government departments, if they collaborated on an interoperable basis, could drive up standards and make life easier for a whole lot of citizens. We are on the case and are absolutely determined to deliver it. I hope that, on that basis, the noble Lord will withdraw his amendment.
I am sorry to interrupt the Minister, but she has whetted our appetite about the national data library. It is not included in the Bill. We talked about it a little at Second Reading, but I wonder whether she can tell us a little more about what is planned. Is it to be set up on a statutory basis or is it a shadow thing? What substance will it actually have and how?
Well, details of it were in our manifesto, in as much as a manifesto is ever detailed. It is a commitment to deliver cross-departmental government services and create a means whereby some of the GDPR blockages that stop one department speaking to another can, where necessary, be freed up to make sure that people exchange data in a more positive way to improve services. There will be more details coming out. It is a work in progress at the moment and may well require some legislation to underpin it. There is an awful lot of work to be done in making sure that one dataset can talk to another before we can progress in any major way, but we are working at speed to try to get this new system up and running.
I thank the Minister for that, which was very interesting. We were talking about medical health IT and “GDPR blockages” almost has a medical quality to it. The embryonic national data library will obviously get some more mentions as we go through the Bill. It is a work in progress, so I hope that we will know more at the end of the Bill than we did at the beginning.
The Minister talked about datasets talking to each other. We will have to get the noble Viscount, Lord Camrose, to use other phrases, not just “Netflix in the age of Blockbuster” but something equally exciting about datasets talking to each other.
My Lords, of course I welcome the fact that the Bill will enable people to register a death in person and online, which was a key recommendation from the UK Commission on Bereavement. I have been asked to table this amendment by Marie Curie; it is designed to achieve improvements to UK bereavement support services, highlighting the significant administrative burden faced by bereaved individuals.
Marie Curie points to the need for a review of the existing Tell Us Once service and the creation of a universal priority services register to streamline death-related notifications across government and the private sector. It argues that the Bill presents an opportunity to address these issues through improved data-sharing and online death registration. Significant statistics illustrate the scale of the problem, showing a large percentage of bereaved people struggling with numerous administrative tasks. It urges the Government, as I do, to commit to implementing these changes to reduce the burden on bereaved families.
Bereaved people face many practical and administrative responsibilities and tasks after a death, which are often both complex and time sensitive. This Bill presents an opportunity to improve the way in which information is shared between different public and private service providers, reducing the burden of death administration.
When someone dies, the Tell Us Once service informs the various parts of national and local government that need to know. That means the local council stops charging council tax, the DVLA cancels the driving licence, the Passport Office cancels the passport, et cetera. Unfortunately, Tell Us Once is currently not working across all government departments and does not apply to Northern Ireland. No updated equality impact assessment has ever been undertaken. While there are death notification services in the private sector, they are severely limited by not being a public service programme; as a result, there are associated user costs, adding to bereaved people’s financial burden and penalising the families that are struggling most. There is low public awareness and take-up of all these services, as well as variable and inconsistent provision by the different companies. The fact that there is not one service for all public and private sector notifications means that dealing with the deceased’s affairs is still a long and painful process.
The Bill should be amended to require Ministers to carry out a review of the current operation and effectiveness of the Tell Us Once service, to identify any gaps in its operation and provisions and to make recommendations as to how the scope of the service could be expanded. Priority services registers are voluntary schemes that utility companies create to ensure that extra help is available to certain vulnerable customers. The previous Government recognised that the current PSRs are disjointed, resource intensive and duplicative for companies, carry risks of inconsistency and can be “burdensome for customers”.
That Government concluded that there is “significant opportunity to improve the efficiencies and delivery of these services”. The Bill is an opportunity for this Government to confirm their commitment to implementing a universal priority services register and delivering any legislative measures required to facilitate it. A universal PSR service must include the interests of bereaved people within its scope, and charitable voluntary organisations such as Marie Curie, which works to support bereaved people, should be consulted in its development.
I have some questions for the Minister. First, what measures does this Bill introduce that will reduce the administrative burden on bereaved people after the death of a loved one? Secondly, the Tell Us Once service was implemented in 2010 and the original equality impact assessment envisaged that its operation should be kept under review to reflect the changing nature of how people engage with public services, but no review has ever happened. Will the Minister therefore commit the Government to undertaking a review of Tell Us Once? Thirdly, the previous Government’s Smarter Regulation White Paper committed to taking forward a plan to create a “shared once” support register, which would bring together priority services registers. Will the Minister commit this Government to taking that work forward? I beg to move.
My Lords, it occurred to me when the noble Lord was speaking that we had lost a valuable member of our Committee. This could not be the noble Lord, Lord Clement-Jones, who was speaking to us just then. It must have been some form of miasma or technical imposition. Maybe his identity has been stolen and not been replaced. Normally, the noble Lord would have arrived with a short but punchy speech that set out in full how the new scheme was to be run, by whom, at what price, what its extent would be and the changes that would result. The Liberal future it may have been, but it was always delightful to listen to. I am sad that all the noble Lord has asked for here is a modest request, which I am sure the noble Baroness will want to jump to and accept, to carry out a review—as if we did not have enough of those.
Seriously, I once used the service that we have been talking about, when my father-in-law died, and I found it amazing. It was also one that I stumbled on and did not know about before it happened; deaths did not happen often enough in my family to make me aware of it. But, like the noble Lord, Lord Clement-Jones, I felt that it should have done much more than it did, although it was valuable for what it did. It also occurred to me, as life moved on and we produced children, that there would be value in a similar service for introducing a new person—a service to tell you once about a birth—because the number of tough issues one has to deal with when children are born is also extraordinary, and it can be annoying if you miss out on one, particularly with the schooling issues, which are more common these days than they were when my children were being born.
I endorse what was said, and regret that the amendment perhaps did not go further, but I hope that the Minister when she responds will have good news for us.
We support this service, of course—we can see the potential for expanding it further if we get this measure right—but I have to tell noble Lords that the current service is not in great shape in terms of its technology. It has suffered from insufficient investment over time and needs to be improved before we can take it to the next stage of its potential. We consider that the best way to address this is, first, to upgrade the legacy technology on which the service currently operates. I realised that this was a problem only as I took over this brief; I had assumed that it would be more straightforward, but the problem seems to be that we are operating on ancient technology here.
Work is already under way to try to bring it all up to date. We are looking to improve the current service and at the opportunities to extend it to more of government. Our initial task is to try to extend it to some of the government departments that do not recognise it at the moment. Doing that will inform us of the potential limitations and the opportunities, should we wish to extend it to the private sector in future. I say to the noble Lord that this will have to be a staged process because of the technological challenges that we currently face.
We are reluctant to commit to a review and further expansion of the service at this time but, once the service is updated, we would absolutely be happy to talk to noble Lords and revisit this issue, because we see the potential of it. The update is expected to be completed in the next two years; I hope that we will be able to come back and give a progress report to noble Lords at that time. However, I have to say, this is what we have inherited—bear with us, because we have a job to do in bringing it up to date. I hope that, on that basis, the noble Lord will withdraw his amendment, albeit reluctantly.
My Lords, I thank the Minister for that response, and I thank the noble Lord, Lord Stevenson—at least, I think I do—for his contribution.
I have clearly worked on far too many Bills in the past. I have to do better when I move amendments like this. I have to bring the full package, but we are allowed to speak for only a quarter of an hour, so we cannot bring everything to the table. All I can promise the noble Viscount is that my avatar will haunt him while he is sitting on the fence.
I thank the Minister for giving a sympathetic response to this, but clearly there are barriers to rolling out anything beyond where we have got to. I was rather disappointed by two years because I was formulating a plan to bring back an Oral Question in about six months’ time. I am afraid that she may find that we are trying to hurry her along a little on this. I recognise that there are technology issues, but convening people and getting broader engagement with various players is something that could be done without the technology in the first instance, so the Minister can expect follow-up on this front rather earlier than two years’ time. She does not have the luxury of waiting around before we come back to her on it, but I thank her because this is a fantastic service. It is limited, but, as far as it goes, it is a godsend for the bereaved. We need to make sure that it improves and fulfils its potential across the private sector as well as the public sector. In the meantime, I beg leave to withdraw my amendment.
(1 month, 1 week ago)
Lords Chamber
My Lords, I draw attention to my AI interests in the register. I thank the Minister for her upbeat introduction to the Bill and all her engagement to date on its contents. It has been a real pleasure listening to so many expert speeches this afternoon. The noble Lord, Lord Bassam, did not quite use the phrase “practice makes perfect”, because, after all, this is the third shot at a data protection Bill over the past few years, but I was really taken by the vision and breadth of so many speeches today. I think we all agree that this Bill is definitely better than its two predecessors, but of course most noble Lords went on to say “but”, and that is exactly my position.
Throughout, we have been reminded of the growing importance of data in the context of AI adoption, particularly in the private and public sectors. I think many of us regret that “protection” is not included in the Bill title, but that should go hand in hand, if not with actual AI regulation, then at least with an understanding of where we are heading on AI regulation.
Like others, I welcome the fact that the Bill omits many of the proposals from the unlamented Data Protection and Digital Information Bill, which in our view—I expect to see a vigorous shake of the head from the noble Viscount, Lord Camrose—watered down data subject rights. The noble Lord, Lord Bassam, did us a great favour by setting out the list of many of the items that were missing from that Bill.
I welcome the retention of some elements in this Bill, such as the digital registration of birth and deaths. As the noble Lord, Lord Knight, said, and as Marie Curie has asked, will the Government undertake a review of the Tell Us Once service to ensure that it covers all government departments across the UK and is extended to more service providers?
I also welcome some of the new elements, in particular amendments to the Online Safety Act—essentially unfinished business, as far back as our Joint Committee. It was notable that the noble Lord, Lord Bethell, welcomed the paving provisions regarding independent researchers’ access to social media and search services, but there are questions even around the width of that provision. Will this cover research regarding non-criminal misinformation on internet platforms? What protection will researchers conducting public interest research actually receive?
Then there is something that the noble Baroness, Lady Kidron, Ian Russell and many other campaigners have fought for: access for coroners to the data of young children who have passed away. I think that will be a milestone.
The Bill may need further amendment. On these Benches we may well put forward further changes for added child protection, given the current debate over the definition of category 1 services.
There are some regrettable omissions from the previous Bill, such as the provisions extending the soft opt-in that has always existed for commercial organisations to non-commercial organisations, including charities. As we have heard, there are also a considerable number of unwelcome retained provisions.
Many noble Lords referred to “recognised legitimate interests”. The Bill introduces to Article 6 of the GDPR a new ground of recognised legitimate interest, which counts as a lawful basis for processing if it meets any of the descriptions in the new Annex 1 to the GDPR in Schedule 4 of the Bill. The Bill essentially qualifies the public interest test under Article 6(1)(e) of the GDPR and, as the noble Lord, Lord Vaux, pointed out, gives the Secretary of State powers to define additional recognised legitimate interests beyond those in the annex. This was queried by the Constitution Committee, and we shall certainly be kicking the tyres on that during Committee. Crucially, there is no requirement for the controller to make any balancing test, as the noble Viscount, Lord Colville, mentioned, taking the data subject’s interests into account. It just needs to meet the grounds in the annex. These provisions diminish data protection and represent a threat to data adequacy, and should be dropped.
Almost every noble Lord raised the changes to Article 22 and automated decision-making. With the exception of sub-paragraph (d), to be inserted by Clause 80, the provisions are very similar to those of the old Clause 14 of the DPDI Bill in limiting the right not to be subject to automated decision-making processing or profiling to special category data. Where automated decision-making is currently broadly prohibited with specific exceptions, the Bill will permit it in all but a limited set of circumstances. The Secretary of State is given the power to redefine what ADM actually is. Again, the noble Viscount, Lord Colville, was right in how he described what the outcome of that will be. Given the Government’s digital transformation agenda in the public sector and the increasing use of AI in the private sector, this means increasing the risk of biased and discriminatory outcomes in ADM systems.
Systems such as HART, which predicted reoffending risk, PredPol, which was used to allocate policing resources based on postcodes, and the gangs matrix, which harvests intelligence, have all been shown to have had discriminatory effects. It was a pleasure to hear what the noble Lord, Lord Arbuthnot, had to say. Have the Government learned nothing from the Horizon scandal? As he said, we need to move urgently to change the burden of proof for computer evidence. What the noble Earl, Lord Erroll, said in reminding us of the childlike learning abilities of AI was extremely important in that respect. We should not place that kind of trust in the evidence given by these models.
ADM safeguards are critical to public trust in AI, and our citizens need greater not less protection. As the Ada Lovelace Institute says, the safeguards around automated decision-making, which exist only in data protection law, are more critical than ever in ensuring that people understand when a significant decision about them is being automated, why that decision has been made, and the routes to challenge it or ask for it to be decided by a human. The noble Viscount, Lord Colville, and the noble Lord, Lord Holmes, set out that prescription, and I entirely agree with them.
This is a crucial element of the Bill but I will not spend too much time on it because, noble Lords will be very pleased to hear, I have a Private Member’s Bill on this subject, providing much-needed additional safeguards for ADM in the public sector, coming up on 13 December. I hope noble Lords will be there and that the Government will see the sense of it in the meantime.
We have heard a great deal about research. Clause 68 widens research access to data. There is a legitimate government desire to ensure that valuable research does not have to be discarded because of a lack of clarity around reuse or because of very narrow distinctions between the original and new purpose. However, it is quite clear that the definition of scientific research introduced by the Bill is too broad and risks abuse by commercial interests. A number of noble Lords raised that, and I entirely agree with the noble Baroness, Lady Kidron, that the Bill opens the door to data reuse and mass data scraping by any data-driven product development under the auspices of scientific research. Subjects cannot make use of their data rights if they do not even know that their data is being processed.
On overseas transfers, I was very grateful to hear what the noble and learned Lord, Lord Thomas, had to say about data adequacy, and the noble Lords, Lord Bethell, Lord Vaux and Lord Russell, also raised this. All of us are concerned about the future of data adequacy, particularly the tensions that are going to be created with the new Administration in the US if there are very different bases for dealing with data transfer between countries.
We have concerns about the national security provisions. I will not go into those in great detail, but why do the Government believe that these clauses are necessary to safeguard national security?
Many noble Lords raised the question of digital verification services. It was very interesting to hear what the noble Earl, Lord Erroll, had to say, given his long-standing interest in this area. We broadly support the provisions, but the Constitution Committee followed the DPRRC in criticising the lack of parliamentary scrutiny of the framework to be set by the Secretary of State or managed by DSIT. How will they interoperate with the digital identity verification services being offered by DSIT within the Government’s One Login programme?
Will the new regulator be independent, ensure effective governance and accountability, monitor compliance, investigate malicious actors and take enforcement action regarding these services? For high levels of trust in digital ID services, we need high-quality governance. As the noble Lord, Lord Vaux, said, we need to be clear about the status of physical ID alongside that. Why is there still no digital identity offence? I entirely agree with what the noble Lords, Lord Lucas and Lord Arbuthnot, said about the need for factual clarity underlying the documents that will form part of the wallet—so to speak—of digital ID services. It is vital that we distinguish and make sure that both sex and gender are recorded in our key documents.
There are other areas about which we on these Benches have concerns, although I have no time to go through them in great detail. We support the provisions on open banking, which we want to see used and their opportunities properly exploited. However, as the noble Lord, Lord Holmes, said, we need a proper narrative that sells the virtues of open banking. We are concerned that the current design allows landlords to be given access to monitor the bank accounts of tenants for as long as an open banking approval lasts. Smart data legislation should mandate that the maximum and default access duration be no longer than 24 hours.
A formidable number of noble Lords spoke about web trawling by AI developers to train their models. It is vital that copyright owners have meaningful control over their content, and that there is a duty of transparency and penalties for scraping news publishers’ and other copyrighted content.
The noble and learned Lord, Lord Thomas, very helpfully spoke about the Government’s ECHR memorandum. I do not need to repeat what he said, but clearly, this could lead to a significant gap, given that the Retained EU Law (Revocation and Reform) Act 2023 has not been altered and is not altered by this Bill.
There are many other aspects to this. The claims for this Bill and these provisions are as extravagant as for the old one; I think the noble Baroness mentioned the figure of £10 billion at the outset. We are in favour of growth and innovation, but how will this Bill also ensure that fundamental rights for the citizen will be enhanced in an increasingly AI-driven world?
We need to build public trust, as the noble Lord, Lord Holmes, and the noble Baroness, Lady Kidron, said, in data sharing and access. To achieve the ambitions of the Sudlow review, there are lessons that need to be learned by the Department of Health and the NHS. We need to deal with edtech, as has been described by a number of noble Lords. All in all, the Government are still not diverging enough from the approach of their predecessor in their enthusiasm for the sharing and use of data across the public and private sectors without the necessary safeguards. We still have major reservations, which I hope the Government will respond to. I look forward—I think—to Grand Committee.
(1 month, 2 weeks ago)
Lords Chamber
To ask His Majesty’s Government, following the recommendation of the Vallance review of the regulation of emerging digital technologies, whether they plan to set out a policy position on the relationship between intellectual property rights and the training of generative AI models.
My Lords, the AI and creative sectors are both essential to our mission to grow the UK economy. Our goal is to find the right balance between fostering innovation in AI while ensuring protection for creators and our vibrant creative industries. This is an important but complex area and we are very aware of the need to resolve the issues. We are working with stakeholders to understand their views and will set out our next steps soon.
My Lords, I thank the Minister for that reply, but the Prime Minister, in a recent letter to the News Media Association, said:
“We recognise the basic principle that publishers should have control over and seek payment for their work, including when thinking about the role of AI”.
Will the Minister therefore agree with the House of Lords Communications and Digital Committee and affirm the rights of copyright owners in relation to their content used for training purposes on large language models? Will she rule out any widening of the text and data-mining exception and include in any future AI legislation a duty on developers to keep records of the material and data used to train their AI models?
My Lords, I pay tribute to the Lords committee that has considered this issue. We are keen to make progress in this area but it is important that we get it right. The previous Government had this on their table for a long time and were not able to resolve it. The Intellectual Property Office, DSIT and DCMS are working together to try to find a way forward that will provide a solution for creative media and the AI sectors. Ministers—my colleagues Chris Bryant and Feryal Clark—held round tables with representatives of the creative industries and the AI sector recently, and we are looking at how we can take this forward to resolve the many issues and questions that the noble Lord has quite rightly posed for me today.
(1 month, 4 weeks ago)
Grand Committee
My Lords, this order was laid before the House on 9 September this year. The Online Safety Act lays the foundations of strong protection for children and adults online. I am grateful to noble Lords for their continued interest in the Online Safety Act and its implementation. It is critical that the Act is made fully operational as soon as possible, and the Government are committed to ensuring that its protections are delivered swiftly. This statutory instrument will further support the implementation of the Act by Ofcom.
This statutory instrument concerns Ofcom’s ability to share business information with Ministers for the purpose of fulfilling functions under the Online Safety Act 2023, under Section 393 of the Communications Act 2003. This corrects an oversight in the original Online Safety Act that was identified following its passage.
Section 393 of the Communications Act 2003 contains a general restriction on Ofcom disclosing information about particular businesses without consent from the affected businesses, subject to exemptions, including where disclosure facilitates Ofcom in carrying out its regulatory functions or facilitates other specified persons in carrying out specific functions. However, this section does not currently enable Ofcom to share information with Ministers for the purpose of fulfilling functions under the Online Safety Act. This means that, were Ofcom to disclose information about businesses to the Secretary of State, it might be in breach of the law.
It is important that a gateway exists for sharing information for these purposes so that the Secretary of State can carry out functions under the Online Safety Act, such as setting the fee threshold for the online safety regime in 2025 or carrying out post-implementation reviews of the Act required under Section 178. This statutory instrument will therefore amend the Communications Act 2003 to allow Ofcom to share information with the Secretary of State and other Ministers, strictly for the purpose of fulfilling functions under the Online Safety Act 2023.
There are strong legislative safeguards and limitations on the disclosure of this information, and Ofcom is experienced in handling confidential and sensitive information obtained from the services it regulates. Ofcom must comply with UK data protection law and would need to show that the processing of any personal data was necessary for a lawful purpose. As a public body, Ofcom is also required to act compatibly with the Article 8 right of privacy under the European Convention on Human Rights.
We will therefore continue to keep the Online Safety Act under review, so that Ofcom is able to support the delivery of functions under the Act where appropriate. That is a brief summary of why this instrument is necessary. I should stress that it contains a technical amendment to deal with a very small legal aspect. Nevertheless, I will be interested to hear noble Lords’ comments on the SI. I beg to move.
My Lords, I thank the Minister for her introduction and for explaining the essence of the SI. We all have a bit of pride of creation in the Online Safety Act; there are one or two of us around today who clearly have a continuing interest in it. This is one of the smaller outcomes of the Act and, as the Minister says, it is essentially an oversight; a tidying-up operation is involved here. It is rather gratifying to see that the Communications Act still has such importance, 21 years after it was passed. It is somewhat extraordinary for legislation to be invoked after that period of time in an area such as communications, which is so fast-moving.
My question for the Minister is whether the examples that she gave, or which were contained in the Explanatory Memorandum, regarding the need for information to be obtained by the Secretary of State in respect of Section 178, on reviewing the regulatory framework, and Section 86, on the threshold for payment of fees, are exhaustive. Are there other aspects of the Online Safety Act where the Secretary of State requires such information?
We are always wary of the powers given to Secretaries of State, as the noble Viscount, Lord Camrose, will probably remember to his cost. But at every point, the tyres on legislation need to be kicked to make sure that the Secretary of State has just the powers that they need—and that we do not go further than we need to or have a skeleton Bill, et cetera—so the usual mantra will apply: we want to make sure that the Secretary of State’s powers are proportionate.
It would be very useful to hear from the Minister what other powers are involved. Are there quite a number? Were these two just the most plausible, or are there six other sets of powers which might not be so attractive? That is the only caveat I would make in this respect.
(1 month, 4 weeks ago)
Grand Committee
My Lords, these regulations were laid before the House on 12 September this year. The Government stated in their manifesto that they would
“use every government tool available to target perpetrators and address the root causes of abuse and violence”
in order to achieve their
“landmark mission to halve violence against women and girls in a decade”.
Through this statutory instrument, we are broadening online platforms’ and search engines’ responsibilities for tackling intimate image abuse under the Online Safety Act. More than one in three women have experienced abuse online. The rise in intimate image abuse is not only devastating for victims; it also spreads misogyny on social media that can develop into potentially dangerous relationships offline. One in 14 adults in England and Wales has experienced threats to share intimate images, rising to one in seven young women aged 18 to 34.
It is crucial that we tackle these crimes from every angle, including online, and ensure that tech companies step up and play their part. That is why we are laying this statutory instrument. Through it, we will widen online platforms’ and search engines’ obligations to tackle intimate image abuse under the Online Safety Act. As noble Lords will know, the Act received Royal Assent on 26 October 2023. It places strong new duties on online user-to-user platforms and search services to protect their users from harm.
As part of this, the Act gives service providers new “illegal content duties”. Under these duties, online platforms need to assess the risk that their services will allow users to encounter illegal content or be
“used for the commission or facilitation of a priority offence”.
They then need to take steps to mitigate identified risks. These will include implementing safety-by-design measures to reduce risks and content moderation systems to remove illegal content where it appears.
The Online Safety Act sets out a list of priority offences for the purposes of providers’ illegal content duties. These offences reflect the most serious and prevalent online illegal content and activity. They are set out in schedules to the Act. Platforms will need to take additional steps to tackle these kinds of illegal activities under their illegal content duties.
The priority offences list currently includes certain intimate image abuse offences. Through this statutory instrument, we are adding new intimate image abuse offences to the priority list. This replaces an old intimate image abuse offence, which has now been repealed. These new offences are in the Sexual Offences Act 2003. They took effect earlier this year. The older offence was in the Criminal Justice and Courts Act 2015. The repealed offence covered sharing intimate images where the intent was to cause distress. The new offences are broader; they criminalise sharing intimate images without having a reasonable belief that the subject would consent to sharing the images. These offences include the sharing of manufactured or manipulated images, including so-called deepfakes.
Since these new offences are more expansive, adding them as priority offences means online platforms will be required to tackle more intimate image abuse on their services. This means that we are broadening the scope of what constitutes illegal intimate image content in the Online Safety Act. It also makes it clear that platforms’ priority illegal content duties extend to AI-generated deepfakes and other manufactured intimate images. This is because the new offences that we are adding explicitly cover this content.
As I have set out above, these changes affect the illegal content duties in the Online Safety Act. They will ensure that tech companies play their part in kicking this content off social media. They are just part of a range of wider protections coming into force next spring under the Online Safety Act, which will mean that social media companies have to remove the most harmful illegal content, much of which disproportionately affects women and girls, such as harassment and controlling or coercive behaviour.
Ofcom will set out the specific steps that providers can take to fulfil their illegal content duties for intimate image abuse and other illegal content in codes of practice and guidance documentation. It is currently producing this documentation. We anticipate that the new duties will start to be enforced from spring next year once Ofcom has issued these codes of practice and they have come into force. Providers will also need to have done their risk assessment for illegal content by then. We anticipate that Ofcom will recommend that providers should take action in a number of areas. These include content moderation, reporting and complaints procedures, and safety-by-design steps, such as testing their algorithm systems to see whether illegal content is being recommended to users. We are committed to working with Ofcom to get these protections in place as quickly as possible. We are focused on delivering.
Where companies are not removing and proactively stopping this vile material appearing on their platforms, Ofcom will have robust powers to take enforcement action against them. This includes imposing fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is higher.
In conclusion, through this statutory instrument we are broadening providers’ duties for intimate image abuse content. Service providers will need to take proactive steps to search for, remove and limit people’s exposure to this harmful kind of illegal content, including where it has been manufactured or manipulated. I hope noble Lords will agree that these measures take the provisions in the Online Safety Act a useful step forward. I commend these regulations to the Committee, and I beg to move.
My Lords, I thank the Minister for her introduction. I endorse everything she said about intimate image abuse and the importance of legislation to make sure that the perpetrators are penalised and that social media outlets have additional duties under Schedule 7 for priority offences. I am absolutely on the same page as the Minister on this, and I very much welcome what she said. It is interesting that we are dealing with another 2003 Act that, again, is showing itself fit for purpose and able to be amended; perhaps there is some cause to take comfort from our legislative process.
I was interested to hear what the Minister said about the coverage of the offences introduced by the Online Safety Act. She considered that the offence of sharing sexually explicit material includes deepfakes. There was a promise—the noble Viscount will remember it—that the Criminal Justice Bill, which was not passed in the end, would cover that element. It included intent, like the current offence—the one that has been incorporated into Schedule 7. The Private Member’s Bill of the noble Baroness, Lady Owen—I have it in my hand—explicitly introduces an offence that does not require intent, and I very much support that.
I do not believe that this is the last word to be said on the kinds of IIA offence that need to be incorporated as priority offences under Schedule 7. I would very much like to hear what the noble Baroness has to say about why we require intent when, quite frankly, the creation of these deepfakes requires activity that is clearly harmful. We clearly should make sure that the perpetrators are caught. Given the history of this, I am slightly surprised that the Government’s current interpretation of the new offence in the Online Safety Act includes deepfakes. It is gratifying, but the Government nevertheless need to go further.
My Lords, I welcome the Minister’s remarks and the Government’s step to introduce this SI. I have concerns that it misses the wider problems. The powers given to Ofcom in the Online Safety Act require a lengthy process to implement and are not able to respond quickly. They also do not provide individuals with any redress. Therefore, this SI adding to the list of priority offences, while necessary, does not give victims the recourse they need.
My concern is that Ofcom is approaching this digital problem in an analogue way. It has the power to fine and even disrupt business but, in a digital space—where, when one website is blocked, another can open immediately—Ofcom would, in this scenario, have to restart its process all over again. These powers are not nimble or rapid enough, and they do not reflect the nature of the online space. They leave victims open and exposed to continuing distress. I would be grateful if the Government offered some assurances in this area.
The changes miss the wider problem of non-compliance by host websites outside the UK. As I have previously discussed in your Lordships’ House, the Revenge Porn Helpline has a removal rate of 90% of reported non-consensual sexually explicit content, both real and deepfake. However, in 10% of cases, the host website will not comply with the removal of the content. These sites are often hosted in countries such as Russia or those in Latin America. In cases of non-compliance by host websites, the victims continue to suffer, even where there has been a successful conviction.
Take the example of a man who was convicted in the UK of blackmailing 200 women: the Revenge Porn Helpline successfully removed 161,000 images, but 4,000 still remain online three years later, with platforms continuing to ignore the take-down requests. I would be grateful if the Government could outline how they are seeking to tackle the removal of this content, featuring British citizens, hosted in jurisdictions where host sites are not complying with removal requests.
(2 months, 1 week ago)
Lords Chamber
The noble Lord is absolutely right. The scale of violent images featuring women and girls in our country is intolerable, and this Government will treat it as the national emergency it is. The noble Lord will be pleased to hear that the Government have set out an unprecedented mission to halve violence against women and girls within a decade. We are using every government tool we have to target the perpetrators and address the root causes of violence. That involves many legislative and non-legislative measures, as the noble Lord will appreciate, including tackling the education issue. However, ultimately, we have to make sure that the legislation is robust and that we take action, which we intend to do.
My Lords, as the Minister and others have mentioned, there is considerable and increasing concern about deepfake pornographic material, particularly the so-called nudification apps, which can be easily accessed by users of any age. What action will the Government be taking against this unacceptable technology, and will an offence be included in the forthcoming crime and policing Bill?
The noble Lord raises an important point. Where nudification apps and other material do not come under the remit of the Online Safety Act, we will look at other legislative tools to make sure that all new forms of technology—including AI and its implications for online images—are included in robust legislation, in whatever form it takes. Our priority is to implement the Online Safety Act, but we are also looking at what other tools might be necessary going forward. As the Secretary of State has said, this is an iterative process; the Online Safety Act is not the end of the game. We are looking at what further steps we need to take, and I hope the noble Lord will bear with us.