Grand Committee

My Lords, I very much support the thrust of these amendments and what the noble Lord, Lord Knight, said in support of and in addition to them. I declare an interest as a current user of the national pupil database.
The proper codification of safeguards would be a huge help. As the noble Baroness, Lady Kidron, said, it would give us a foundation on which to build. I hope that, if they are going to go in this direction, the Government will take an immediate opportunity to do so because what we have here, albeit much more disorganised, is a data resource equivalent to what we have for the National Health Service. If we used all the data on children that these systems generate, we would find it much easier to know what works and in what circumstances, as well as how to keep improving our education system.
The fact that this data is tucked away in little silos—it is not shared and is not something that can be used on a national basis—is a great pity. If we have a national code as to how this data is handled, we enable something like the use of educational data in the way that the NHS proposes to use health data. Safeguards are needed on that level but the Government have a huge opportunity; I very much hope that it is one they will take.
I start by thanking all noble Lords who spoke; I enjoyed the vivid examples that were shared by so many of them. I particularly enjoyed the comment from the noble Lord, Lord Russell, about the huge gulf between guidance, of which there is far too much, and a code that actually drives matters forward.
I will speak much more briefly because this ground has been well covered already. Both the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, seek to introduce codes of practice to protect the data of children in education services. Amendment 138 in the name of the noble Lord seeks to introduce a code on processing personal data in education. This includes consultation for the creation of such a code—a highly important element because the safety of this data, as well as its eventual usage, is of course paramount. Amendment 141 in the name of the noble Baroness, Lady Kidron, also seeks to set out a code of practice to provide heightened protections for children in education.
Those amendments are absolutely right to include consultation. It is a particularly important area of legislation. It is important that it does not restrict what schools can do with their data in order to improve the quality and productivity of their work. I was very appreciative of the words of the noble Lord, Lord Knight, when he sketched out some of the possibilities of what becomes educationally possible when these techs are wisely and safely used. With individual schools often responsible for the selection of technologies and their procurement, the landscape is—at the risk of understatement—often more complex than we would wish.
Alongside that, the importance of the AI Safety Institute’s role in consultation cannot be overstated. The way in which tech and AI have developed in recent years means that its expertise on how safely to provide AI to this particularly vulnerable group is invaluable.
I very much welcome the emphasis that these amendments place on protecting children’s data, particularly in the realm of education services. Schools are a safe place. That safety being jeopardised by the rapid evolution of technology that the law cannot keep pace with would, I think we can all agree, be unthinkable. As such, I hope that the Government will give careful consideration to the points raised as we move on to Report.
My Lords, I rise to make a brief but emphatic comment from the health constituency. We in the NHS have been victims of appalling cyber-hacking. The pathology labs in south London were hacked and that cost many lives. It is an example of where the world is going in the future unless we act promptly. The emphatic call for quick action so that government keeps up with world changes is really well made. I ask the Minister to reflect on that.
My Lords, I, too, shall speak very briefly, which will save valuable minutes in which I can order my CyberUp Christmas mug.
Amendments 156A and 156B add to the definition of unauthorised access, so that it includes instances where a person accesses data in the reasonable knowledge that the controller would not consent if they knew about the access or the reason for it, and where that person is not empowered to access the data by an enactment. Amendment 156B introduces defences to this new charge. Given the amount of valuable personal data held by controllers, as our lives have moved increasingly online—as many speakers in this debate have vividly brought out—there is absolutely clear merit not just in this idea but in the pace implied, which many noble Lords have called for. There is a need for real urgency here, and I look forward to hearing more detail from the Minister.
My Lords, I turn to Amendments 156A and 156B, tabled by the noble Lord, Lord Holmes. I understand the strength of feeling and the need to provide legal protections for legitimate cybersecurity activities. I agree with the noble Lord that the UK should have the right legislative framework to allow us to tackle the harms posed by cybercriminals. We have heard examples of some of those threats this afternoon.
I reassure the noble Lord that this Government are committed to ensuring that the Computer Misuse Act remains up to date and effective in tackling criminality. We will continue to work with the cybersecurity industry, the National Cyber Security Centre and law enforcement agencies to consider whether there are workable proposals on this. The noble Lord will know that this is a complex and ongoing issue being considered as part of the review of the Computer Misuse Act being carried out by the Home Office. We are considering improved defences by engaging extensively with the cybersecurity industry, law enforcement agencies, prosecutors and system owners. However, engagement to date has not produced a consensus on the issue, even within the industry, and that is holding us back at this moment—but we are absolutely determined to move forward with this and to reach a consensus on the way forward.
I think the noble Lord, Lord Clement-Jones, said in the previous debate that the amendments were premature, and here that is certainly the case. The specific amendments that the noble Lord has tabled are premature, because we need a stronger consensus on the way forward, notwithstanding all the good reasons that noble Lords have given for why it is important that we have updated legislation. With these concerns and reasons in mind, I hope that the noble Lord will feel able to withdraw his amendment.
My Lords, the trouble with this House is that some have long memories. The noble Earl, Lord Erroll, reminded us all to look back, with real regret, at the Digital Economy Act and the failure to implement Part 3. I think that that was a misstep by the previous Government.
Like all of us, I warmly welcome the inclusion of data access provisions for researchers studying online safety matters in Clause 123 of the Bill. As we heard from the noble Baroness, Lady Kidron, and the noble Lord, Lord Knight, this was very much unfinished business from the Online Safety Act. However, I believe that, in order for the Bill to be effective and have the desired effect, the Government need to accept the amendments in the names of the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell. In terms of timeframe, the width of research possible, enforceability, contractual elements and location, they cover the bases extremely effectively.
The point was made extremely well by the noble Lords, Lord Bethell and Lord Russell, that we should not have to rely on brave whistleblowers such as Frances Haugen. We should be able to benefit from quality researchers, whether from academia or elsewhere, in order to carry out this important work.
My Amendment 198B is intended as a probing amendment about the definition of researchers under Clause 123, which has to be carefully drawn to allow for legitimate non-governmental organisations, academics and so on, but not so widely that it can be exploited by bad actors. For example, we do not want those who seek to identify potential exploits in a platform to shield themselves simply by describing themselves as "independent researchers". For instance, could Tommy Robinson seek to protect himself from liabilities in this way? After all, he called himself an "independent journalist" in another context when he clearly was not. I hope that when the Government come to draw up the regulations they will be mindful of the need to be very clear about what constitutes an independent or accredited researcher, or whatever phrase will be used in the context.
My Lords, although I have no amendments in this group, I will comment on some of them. I might jump around the order, so please forgive me for that.
Amendment 197 would change Clause 123 so that the Secretary of State must, as soon as reasonably practicable and no later than 12 months after the Act is passed, make regulations requiring regulated services to provide information for the purposes of research into online safety. This is clearly sensible. It would ensure that valuable research into online safety may commence as soon as possible, which would benefit us all, as speakers have made abundantly clear. To that end, Amendment 198D, which would ensure that researcher access is enforceable in the same way as other requirements under the Online Safety Act, would ensure that researchers can access valuable information and carry out their beneficial research.
I am still left with some curiosity on some of these amendments, so I will indicate where I have specific questions to those who have tabled them and hope they will forgive me if I ask to have a word with them between now and Report, which would be very helpful. In that spirit, I turn to Amendment 198B, which would allow the Secretary of State to define the term “independent researcher”. I ask the noble Lord, Lord Clement-Jones, who tabled the amendment, whether he envisages the Secretary of State taking advice before making such regulations and, if so, from whom and in what mechanism. I recognise that it is a probing amendment, but I would be keen to understand more.
I am also keen to understand further from my noble friend Lord Bethell and the noble Baroness, Lady Kidron, why, under Amendment 198A, the Secretary of State would not be able to make regulations providing for independent research into the “enforcement of requirements” under these regulations. Again, I look forward to discussing that with them.
I have some concerns about Amendment 198, which would require service providers to give information pertaining to age, stage of development, gender, race, ethnicity, disability and sexuality to researchers. I understand the importance of this but my concern is that it would require the disclosure of special category data to those researchers. I express reservations, especially if the data pertains to children. Do we have the right safeguards in place to address the obviously heightened risks here?
Additionally, I have some concerns about the provisions suggested in Amendment 198E. Should we allow researchers from outside the United Kingdom to require access to information from regulated service providers? Could this result in data being transferred into jurisdictions where there are less stringent data protection laws?
My Lords, I thank noble Lords who have welcomed the provisions in the Bill. I very much appreciate that we have taken on board the concerns that were raised in the debates on the previous legislation. I thank the noble Baroness, Lady Kidron, and the noble Lords, Lord Bethell and Lord Clement-Jones, for their amendments.
I will speak first to Amendment 197, tabled by the noble Baroness, Lady Kidron, which would compel the Secretary of State to create a framework and to do so within 12 months of passage. I understand and share her desire to ensure that a framework allowing researchers access is installed and done promptly. This is precisely why we brought forward this provision. I reassure her that the department will consult on the framework as soon as possible after the publication of Ofcom’s report.
Turning to Amendments 198 and 198B, tabled by the noble Baroness, Lady Kidron, and the noble Lord, Lord Clement-Jones, respectively, Clause 123 provides the Secretary of State with the power to make regulations relating to researchers’ access to data. I can reassure noble Lords that it does not limit the regulations to the non-exhaustive list of examples provided. I agree that fair and proportionate criteria for who is considered a researcher are critical to the success of the future framework. I reassure noble Lords that in the provision as currently written the Secretary of State can include in the design of the framework the specific requirements that a person must meet to be considered a researcher.
Turning to Amendments 198A and 198D, tabled by the noble Lord, Lord Bethell, while I am sympathetic to his desire to provide a future framework with the robust enforcement powers of the OSA, I assure him that as the provision is written, the Secretary of State can already use the existing enforcement powers of the OSA to support a future framework. Furthermore, should the evidence suggest that additional or different measures would be more effective and appropriate, this provision allows the Secretary of State the flexibility to introduce them.
Turning next to Amendments 198C and 198E, tabled by the noble Lord, Lord Bethell, I understand the spirit of these amendments and note the importance of this issue, given the global nature of the online world. It is entirely reasonable to allow researchers who are not based in the UK to utilise our researcher access framework, as long as the subject of their research is the experience of UK users online. I reassure him that the provisions as drafted already allow the Secretary of State to make regulations permitting non-UK-based researchers to use the framework where appropriate. We plan to use the evidence gathered through our own means and through Ofcom’s report to set out who will be eligible to use the framework in the secondary legislation.
Finally, turning to Amendment 198F, I am aware of the concern that researchers have encountered blockages to conducting research and I am sympathetic to the intentions behind the amendment. We must ensure that researchers can use the future framework without fear of legal action or other consequences. I am conscious that the noble Baroness, Lady Kidron, asked me a specific question about legal exemptions and I will write to her to make that answer much clearer. I reassure noble Lords that the Government are considering the specific issues that the noble Lord raises. For these reasons, I ask that the amendments not be pressed while the Government consider these issues further and I am of course happy to engage with noble Lords in the meantime.
My Lords, the UK is a world leader in genomics research. This research will no doubt result in many benefits, particularly in the healthcare space. However, genomics data can be, and increasingly is, exploited for deeply concerning purposes, including geostrategic ones.
Western intelligence agencies are reportedly becoming increasingly concerned about China using genomic data and biotechnology for military purposes. The Chinese Government have made it clear that genomics plays a key part in their civil-military fusion doctrine. The 13th five-year plan for military-civil fusion calls for the cross-pollination of military and civilian technology such as biotechnology. This statement, taken in conjunction with reports that the Beijing Genomics Institute—the BGI—in collaboration with the People's Liberation Army, is looking to make ethnically Han Chinese soldiers less susceptible to altitude sickness, makes for worrying reading. Genetically engineered soldiers appear to be moving out of fiction and towards reality.
The global genomics industry has grown substantially as a result of the Covid-19 pandemic and gene giant BGI Group and its affiliated MGI Tech have acquired large databases of DNA. Further, I note that BGI has widespread links to the Chinese state. It operates the Government’s key laboratories and national gene bank, itself a vast repository of DNA data drawn from all over the world. A Reuters investigation found that a prenatal test, NIFTY, sold by BGI to expectant mothers, gathered millions of women’s DNA data. This prenatal test was developed in collaboration with the Chinese military.
For these reasons, I think we must become far more protective of genomic data gathered from our population. While many researchers use genomic data to find cures for terrible diseases, many others, I am afraid, would use it to do us harm. To this end, I have tabled Amendment 199 to require the Secretary of State and the Information Commissioner to conduct frequent risk assessments on data privacy associated with genomics and DNA companies headquartered in countries that are systemic competitors or hostile actors. I believe this will go some way to preventing genomic data transfer out of the UK and to countries such as China that may use it for military purposes. I beg to move.
My Lords, I strongly support this amendment. As a former Minister, I was at the front line of genomic data and know how powerful it currently is and can be in the future. Having discussed this with the UK Biobank, I know that the issue of who stores and processes genomic data in the UK is a subject of huge and grave concern. I emphasise that the American Government have moved on this issue already and emphatically. There is the possibility that we will be left behind in global standards and will one day be an outlier if we do not close this important and strategically delicate loophole. For that reason, I strongly support this amendment.
My Lords, I thank the noble Viscount, Lord Camrose, for moving this amendment, which raises this important question about our genomics databases, and for the disturbing examples that he has drawn to our attention. He is right that the opportunities from harnessing genomic data come with very real risks. This is why the Government have continued the important work of the UK Biological Security Strategy of 2023, including by conducting a full risk assessment and providing updated guidance to reduce the risks from the misuse of sensitive data. We plan to brief the Joint Committee on the National Security Strategy on the findings of the risk assessment in the new year. Following that, I look forward to engaging with the noble Viscount on its outcome and on how we intend to take these issues forward. As he says, this is a vital issue, but in the meantime I hope he is prepared to withdraw his amendment.
I thank the Minister for her answer, and I very much accept her offer of engagement. I will make a few further brief comments about the importance of this amendment, as we go forward. I hope that other noble Lords will consider it carefully before Report.
I will set out a few reasons why I believe this amendment can benefit both the Bill and this country. The first is its scope. The amendment will allow the Secretary of State and the Information Commissioner to assess data security risks across the entirety of the genomic sector, covering consumers, businesses, citizens and researchers who may be partnering with state-linked genomics companies.
The second reason is urgency. DNA is regularly described as the "new gold" and it represents our most permanent identifier, revealing physical and mental characteristics, family medical history and susceptibility to diseases. Once it has been accessed, the damage from potential misuse cannot be reversed, and this places a premium on proactively scrutinising the potential risks to this data.
Thirdly, there are opportunities for global leadership. This amendment offers the UK an opportunity to take a world-leading role and become the first European country to take authoritative action to scrutinise data vulnerabilities in this area of critical technology. Scrutinising risks to UK genomic data security also provides a foundation to foster domestic genomics companies and solutions.
Fourthly, this amendment would align the UK with key security partners, particularly, as my noble friend Lord Bethell mentioned, the United States, which has already blacklisted certain genomics companies linked to China and taken steps to protect American citizens’ DNA from potential misuse.
The fifth and final reason is protection of citizens and consumers. This amendment would provide greater guidance and transparency to citizens and consumers whose DNA data is exposed to entities linked to systemic competitors. With all of that said, I thank noble Lords for their consideration and beg leave to withdraw my amendment.
My Lords, we have had some powerful speeches in this group, not least from the noble Baronesses, Lady Kidron and Lady Owen, who drafted important amendments that respond to the escalating harms caused by AI-generated sexual abuse material relating to children and adults. The amendment from the noble Baroness, Lady Kidron, would make it an offence to use personal data or digital information to create digital models or files that facilitate the creation of AI or computer-generated child sexual abuse material. As she outlined and the noble Lord, Lord Bethell, confirmed, it specifically would become an offence to create, train or distribute generative AI models that enable the creation of computer-generated CSAM or priority illegal content; to train AI models on CSAM or priority illegal content; or to possess AI models that produce CSAM or priority illegal content.
This amendment responds to a growing problem, as we have heard, around computer-generated sexual abuse material and a gap in the law. There is a total lack of safeguards preventing bad actors creating sexual abuse imagery, and it is causing real harm. Sites enabling this abuse are offering tools to harm, humiliate, harass, coerce and cause reputational damage. Without robust legal frameworks, victims are left vulnerable while perpetrators operate with impunity.
The noble Lord, Lord Bethell, mentioned the Internet Watch Foundation. In its report of July, One Step Ahead, it reported on the alarming rise of AI-generated CSAM. In October 2023, in How AI is Being Abused to Create Child Sexual Abuse Imagery, it made recommendations to the Government regarding legislation to strengthen legal frameworks to better address the evolving landscape of AI-generated CSAM and enhance preventive measures against its creation and distribution. It specifically recommended:
“That the Government legislates to make it an offence to use personal data or digital information to create digital models or files that facilitate the creation of AI or computer-generated child sexual abuse material”.
The noble Baroness, Lady Kidron, tabled such an amendment to the previous Bill. As she said, she was successful in persuading the then Government to accept it; I very much hope that she will be as successful in persuading this Government to accept her amendment.
Amendments 211G and 211H in the name of the noble Baroness, Lady Owen, are a response to the extraordinary fact that one in 14 adults has experienced threats to share intimate images in England and Wales; that rises to one in seven among young women. Research from Internet Matters shows that 49% of young teenagers in the UK aged between 13 and 16—around 750,000 children—said that they were aware of a form of image-based abuse being perpetrated against another young person known to them.
We debated the first of the noble Baroness’s amendments, which is incorporated in her Bill, last Friday. I entirely agree with the noble Lord, Lord Knight; I did not find the Government’s response at all satisfactory. I hope that, in the short passage of time between then and now, they have had time to be at least a little agile, as he requested. UK law clearly does not effectively address non-consensual intimate images. It is currently illegal to share or threaten to share non-consensual intimate images, including deepfakes, but creating them is not yet illegal; this means that someone could create a deepfake image of another person without their consent and not face legal consequences as long as they do not share, or threaten to share, it.
This amendment is extremely welcome. It addresses the gap in the law by criminalising the creation of non-consensual intimate images, including deepfakes. It rightly targets deepfakes due to their rising prevalence and potential for harm, particularly towards women. Research shows that 98% of deepfake videos online are pornographic, with 99% featuring women and girls. This makes it an inherently sexist problem that is a new frontier of violence against women—words that I know the noble Baroness has used.
I also very much welcome the new amendment not contained in her Bill, responding to what the noble Baroness, Lady Gohir, said at its Second Reading last Friday about including audio deepfakes. The words “shut down every avenue”, which I think were used by the noble Baroness, Lady Gohir, are entirely apposite in these circumstances. Despite what the noble Lord, Lord Ponsonby, said on Friday, I hope that the Government will accept both these amendments and redeem their manifesto pledge to ban the creation of sexually explicit deepfakes, whether audio or video.
My Lords, the current law does not sufficiently protect children from AI-driven CSAM because it is simply such a fast-moving issue. It is a sobering thought that, of all the many wonderful developments of AI that many of us have been predicting and speculating on for so long, CSAM is really driving the technology forward. What a depressing reflection that is.
Overall, AI is developing at an extraordinarily rapid pace and has come with a number of concerning consequences that are not all yet fully understood. However, it is understood that child sexual abuse is completely unacceptable in any and all contexts, and it is right that our law should be updated to reflect the dangers that have increased alongside AI development.
Amendment 203 seeks to create a specific offence for using personal data or digital information to create or facilitate the creation of computer-generated child sexual abuse material. Although legislation is in place to address possessing or distributing such horrendous material, we must prioritise the safety of children in this country and take the law a step further to prevent its creation. Our children must be kept safe and, subject to one reservation, which I will come to in a second, I support the amendment from the noble Baroness, Lady Kidron, to further protect them.
That reservation comes in proposed new subsection (1)(c), which includes in the offence the act of collating files that, when combined, enable the creation of sexual abuse material. This is too broad. A great deal of the collation of such material can be conducted by innocent people using innocent materials that are then corrupted or given more poisonous aspects by further training, fine-tuning or combination with other materials by more malign actors. I hope there is a way we can refine this proposed new paragraph on that basis.
Unfortunately, adults can also be the targets of individuals who use AI to digitally generate non-consensual explicit images or audio files of an individual, using their likeness and personal data. I am really pleased that my noble friend Lady Owen tabled Amendments 211G and 211H to create offences for these unacceptable, cruel acts. I support these amendments unambiguously.
My Lords, I thank the noble Baroness, Lady Kidron, for her Amendment 203. It goes without saying that the Government treat all child sexual abuse material with the utmost seriousness. I can therefore confirm to her and the Committee that the Government will bring forward legislative measures to address the issue in this Session and that the Home Office will make an announcement on this early in the new year.
On Amendments 211G and 211H, tabled by the noble Baroness, Lady Owen, the Government share concerns that more needs to be done to protect women from deepfake image abuse. This is why the Government committed in their manifesto to criminalise the creation of sexually explicit deepfake images of adults. I reassure the noble Baroness and the whole Committee that we will deliver on our manifesto commitment in this Session. The Government are fully committed to protecting the victims of tech-enabled sexual abuse. Tackling intimate audio would be a new area of law, but we continue to keep that legislation under review.
I also say to the noble Baroness that there is already a process under Section 153 of the Sentencing Act 2020 for the court to deprive a convicted offender of property, including images that have been used for the purpose of committing or facilitating any criminal offence. As well as images, that includes computers and mobile phones that the offender either used to commit intimate image offences or intended to use for that purpose in future. For those reasons and the reassurances I have given today, I hope that noble Lords will feel able to withdraw or not press their amendments.
Grand Committee

My Lords, the debate on this group emphasises how far behind the curve we are, whether it is by including new provisions in this Bill or by bringing forward an AI Bill—which, after all, was promised in the Government's manifesto. It emphasises that we are not moving nearly fast enough in thinking about the implications of AI. While we are doing so, I need to declare an interest as co-chair of the All-Party Parliamentary Group on AI and a consultant to DLA Piper on AI policy and regulation.
I have followed the progress of AI since 2016 in the capacity of co-chair of the all-party group and chair of the AI Select Committee. We need to move much faster on a whole range of different issues. I very much hope that the noble Lord, Lord Vallance, will be here on Wednesday, when we discuss our crawler amendments, because although the noble Lord, Lord Holmes, has tabled Amendment 211A, which deals with personality rights, there is also extreme concern about the whole area of copyright. I was tipped off by the noble Lord, Lord Stevenson, so I was slightly surprised that he did not draw our attention to it: we are clearly due the consultation at any moment on intellectual property, but there seems to be some proposal within it for personality rights themselves. Whether that is a quid pro quo for a much-weakened situation on text and data mining, I do not know, but something appears to be moving out there which may become clear later this week. It seems a strange time to issue a consultation, but I recognise that it has been somewhat delayed.
In the meantime, we are forced to put forward amendments to this Bill trying to anticipate some of the issues that artificial intelligence is increasingly giving rise to. I strongly support Amendments 92, 93, 101 and 105 put forward by the noble Viscount, Lord Colville, to prevent misuse of Clause 77 by generative AI developers; I very much support the noble Lord, Lord Holmes, in wanting to see protection for image, likeness and personality; and I very much hope that we will get a positive response from the Minister in that respect.
We have heard from the noble Baronesses, Lady Kidron and Lady Harding, and the noble Lords, Lord Russell and Lord Stevenson, all of whom have made powerful speeches on previous Bills—the then Online Safety Bill and the Data Protection and Digital Information Bill—to say that children should have special protection in data protection law. As the noble Baroness, Lady Kidron, says, we need to move on from the AADC. That was a triumph she gained during the passage of the Data Protection Act 2018, but six years later the world looks very different and young people need protection from AI models of the kind she has set out in Amendment 137. I agree with the noble Lord, Lord Stevenson, that we need to talk these things through. If it produces an amendment to this Bill that is agreed, all well and good, but it could mean an amendment or part of a new AI Bill when that comes forward. Either way, we need to think constructively in this area because protection of children in the face of generative AI models, in particular, is extremely important.
This group, looking ahead to further harms that could be caused by AI and how we can mitigate them in a number of different ways, is extremely important, despite the fact that these amendments appear to deal with quite a disparate set of issues.
My Lords, I too thank all noble Lords for their insightful contributions to this important group of amendments, even if some of them bemoaned the fact that they have had to repeat themselves over the course of several Bills. I am also very heartened to see how many people have joined us for Committee today. I have been involved in only two of these sittings, but this is certainly a record, and on present trends it is going to be standing room only, which is all to the good.
I have two observations before I start. First, we have to acknowledge that perhaps this area is among the most important we are going to discuss. The rights and protections of data subjects, particularly children, are in many ways the crux of all this and we have to get it right. Secondly, I absolutely take on board that there is a real appetite to get ahead on AI legislation. I have an amendment that I am very excited about later, particularly when we come to ADM, and there will be others as well; we need to get going on that.
Amendment 92 in the names of the noble Viscount, Lord Colville, and the noble Lord, Lord Clement-Jones, seeks to reduce the likelihood of the misuse of Clause 77 by AI model developers who may seek to claim that they do not need to notify data subjects of reuse for scientific purposes under that clause. This relates to the way that personal data is typically collected and processed for AI development. Amendment 93 similarly seeks to reduce the possibility of misuse of Clause 77 by model developers who could claim they do not need to notify data subjects of reuse for scientific purposes. Amendment 101 also seeks to address the potential misuse of Clause 77 by the developers, as does Amendment 105. I strongly support the intent of amendments from the noble Viscount, Lord Colville, and the noble Lord, Lord Clement-Jones, in seeking to maintain and make provisions for the rights and protections of data subjects, and look forward very much to hearing the views of the Minister.
I turn to Amendment 137 in the names of the noble Lords, Lord Russell and Lord Stevenson, and the noble Baronesses, Lady Kidron and Lady Harding. This amendment would require the commissioner to prepare and produce a code of practice which ensures that data processors prioritise the interests, rights and freedoms of children. It goes without saying that the rights and protection of children are of utmost importance. Certainly, this amendment looks to me to be not only practical but also proportionate, and I support it.
Finally, Amendment 211A in the name of my noble friend Lord Holmes ensures the prohibition of
“the development, deployment, marketing and sale of data related to an individual’s image, likeness or personality for AI training”
without that person’s consent. Like the other amendments in this group, this makes provision to strengthen the rights and protections of data subjects against the potential misuse or sale of data and seems entirely sensible. I am sure the Minister has listened carefully to all the concerns powerfully raised from all sides of the Committee today. It is so important that we do not lose sight of the importance of the rights and protection of data subjects.
My Lords, I welcome the amendments spoken to so well by the noble Baroness, Lady Harding, regarding the open electoral register. They are intended to provide legal certainty around the use of the register, without compromising on any aspect of the data privacy of UK citizens or risking data adequacy. The amendments specify that companies are exempt from the requirement to provide individuals with information in cases where their personal data has not been obtained directly from them if that data was obtained from the open electoral register. They also provide further clarification on what constitutes “disproportionate effort” under new paragraph 5(e) of Article 14 of GDPR.
The noble Baroness covered the ground so effectively that all I need to add is that the precedent established by the current interpretation by the tribunal will affect not only the open electoral register but other public sources of data, including the register of companies, the Registry of Judgments, Orders and Fines, the Land Registry and the Food Standards Agency register. Importantly, it may even prevent the important work being done to create a national data library from achieving its objectives of public sector data sharing. It will have far-reaching implications if we do not change the Bill in the way that the noble Baroness has put forward.
I thank the noble Lord, Lord Lucas, for his support for Amendment 160. I reciprocate in supporting—or, at least, hoping that we get clarification as a result of—his Amendments 158 and 161.
Amendment 159B seeks to ban what are colloquially known as cookie paywalls. As can be seen, it is the diametric opposite to Amendment 159A, tabled by the noble Viscount, Lord Camrose. For some unaccountable reason, cookie paywalls require a person who accesses a website or app to pay a fee to refuse consent to cookies being accessed from or stored on their device. Some of these sums can be extortionate and exorbitant, so I was rather surprised by the noble Viscount’s counter amendment.
Earlier this year, the Information Commissioner launched a call for views which looked to obtain a range of views on its regulatory approach to consent or pay models under data protection law. The call for views highlighted that organisations that are looking to adopt, or have already adopted, a consent-or-pay model must consider the data protection implications.
Cookie paywalls are a scam and reduce people’s power to control their data. I wonder why someone must pay if they do not consent to cookies being stored or accessed. The PEC regulations do not currently prohibit cookie paywalls. The relevant regulation is Regulation 6, which is due to be substituted by Clause 111, and is supplemented by new Schedule A1 to the PEC regulations, as inserted by Schedule 12 to the Bill. The regulation, as substituted by Clause 111 and Schedule 12, does not prohibit cookie paywalls. This comes down to the detail of the regulations, both as they currently are and as they will be if the Bill remains as drafted. It is drafted in terms that do not prevent a person signifying lack of consent to cookies, and a provider may add or set controls—namely, by imposing requirements—for how a person may signify that lack of consent. Cookie paywalls would therefore be completely legal, and they certainly have proliferated online.
This amendment makes it crystal clear that a provider must not require a person to pay a fee to signify lack of consent to their data being stored or accessed. This would mean that, in effect, cookie paywalls would be banned.
Amendment 160 is sought by the Advertising Association. It seeks to ensure that the technical storage of or access to information is considered necessary under paragraph 5 of the new Schedule A1 to the PEC regulations inserted by Schedule 12 if it would support measurement or verification of the performance of advertising services to allow website owners to charge for their advertising services more accurately. The Bill provides practical amendments to the PEC regulations through listing the types of cookies that no longer require consent.
This is important, as not all cookies should be treated the same and not all carry the same high-level risks to personal privacy. Some are integral to the service and the website itself and are extremely important for subscription-free content offered by publishers, which is principally funded by advertising. Introducing specific and targeted cookie exemptions has the benefit of, first, simplifying the cookie consent banner and, secondly, increasing further legal and economic certainty for online publishers. As I said when we debated the DPDI Bill, audience measurement is an important function for media owners to determine the consumption of content, to be able to price advertising space for advertisers. Such metrics are crucial to assess the effectiveness of a media channel. For sites that carry advertising, cookies are used to verify the delivery and performance of a digital advertisement—i.e., confirmation that an ad has been served or presented to a user and whether it has been clicked on. This is essential information to invoice an advertiser accurately for the number of ad impressions in a digital ad campaign.
However, my reading of the Bill suggests that audience measurement cookies would be covered by the list of exemptions from consent under Schedule 12. Can the Government confirm this? Is it the Government’s intention to use secondary legislation in future to exempt ad performance cookies?
Coming to Amendment 162 relating to the soft opt-in, I am grateful to the noble Lord, Lord Black of Brentwood, and the noble Baroness, Lady Harding of Winscombe, for their support. This amendment would enable charities to communicate to donors in the same way that businesses have been able to communicate to customers since 2003. The clause will help to facilitate greater fundraising and support the important work that charities do for society. I can do no better than quote from the letter that was sent to Secretary of State Peter Kyle on 25 November, which was co-ordinated by the DMA and involved nearly 20 major charities, seeking support for reinstating the original Clause 115 of the DPDI Bill into this Bill:
“Clause 115 of the previous DPDI Bill extended the ‘soft opt-in’ for email marketing for charities and non-commercial organisations. The DMA estimates that extending the soft opt-in to charities would increase annual donations in the UK by £290 million”,
based on analysis of 13.1 million donors by the Salocin Group. The letter continues:
“At present, the DUA Bill proposals remove this. The omission of the soft opt-in will prevent charities from being able to communicate to donors in the same way as businesses can. As representatives of both corporate entities and charitable organisations, it is unclear to the DMA why charities should be at a disadvantage in this regard”.
I hope that the Government will listen to the DMA and the charities involved.
I thank noble Lords for their comments and contributions. I shall jump to Amendments 159A and 159B, one of which is in my name and both of which are concerned with cookie paywalls. I am not sure I can have properly understood the objection to cookie paywalls. Do they not simply offer users three choices: pay money and stay private; share personal data and read for free; or walk away? So many times, we have all complained about the fact that these websites harvest our data and now, for the first time, this approach sets a clear cash value on the data that they are harvesting and offers us the choice. The other day somebody sent me a link from the Sun. I had those choices. I did not want to pay the money or share my data, so I did not read the article. I feel this is a personal decision, supported by clear data, which it is up to the individual to take, not the Government. I do not think we should take away this choice.
Let me turn to some of the other amendments in this group. Amendment 161 in the name of my noble friend Lord Lucas is, if I may say so, a thoughtful amendment. It would allow pension providers to communicate information on their product. This may mean that the person who will benefit from that pension does not miss out on useful information that would benefit their saving for retirement. Given that pension providers already hold the saver’s personal data, it seems to be merely a question of whether this information is wanted; of course, if it is not, the saver can simply opt out.
Amendment 162 makes an important point: many charities rely on donations from the public. Perhaps we should consider bringing down the barriers to contacting people regarding fundraising activities. At the very least, I am personally not convinced that members of the public have different expectations around what kinds of organisation can and cannot contact them and in what circumstances, so I support any step that simplifies the—to my mind—rather arbitrary differences in the treatment of business and charity communications.
Amendment 104 certainly seems a reasonable addition to the list of what might constitute “disproportionate effort” if the information is already public. However, I have some concerns about Amendments 98 and 100 to 103. For Amendment 98, who would judge the impact on the individual? I suspect that the individual and the data controllers may have different opinions on this. In Amendment 100, the effort and cost of compliance are thorny issues that would surely be dictated by the nature of the data itself and the reason for providing it to data subjects. In short, I am concerned that the controllers’ view may be more subjective than we would want.
On Amendment 102, again, when it comes to providing information to them,
“the damage and distress to the data subjects”
is a phrase on which the subject and the controller will almost inevitably have differing opinions. How will these be balanced? Additionally, one might presume that information that is either damaging or distressing to the data subjects should not necessarily be withheld from them as it is likely to be extremely important.
My Lords, we have covered a range of issues in our debate on this grouping; nevertheless, I will try to address each of them in turn. I thank the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Harding, for their Amendments 95, 96, 98, 100, 102 to 104 and 106 regarding notification requirements.
First, with regard to the amendments in the name of the noble Baroness, Lady Harding, I say that although the Government support the use of public data sources, transparency is a key data protection principle. We do not agree that such use of personal data should remove or undermine the transparency requirements. The ICO considers that the use and sale of open electoral register data alone is likely not to require notification. However, when the data is combined with data from other sources, in order to build an extensive profile to be sold on for direct marketing, notification may be proportionate since the processing may go beyond the individual’s reasonable expectations. When individuals are not notified about processing, it makes it harder for them to exercise their data subject rights, such as the right to object.
Adding other factors to the list of what constitutes a “disproportionate effort” for notification is unnecessary given that the list is already non-exhaustive. The “disproportionate effort” exemption must be applied according to the safeguards of the wider data protection framework. According to the fairness principle, controllers should already account for whether the processing meets the reasonable expectations of a data subject. The data minimisation and purpose limitation principles also act as an important consideration for data controllers. Controllers should continue to assess on a case-by-case basis whether they meet the threshold for the existing exemptions to notify; if not, they should notify. I hope that this helps clarify our position on that.
My Lords, I rise briefly to support my friend, the noble Lord, Lord Clement-Jones, and his string of amendments. He made the case clearly: it is simply about access, the right to redress and a clear pathway to that redress, a more efficient process and clarity and consistency across this part of our data landscape. There is precious little point in having obscure remedies or rights—or even, in some cases, as we have discussed in our debates on previous groups, no right or obvious pathways to redress. I believe that this suite of amendments addresses that issue. Again, I full-throatedly support them.
My Lords, I address the amendments tabled by the noble Lord, Lord Clement-Jones. These proposals aim to transfer jurisdiction from courts to tribunals; to establish a new right of appeal against decisions made by the Information Commissioner; and to grant the Lord Chancellor authority to implement tribunal procedure rules. I understand and recognise the noble Lord’s intent here, of course, but I have reservations about these amendments and urge caution in accepting them.
The suggestion to transfer jurisdiction from courts to tribunals raises substantial concerns. Courts have a long-standing authority and expertise in adjudicating complex legal matters, including data protection cases. By removing these disputes from the purview of the courts, the risk is that we undermine the depth and breadth of legal oversight required in such critical areas. Tribunals, while valuable for specialised and expedited decisions, may not provide the same level of rigorous legal analysis.
Cases such as those cited by the noble Lord, Lord Clement-Jones—Killock and another v the Information Commissioner and Delo v the Information Commissioner—demonstrate to me the intricate interplay between data protection, administrative discretion and broader legal principles. It is questionable whether tribunals, operating under less formal procedures, can consistently handle such complexities without diminishing the quality of justice. Further, I am not sure that the claim that this transfer will streamline the system and reduce burdens on the courts is fully persuasive. Shifting cases to tribunals does not eliminate complexity; it merely reallocates it, potentially at the expense of the detailed scrutiny that these cases demand.
I turn to the right of appeal against the commissioner’s decisions. Although the introduction of a right of appeal against these decisions may seem like a safeguard, it risks creating unnecessary layers of litigation. The ICO already operates within a robust framework of accountability, including judicial review for cases of legal error or improper exercise of discretion. Adding a formal right of appeal risks encouraging vexatious challenges, overwhelming the tribunal system and diverting resources from addressing genuine grievances.
I think we in my party understand the importance of regulatory accountability. However, creating additional mechanisms should not come at the expense of efficiency and proportionality. The existing legal remedies are designed to strike an appropriate balance, and further appeals risk creating a chilling effect on the ICO’s ability to act decisively in protecting data rights.
On tribunal procedure rules and centralised authority, the proposed amendment granting the Lord Chancellor authority to set tribunal procedure rules bypasses the Tribunal Procedure Committee, an independent body designed to ensure that procedural changes are developed with judicial oversight. This move raises concerns about the concentration of power and the erosion of established checks and balances. I am concerned that this is a case of expediency overriding the principles of good governance. While I acknowledge that consultation with the judiciary is included in the amendment, it is not a sufficient substitute for the independent deliberative processes currently in place. The amendment risks undermining the independence of our legal institutions and therefore I have concerns about it.
These amendments overall, while presented as technical fixes, and certainly I recognise the problem and the intent, would have far-reaching consequences for our data protection framework. The vision of my party for governance is one that prioritises stability, legal certainty and the preservation of integrity. We must avoid reforms that, whatever their intent, introduce confusion or inefficiency or undermine public trust in our system. Data protection is, needless to say, a cornerstone of our modern economy and individual rights. As such, any changes to its governance must be approached with the utmost care.
I thank the noble Lord, Lord Clement-Jones, for his Amendments 108, 146 to 153 and 157, and I am grateful for the comments by the noble Lord, Lord Holmes, and the noble Viscount, Lord Camrose.
The effect of this group of amendments would be to make the First-tier Tribunal and the Upper Tribunal responsible for all data protection cases. They would transfer ongoing as well as future cases out of the court system to the relevant tribunals and, as has been alluded to, may cause more confusion in doing so.
As the noble Lord is aware, there is currently a blend of jurisdiction under the data protection legislation for both tribunals and courts according to the nature of the proceedings in question. This is because certain types of cases are appropriate to fall under tribunal jurisdiction while others are more appropriate for court settings. For example, claims by individuals against organisations for breaches of legal requirements can result in awards of compensation for the individuals and financial and reputational damage for the organisations. It is appropriate that such cases are handled by a court in conformance with their strict procedural and evidential rules. Indeed, in the Killock and Delo cases it was noted that there could be additional confusion, given the ability to move between those two routes, if jurisdiction went solely to one of the tribunals.
On the transfer of responsibility for making tribunal procedural rules from the Tribunal Procedure Committee to the Lord Chancellor, we think that would be inappropriate. The committee is comprised of legal experts appointed or nominated by senior members of the judiciary or the Lord Chancellor. This committee is best placed to make rules to ensure that tribunals are accessible and fair and that cases are dealt with quickly and efficiently. It keeps the rules under constant review to ensure that they are fit for purpose in line with new appeal rights and the most recent legislative changes.
Amendment 151 would also introduce a statutory appeals procedure for tribunals to determine the merits of decisions made by the Information Commissioner. Data subjects and controllers alike can already challenge the merits of the Information Commissioner’s decisions by way of judicial review in a way that would preserve the discretion and independence of the Information Commissioner’s decision-making, so no statutory procedure is needed. The Government therefore believe that the current jurisdictional framework is well-balanced and equitable, and that it provides effective and practical routes of redress for data subjects and controllers as well as appropriate safeguards to ensure compliance by organisations. For these reasons, I hope the noble Lord will not press his amendments.
My Lords, in speaking to this group of amendments I must apologise to the Committee that, when I spoke last week, I forgot to mention my interests in the register, specifically as an unpaid adviser to the Startup Coalition. For Committee, noble Lords will realise that I have confined myself to amendments that may be relevant to our healthcare and improving that.
I will speak to Amendments 111 and 116 in the names of my noble friends Lord Camrose and Lord Markham, and Amendment 115 from my noble friend Lord Lucas and the noble Lords, Lord Clement-Jones and Lord Knight of Weymouth, as well as other amendments, including from my noble friend Lord Holmes—I will probably touch on most amendments in this group. To illustrate my concerns, I return to two personal experiences that I shared during debate on the Data Protection and Digital Information Bill. I apologise to noble Lords who have heard these examples previously, but they illustrate the points being made in discussing this group of amendments.
A few years ago, when I was supposed to be travelling to Strasbourg, my train to the airport got delayed. My staff picked me up, booked me a new flight and drove me to the airport. I got to the airport with my new boarding pass and scanned it to get into the gate area, but as I was about to get on the flight, I scanned my pass again and was not allowed on the flight. No one there could explain why, having been allowed through security, I was not allowed on the flight. To cut a long story short, after two hours of being gaslighted by four or five staff, who would not even admit that they could not explain things to me, I eventually had to return to the check-in desk—something all the automation was supposed to avoid—to ask what had happened. The airline claimed that it had sent me an email that day. The next day, it admitted that it had not sent me an email. It then explained what had happened by saying that a flag had gone off in its system. That was the only explanation offered.
This illustrates the point about human intervention, but it is also about telling customers and others what happens when something goes wrong. The company clearly had not trained its staff in how to speak to customers or in transparency. Companies such as that airline get away with this sort of disgraceful behaviour all the time, but imagine if such technology were being used in the NHS. Imagine the same scenario: you turn up for an operation, and you scan your barcode to enter the hospital—possibly even the operating theatre—but you are denied access. There must be accountability, transparency and human intervention, and, in these instances, there has to be human intervention immediately. These things are critical.
I know that this Bill makes some sort of differentiation between more critical and less critical ADM, but let me illustrate my point with another example. A few years ago, I applied for an account with one of those whizzy fintech banks. Its slogan was: “We are here to make money work for everyone”. I downloaded the app and filled out the fields, then a message popped up telling me, “We will get back to you within 48 hours”. Two weeks later, I got a message on the app saying that I had been rejected and that, by law, the bank did not have to explain why. Once again, I ask noble Lords to imagine. Imagine Monzo’s technology being used on the NHS app, which many people currently use for repeat prescriptions or booking appointments. What would happen if you tried to book an appointment but you received a message saying, “Your appointment has been denied and, by law, we do not have to explain why”? I hope that we would have enough common sense to ensure that there is human intervention immediately.
I realise that the noble Lord, Lord Clement-Jones, has a Private Member’s Bill on this issue—I am sorry that I have not been able to take part in those debates—but, for this Bill, I hope that the two examples I have just shared illustrate the point that I know many noble Lords are trying to make in our debate on this group of amendments. I look forward to the response from the Minister.
I thank all noble Lords who have spoken. I must confess that, of all the groups we are looking at today, I have been particularly looking forward to this one. I find this area absolutely fascinating.
Let me begin in that spirit by addressing an amendment in my name and that of my noble friend Lord Markham, and I ask the Government and all noble Lords to give it considerable attention. Amendment 111 seeks to insert the five principles set out in the AI White Paper published by the previous Government and to require all those participating in ADM—indeed, in all forms of AI—to have due regard for them. They are:
“safety, security and robustness, appropriate transparency and explainability, fairness, accountability and governance, and contestability and redress”.
These principles for safe AI are based on those originally developed with the OECD and have been the subject of extensive consultation. They have been refined and very positively received by developers, public sector organisations, private sector organisations and civil society. They offer real safeguards against the risks of AI while continuing to foster innovation.
I will briefly make three points to commend their inclusion in the Bill, as I have described. First, the Bill team has argued throughout that these principles are already addressed by the principles of data protection and so are covered in the Bill. There is overlap, of course, but I do not agree that they are equivalent. Data protection is a significant concern in AI but the risks and, indeed, the possibilities of AI go far further than data protection. We simply cannot entrust all our AI risks to data protection principles.
Secondly, I think the Government will point to their coming AI Bill and suggest that we should wait for that before we move significantly on AI. However, in practice all we have to go on about the Bill—I recognise that Ministers cannot describe much of it now—is that it will focus on the largest AI labs and the largest models. I assume it will place existing voluntary agreements on a statutory footing. In other words, we do not know when the Bill is coming, but this approach will allow a great many smaller AI fish to slip through the net. If we want to enshrine principles into law that cover all use of AI here, this may not quite be the only game in town, but it is certainly the only all-encompassing, holistic game in town likely to be positively impactful. I look forward to the Minister’s comments on this point.
The Secretary of State can help describe specific cases in the future but, on the point made by my noble friend Lord Knight, the ICO guidance will clarify some of that. There will be prior consultation with the ICO before that guidance is finalised, but if noble Lords are in any doubt about this, I am happy to write and confirm that in more detail.
Amendment 115 in the names of the noble Lords, Lord Clement-Jones, Lord Lucas and Lord Knight, and Amendment 123A in the name of the noble Lord, Lord Holmes, seek to ensure that individuals are provided with clear and accessible information about solely automated decision-making. The safeguards set out in Clause 80, alongside the wider data protection framework’s safeguards, such as the transparency principle, already achieve this purpose. The UK GDPR requires organisations to notify individuals about the existence of automated decision-making and provide meaningful information about the logic involved in a clear and accessible format. Individuals who have been subject to solely automated decisions must be provided with information about the decisions.
On Amendment 116 in the names of the noble Viscount, Lord Camrose, and the noble Lord, Lord Markham, I reassure noble Lords that Clause 69 already provides a definition of consent that applies to all processing under the law enforcement regime.
On Amendment 117 in the names of the noble Viscount, Lord Camrose, the noble Lord, Lord Markham, and my noble friend Lord Knight, I agree with them on the importance of protecting children’s sensitive personal data when it is processed by law enforcement agencies, and there is extensive guidance on this issue. However, consent is rarely used as the basis for processing law enforcement data. Other law enforcement purposes, such as the prevention, detection and investigation of crime, are quite often used instead.
I will address Amendment 118 in the name of the noble Viscount, Lord Camrose, and Amendment 123B in the name of the noble Lord, Lord Holmes, together, as they focus on obtaining human intervention for a solely automated decision. I agree that human intervention should be carried out competently and by a person with the authority to correct a wrongful outcome. However, the Government believe that there is currently no need to specify the qualifications of human reviewers as the ICO’s existing guidance explains how requests for human review should be managed.
Does the Minister agree that the crux of this machinery is solely automated decision-making as a binary thing—it is or it is not—and, therefore, that the absolute key to it is making sure that the humans involved are suitably qualified and finding some way to do so, whether by writing a definition or publishing guidelines?
On the question of qualification, the Minister may wish to reflect on the broad discussions we have had in the past around certification and the role it may play. I gently take her back to what she said on Amendment 123A about notification. Does she see notification as the same as a personalised response to an individual?
My Lords, I had expected the noble Baroness, Lady Owen of Alderley Edge, to be in the Room at this point. She is not, so I wish to draw the Committee’s attention to her Amendment 210. On Friday, many of us were in the Chamber when she made a fantastic case for her Private Member’s Bill. It obviously dealt with a much broader set of issues but, as we have just heard, the overwhelming feeling of the House was to support her. I think we would all like to see the Government wrap it up, put a bow on it and give it to us all for Christmas. But, given that that was not the indication we got, I believe that the noble Baroness’s intention here is to deal with the fact that the police are giving phones and devices back to perpetrators with the images remaining on them. That is an extraordinary revictimisation of people who have been through enough. So, whether or not this is the exact wording or way to do it, I urge the Government to look on this carefully and positively to find a way of allowing the police the legal right to delete data in those circumstances.
My Lords, none of us can be under any illusion about the growing threats of cyberattacks, whether from state actors, state-affiliated actors or criminal gangs. It is pretty unusual nowadays to find someone who has not received a phishing email, had hackers target an account or been promised untold riches by a prince from a faraway country. But, while technology has empowered these criminals, it is also the most powerful tool we have against them. To that end, we must do all we can to assist the police, the NCA, the CPS, the SIS and their overseas counterparts in countries much like our own. That said, we must also balance this assistance with the right of individuals to privacy.
Regarding the Clause 81 stand part notice from the noble Lord, Lord Clement-Jones, I respectfully disagree with this suggestion. If someone within the police were to access police records in an unauthorised capacity or for malign reasons, I simply doubt that they would be foolish enough to enter their true intentions into an access log. They would lie, of course, rendering the log pointless, so I struggle to see—we had this debate on the DPDI Bill—how this logging system would help the police to identify unauthorised access to sensitive data. It would simply eat up hours of valuable police time. I remember from our time working on the DPDI Bill that the police supported this view.
As for Amendment 124, which allows for greater collaboration between the police and the CPS when making charging decisions, there is certainly something to be said for this principle. If being able to share more detailed information would help the police and the CPS come to the best decision for victims, society and justice, then I absolutely support it.
Amendments 126, 128 and 129 seek to keep the UK in close alignment with the EU regarding data sharing. EU alignment or non-alignment is surely a decision for the Government of the day alone. We should not look to bind a future Administration to the EU.
I understand that Amendment 127 looks to allow data transfers to competent authorities—that is, law enforcement bodies in other countries—that may have a legitimate operating need. Is this not already the case? Are there existing provisions in the Bill to facilitate such transfers and, if so, does this not therefore duplicate them? I would very much welcome the thoughts of both the Minister and the noble Lord, Lord Clement-Jones, when he sums up at the end.
Amendment 156A would add to the definition of “unauthorised access” so that it includes instances where a person accesses data in the reasonable knowledge that the controller would not consent if they knew about the access or the reason for the access, and the person is not empowered to access it by an enactment. Given the amount of valuable personal data held by controllers as our lives continue to move online, there is real merit to this idea from my noble friend Lord Holmes, and I look forward to hearing the views of the Minister.
Finally, I feel Amendment 210 from my noble friend Lady Owen—ably supported in her unfortunate absence by the noble Baroness, Lady Kidron—is an excellent amendment as it prevents a person convicted of a sexual offence from retaining the images that breached the law. This will prevent them from continuing to use the images for their own ends and from sharing them further. It would help the victims of these crimes regain control of these images which, I hope, would be of great value to those affected. I hope that the Minister will give this serious consideration, particularly in light of noble Lords’ very positive response to my noble friend’s Private Member’s Bill at the end of last week.
I think the noble Viscount, Lord Camrose, referred to Amendment 156A from the noble Lord, Lord Holmes—I think he will find that is in a future group. I saw the Minister looking askance because I doubt whether she has a note on it at this stage.
I thank the noble Lord, Lord Clement-Jones; let me consider it a marker for future discussion.
I thank the noble Lord, Lord Clement-Jones, for coming to my rescue there.
I turn to the Clause 81 stand part notice tabled by the noble Lord, Lord Clement-Jones, which would remove Clause 81 from the Bill. Section 62 of the Data Protection Act requires law enforcement agencies to record their processing activities, including their reasons for accessing and disclosing personal information. Entering a justification manually was intended to help detect unauthorised access. The noble Lord was right that the police do sometimes abuse their power; however, I agree with the noble Viscount, Lord Camrose, that the reality is that anyone accessing the system unlawfully is highly unlikely to record that, making this an ineffective safeguard.
Meanwhile, the position of the National Police Chiefs’ Council is that this change will not impede any investigation concerning the unlawful processing of personal data. Clause 81 does not remove the strong safeguards that ensure accountability for data use by law enforcement, including the requirement to record the time and date of access and, where possible, who has accessed the data; these records are far more effective in monitoring potential data misuse. We would argue that the requirement to manually record a justification every time case information is accessed places a considerable burden on policing. I think the noble Lord himself said that we estimate that this clause may save approximately 1.5 million policing hours, equivalent to a saving in the region of £42.8 million a year.
These four technical government amendments do not, we believe, have a material policy effect but will improve the clarity and operation of the Bill text.
Amendment 133 amends Section 199 of the Investigatory Powers Act 2016, which provides a definition of “personal data” for the purposes of bulk personal datasets. This definition cross-refers to Section 82(1) of the Data Protection Act 2018, which is amended by Clauses 88 and 89 of the Bill, providing for joint processing by the intelligence services and competent authorities. This amendment will retain the effect of that cross-reference to ensure that processing referred to in Section 199 of the IPA remains that done by an intelligence service.
Amendment 136 concerns Clause 92 and ICO codes of practice. Clause 92 establishes a new procedure for panels to consider ICO codes of practice before they are finalised. It includes a regulation-making power for the Secretary of State to disapply or modify that procedure for particular codes or amendments to them. Amendment 136 will enable the power to be used to disapply or modify the panel’s procedure for specific amendments or types of amendments to a code, rather than for all amendments to it.
Finally, Amendments 213 and 214 will allow for changes made to certain immigration legislation and the Online Safety Act 2023 by Clauses 55, 122 and 123 to be extended via existing powers in those Acts, exercisable by Orders in Council, to Guernsey and the Isle of Man, should they seek this.
I beg to move.
My Lords, I will keep my comments brief as these are all technical amendments to the Bill. I understand that Amendments 133 and 136 are necessary for the functioning of the law and therefore have no objection. As for Amendment 213, extending immigration legislation amended by Clause 55 of this Bill to the Bailiwick of Guernsey or the Isle of Man, this is a sensible measure. The same can be said for Amendment 214, which extends the provision of the Online Safety Act 2023, amended by this Bill, to the Bailiwick of Guernsey or the Isle of Man.
My Lords, given the hour, I will try to be as brief as possible. I will start by speaking to the amendments tabled in my name.
Amendment 142 seeks to prevent the Information Commissioner’s Office sending official notices via email. Official notices from the ICO will not be trivial: they relate to serious matters of data protection, such as monetary penalty notices or enforcement notices. My concern is that it is all too easy for an email to be missed. An email may be filtered into a spam folder, where it sits for weeks before being picked up. It is also possible that an email may be sent to a compromised email address, meaning one that the holder has lost control of due to a hacker. These concerns led me also to table Amendment 143, which removes the assumption that a notice sent by email has been received within 48 hours of being sent.
Additionally, I suspect I am right in saying that a great many people expect official correspondence to arrive via the post. I wonder, therefore, whether there might be a risk that people ignore an unexpected email from the ICO, concerned that it might well be a scam or a hack of some description. I, for one, am certainly deeply suspicious of unexpected but official-looking messages that arrive. I believe that official correspondence which may have legal ramifications should really be sent by post.
On some of the other amendments tabled, Amendment 135A, which seeks to introduce a measure from the DPDI Bill, makes provision for the introduction of a statement of strategic priorities by the Secretary of State that sets out the Government’s data protection priorities, to which the commissioner must have regard, and the commissioner’s duties in relation to the statement. Although I absolutely accept that this measure would create more alignment and efficiency in the way that data protection is managed, I understand the concerns that it would undermine the independence of the Information Commissioner’s Office. That in itself, of course, would tend to bear on the adequacy risk.
I do not support the stand part notices on Clauses 91 and 92. Clause 91 requires the Information Commissioner to prepare codes of practice for the processing of data, which seems a positive measure. It provides guidance to controllers, helping them to follow best practice when processing data, and is good for data subjects, as it is more likely that their data will be processed in an appropriate manner. As for Clause 92, which would effectively increase expert oversight of codes of practice, surely that would lead to more effective codes, which will benefit both controllers and data subjects.
I have some concerns about Amendment 144, which limits the Information Commissioner to sending only one reprimand to a given controller during a fixed period. If a controller or processor conducts activities that infringe the provisions of the GDPR and does so repeatedly, why should the commissioner be prevented from issuing reprimands? Indeed, what incentives does that give for people to commit a minor sin and then a major one later?
I welcome Amendment 145, in the name of the noble Baroness, Lady Kidron, which would ensure that the ICO’s annual report records activities and action taken by the ICO in relation to children. This would clearly give the commissioner, parliamentarians and the data and tech industry as a whole a better understanding of how policies are affecting children and what changes may be necessary.
Finally, I turn my attention to many of the amendments tabled by the noble Lord, Lord Clement-Jones, which seek to remove the involvement of the Secretary of State from the functions of the commissioner and transfer the responsibility from government to Parliament. I absolutely understand the arguments the noble Lord advances, as persuasively as ever, but I am concerned even so that the Secretary of State for the relevant department is the best person to work with the commissioner to ensure both clarity of purpose and rapidity of decision-making.
I wanted to rise to my feet in time to stop the noble Viscount leaping forward as he gets more and more excited as we reach—I hope—possibly the last few minutes of this debate. I am freezing to death here.
I wish only to add my support to the points of the noble Baroness, Lady Kidron, on Amendment 145. It is a much overused saw, but if it is not measured, it will not get reported.
(1 week, 3 days ago)
Lords Chamber

My Lords, of course I must start by joining others in thanking the noble Lord, Lord Clement-Jones, for bringing forward this timely and important Bill, with whose aims we on these Benches strongly agree. As public bodies take ever more advantage of new technological possibilities, surely nothing is more critical than ensuring that they do so in a way that adheres to principles of fairness, transparency and accountability.
It was also particularly helpful to hear from the noble Lord the wide range of very specific examples of the problems caused by what I will call AADM for brevity. I felt that they really brought it to life. I also take on board the point made by the noble Lord, Lord Knight, about hiring and firing by AADM. The way this is done is incredibly damaging and, frankly, if I may say so, too often simply boneheaded.
The point made by the noble Baroness, Lady Lane-Fox, about procurement is absolutely well founded: I could not agree more strongly that this is a crucial area for improvement. That point was well supported by the noble Baroness, Lady Freeman of Steventon, as well. I thought that the argument, powerful as ever, from the noble Lord, Lord Tarassenko, for sovereign AI capabilities was also particularly useful, and I hope that the Government will consider how to take that forward. Finally, I really welcomed the point made so eloquently by the noble Baroness, Lady Hamwee, in reminding us that just the existence of a human in the loop is a completely insufficient condition for making these things effective.
We strongly support the goal of this Bill: to ensure trustworthy AI that deserves public confidence, fosters innovation and contributes to economic growth. However, the approach proposed raises—for me, anyway—several concerns that I worry could hinder its effectiveness.
First, definition is a problem. Clause 2(1) refers to “any algorithmic … systems” but, of course, “algorithmic” can have a very broad definition: it can encompass any process, even processes that are unrelated to digital or computational systems. While the exemptions in subsections (2) and (4) are noted, did the noble Lord give consideration to adopting or incorporating the AI White Paper’s definition around autonomy and adaptiveness, or perhaps just the definition around AADM used in the DUA Bill, which we will no doubt be discussing much more on Monday? We feel that improving the definition would provide some clarity and better align the scope with the Bill’s purpose.
I also worry that the Bill fails to address the rapid pace of AI development. For instance, I worry that requiring ongoing assessments for every update under Clause 3(3) is impractical, given that systems often change daily. This obligation should be restricted to significant changes, thereby ensuring that resources are spent where they matter most.
I worry, too, about the administrative burden that the Bill may create. For example, Clause 2(1) demands a detailed assessment even before a system is purchased. I feel that that is unrealistic, particularly with pilot projects that may operate in a controlled way but in a production environment, not in a test environment as described in Clause 2(2)(b). Would that potentially risk stifling exploration and innovation, and, indeed, slowing procurement within the public sector?
Another area of concern is communication. It is so important that AI gains public trust and that people come to understand the systems and the safeguards in place around them. I feel that the Bill should place greater emphasis on explaining decisions to the general public in ways that they can understand rapidly, so that we can ensure that transparency is not only achieved but perceived.
Finally, the Bill is very prescriptive in nature, and I worry that such prescriptiveness ends up being ineffective. Would it be a more effective approach, I wonder, to require public bodies to have due regard for the five principles of AI outlined in the White Paper, allowing them the flexibility to determine how best to meet those standards, but in ways that take account of the wildly differing needs, approaches and staffing of the public bodies themselves? Tools such as the ATRS could obviously be made available to assist, but I feel that public bodies should have the agency to find the most effective solutions for their own circumstances.
Let me finish with three questions for the Minister. First, given the rapid pace of tech change, what consideration will be given to ensure that public authorities can remain agile and responsive, while continuing to meet the Bill’s requirements? Secondly, the five principles of AI set out in the White Paper by the previous Government offer a strong foundation for guiding public bodies. Will the Minister consider whether allowing flexibility in how these principles are observed might achieve the Bill’s goals, while reducing the administrative burdens and encouraging innovation? Thirdly, what measures will be considered to build public trust in AI systems, ensuring that the public understand both the decisions made and the safeguards in place around them?
(1 week, 6 days ago)
Grand Committee

I start by thanking all noble Lords who spoke for their comments and fascinating contributions. We on these Benches share the concern of many noble Lords about the Bill allowing the use of data for research purposes, especially scientific research purposes.
Amendment 59 has, to my mind, the entirely right and important intention of preventing misuse of the scientific research exemption for data reuse by ensuring that the only purpose for which the reuse is permissible is scientific research. Clearly, there is merit in this idea, and I look forward to hearing the Minister give it due consideration.
However, there are two problems with the concept and definition of scientific research in the Bill overall, and, again, I very much look forward to hearing the Government’s view. First, I echo the important points raised by my noble friend Lord Markham. Almost nothing in research or, frankly, life more broadly, is done with only one intention. Even the most high-minded, curiosity-driven researcher will have at the back of their mind the possibility of commercialisation. Alongside protecting ourselves from the cynical misuse of science as a cover story for commercial pursuit, we have to be equally wary of creating law that pushes for the complete absence of the profit motive in research, because to the extent that we succeed in doing that, we will see less research. Secondly—the noble Viscount, Lord Colville, and the noble Lord, Lord Clement-Jones, made this point very powerfully—I am concerned that the broad definition of scientific research in the Bill might muddy the waters further. I worry that, if the terminology itself is not tightened, restricting the exemption might serve little purpose.
On Amendment 62, to which I have put my name, the same arguments very much apply. I accept that it is very challenging to find a form of words that both encourages research and innovation and does not do so at the expense of data protection. Again, I look forward to hearing the Government’s view. I am also pleased to have signed Amendment 63, which seeks to ensure that personal data can be reused only if doing so is in the public interest. Having listened carefully to some of the arguments, I feel that the public interest test may be more fertile ground than a kind of research motivation purity test to achieve that very difficult balance.
On Amendment 64, I share the curiosity to hear how the Minister defines research and statistical processes—again, not easy, but I look forward to her response.
Amendment 65 aims to ensure that research seeking to use the scientific research exemption from obtaining consent meets the minimum levels of scientific rigour. The aim of the amendment is, needless to say, excellent. We should seek to avoid creating opportunities which would allow companies—especially but not uniquely AI labs—to cloak their commercial research as scientific, thus reducing the hoops they must jump through to reuse data in their research without explicit consent. However, Amendment 66, tabled in my name, which inserts the words:
“Research considered scientific research that is carried out as a commercial activity must be subject to the approval of an independent ethics committee”,
may be a more adaptive solution.
Many of these amendments show that we are all quite aligned in what we want but that it is really challenging to codify that in writing. Therefore, the use of an ethics committee to conduct these judgments may be the more agile, adaptive solution.
I confess that I am not sure I have fully understood the mechanism behind Amendments 68 and 69, but I of course look forward to the Minister’s response. I understand that they would essentially mean consent by failing to opt out. If so, I am not sure I could get behind that.
Amendment 130 would prevent the processing of personal data for research, archiving and statistical purposes if it permits the identification of a living individual. This is a sensible precaution. It would prevent the sharing of unnecessary or irrelevant information and protect people’s privacy in the event of a data breach.
Amendment 132 appears to uphold existing patient consent for the use of their data for research, archiving and statistical purposes. I just wonder whether this is necessary. Is that not already the case?
Finally, I turn to the Clause 85 stand part notice. I listened carefully to the noble Lord, Lord Clement-Jones, but I am not, I am afraid, at a point where I can support this. There need to be safeguards on the use of data for this purpose; I feel that Clause 85 is our way of having them.
My Lords, it is a great pleasure to be here this afternoon. I look forward to what I am sure will be some excellent debates.
We have a number of debates on scientific research; it is just the way the groupings have fallen. This is just one of several groupings that will, in different ways and from different directions, probe some of these issues. I look forward to drilling down into all the implications of scientific research in the round. I should say at the beginning—the noble Lord, Lord Markham, is absolutely right about this—that we have a fantastic history of and reputation for doing R&D and scientific research in this country. We are hugely respected throughout the world. We must be careful that we do not somehow begin to demonise some of those people by casting aspersions on a lot of the very good research that is taking place.
A number of noble Lords said that they are struggling to know what the definition of “scientific research” is. A lot of scientific research is curiosity driven; it does not necessarily have an obvious outcome. People start a piece of research, either in a university or on a commercial basis, and they do not quite know where it will lead them. Then—it may be 10 or 20 years later—we begin to realise that the outcome of their research has more applications than we had ever considered in the past. That is the wonderful thing about human knowledge: as we build and we learn, we find new applications for it. So I hope that whatever we decide and agree on in this Bill does not put a dampener on that great aspect of human knowledge and the drive for further exploration, which we have seen in the UK in life sciences in particular but also in other areas such as space exploration and quantum. Noble Lords could probably identify many more areas where we are increasingly getting a reputation for being at the global forefront of this thinking. We have to take the public with us, of course, and get the balance right, but I hope we do not lose sight of the prize we could have if we get the regulations and legislation right.
Let me turn to the specifics that have been raised today. Amendments 59 and 62 to 65 relate to scientific provisions, and the noble Lord, Lord Clement-Jones, the noble Viscount, Lord Colville, and others have commented on them. I should make it clear that this Bill is not expanding the meaning of “scientific research”. If anything, it is restricting it, because the reasonableness test that has been added to the legislation—along with clarification of the requirement for research to have a lawful basis—will constrain the misuse of the existing definition. The definition is tighter, and we have attempted to do that in order to make sure that some of the new developments and technologies coming on stream will fall clearly within the constraints we are putting forward in the Bill today.
Amendments 59 and 62 seek to prevent misuse of the exceptions for data reuse. I assure the noble Viscount, Lord Colville, that the existing provisions for research purposes already prevent the controller taking advantage of them for any other purpose they may have in mind. That is controlled.
My Lords, I have to admit that I am slightly confused by the groupings at this point. It is very easy to have this debate in the medical space, to talk about the future of disease, fixing diseases and longevity, but my rather mundane questions have now gone unanswered twice. Perhaps the Minister will write to me about where the Government see scientific research on product development in some of these other spaces.
We will come back to the question of scraping and intellectual copyright, but I want to add my support to my noble friend Lord Freyberg’s amendment. I also want to add my voice to the question of the AI Bill that is coming. Data is fundamental to the AI infrastructure; data is infrastructure. I do not understand how we can have a data Bill that does not have one eye on AI, looking towards it, or how we are supposed to understand the intersection between the AI Bill and the data Bill if the Government are not more forthcoming about their intentions. At the moment, we are seeing a reduction in data protection that looks as though it is anticipating, or creating a runway for, certain sorts of companies.
Finally, I am sorry that the noble Lord is no longer in his place, but later amendments look at creating sovereign data assets around the NHS and so on, and I do not think that those of us who are arguing to make sure that it is not a free-for-all are unwilling to create, or are not interested in creating, ways in which the huge investment in the NHS and other datasets can be realised for UK plc. I do not want that to appear to be where we are starting just because we are unhappy about the roadway that Clause 67 appears to create.
Many thanks to the noble Lords who have spoken in this debate and to the noble Lord, Lord Freyberg, for his Amendment 60. Before I start, let me endorse and add my name to the request for something of a briefing about the AI Bill. I am concerned that we will put a lot of weight of expectation on that Bill. When it comes, if I understand this right, it will focus on the very largest AI labs and may not necessarily get to all the risks that we are talking about here.
Amendment 60 seeks to ensure that the Bill does not allow privately funded or commercial activities to be considered scientific research in order
“to avert the possibility that such ventures might benefit from exemptions in copyright law relating to data mining”.
This is a sensible, proportionate measure to achieve an important end, but I have some concerns about the underlying assumption, as it strikes me. There is a filtering criterion of whether or not the research is taxpayer funded; that feels like a slightly crude means of predicting the propensity to infringe copyright. I do not know where to take that so I shall leave it there for the moment.
Amendment 61 in my name would ensure that data companies cannot justify data scraping for AI training as scientific research. As many of us said in our debate on the previous group, as well as in our debate on this group, the definition of “scientific research” in the Bill is extremely broad. I very much take on board the Minister’s helpful response on that but, I must say, I continue to have some concerns about the breadth of the definition. The development of AI programs, funded privately and as part of a commercial enterprise, could be considered scientific, so I believe that this definition is far too broad, given that Article 8A(3), to be inserted by Clause 71(5), states:
“Processing of personal data for a new purpose is to be treated as processing in a manner compatible with the original purpose where … the processing is carried out … for the purposes of scientific research”.
By tightening up the definition of “scientific research” to exclude activities that are primarily commercial, we would prevent companies from creating a scientific pretence for research that is wholly driven by commercial gain rather than furthering our collective knowledge. I would argue that, if we wish to allow these companies to build and train AI—we must, or others will—we must put in proper safeguards for people’s data. Data subjects should have the right to consent to their data being used in such a manner.
Amendment 65A in the name of my noble friend Lord Holmes would also take steps to remedy this concern. I believe that this amendment would work well in tandem with Amendment 61. It makes it absolutely clear that we expect AI developers to obtain consent from data subjects before they use or reuse their data for training purposes. For now, though, I shall not press my amendment.
My Lords, I share the confusion of the noble Baroness, Lady Kidron, about the groupings. If we are not careful, we are going to keep returning to this issue again and again over four or five groups.
With the possible exception of the noble Lord, Lord Lucas, I think that we are all very much on the same page here. On the suggestion from the noble Viscount, Lord Colville, that we meet to discuss the precise issue of the definition of “scientific research”, this would be extremely helpful; the noble Baroness and I do not need to repeat the concerns.
I should declare an interest in two respects: first, my interests as regards AI, which are set out on the register; and, secondly—I very much took account of what the noble Viscount, Lord Camrose, and the noble Lord, Lord Markham, had to say—I chair the council of a university that has a strong health faculty. It does a great deal of health research and a lot of that research relies on NHS datasets.
This is not some sort of Luddism we are displaying here. This is caution about the expansion of the definition of scientific research, so that it does not turn into something else: that it does not deprive copyright holders of compensation, and that it does not allow personal data to be scraped off the internet without consent. There are very legitimate issues being addressed here, despite the fact that many of us believe that this valuable data should of course be used for the public benefit.
One of the key themes—this is perhaps where we come back on to the same page as the noble Lord, Lord Lucas—may be public benefit, which we need to reintroduce so that we really understand that scientific research for public benefit is the purpose we want this data used for.
I do not think I need to say much more: this issue is already permeating our discussions. It is interesting that we did not get on to it in a major way during the DPDI Bill, yet this time we have focused much more heavily on it. Clearly, in opposition, the noble Viscount has seen the light. What is not to like about that? Further discussion, not least of the amendment of the noble Baroness, Lady Kidron, further down the track will be extremely useful.
My Lords, Amendments 66, 67 and 80 in this group are all tabled in my name. Amendment 66 requires scientific research carried out for commercial purposes to
“be subject to the approval of an independent ethics committee”.
Commercial research is, perhaps counterintuitively, generally subjected to fewer ethical safeguards than research carried out purely for scientific endeavour by educational institutions. Given the current broad definition of scientific research in the Bill—I am sorry to repeat this—which includes research for commercial purposes, and the lower bar for obtaining consent for data reuse should the research be considered scientific, I think it would be fair to require more substantial ethical safeguards on such activities.
We do not want to create a scenario where unscrupulous tech developers use the Bill to harvest significant quantities of personal data under the guise of scientific endeavour to develop their products, without having to obtain consent from data subjects or even without them knowing. An independent ethics committee would be an excellent way to monitor scientific research that would be part of commercial activities, without capping data access for scientific research, which aims more purely to expand the horizon of our knowledge and benefit society. Let us be clear: commercial research makes a huge and critically important contribution to scientific research, but it is also surely fair to subject it to the same safeguards and scrutiny required of non-commercial scientific research.
Amendment 67 would ensure that data controllers cannot gain consent for research purposes that cannot be defined at the time of data collection. As the Bill stands, consent will be considered obtained for the purposes of scientific research if, at the time consent is sought, it is not possible to identify fully the purposes for which the personal data is to be processed. I fully understand that there needs to be some scope to take advantage of research opportunities that are not always foreseeable at the start of studies, particularly multi-year longitudinal studies, but which emerge as such studies continue. I am concerned, however, that the current provisions are a little too broad. In other words: is consent not actually being given at the start of the process for, effectively, any future purpose?
Amendment 80 would prevent the data reuse test being automatically passed if the reuse is for scientific purposes. Again, I have tabled this amendment due to my concerns that research which is part of commercial activities could be artificially classed as scientific, and that other clauses in the Bill would therefore allow too broad a scope for data harvesting. I beg to move.
My Lords, it seems very strange indeed that Amendment 66 is in a different group from group 1, which we have already discussed. Of course, I support Amendment 66 from the noble Viscount, Lord Camrose, but in response to my suggestion for a similar ethical threshold, the Minister said she was concerned that scientific research would find this to be too bureaucratic a hurdle. She and many of us here sat through debates on the Online Safety Bill, now an Act. I was also on the Communications Committee when it looked at digital regulations and came forward with one of the original reports on this. The dynamic and impetus which drove us to worry about this was the lack of ethics within the tech companies and social media. Why on earth would we want to unleash some of the most powerful companies in the world on reusing people’s data for scientific purposes if we were not going to have an ethical threshold involved in such an Act? It is important that we consider that extremely seriously.
I am not quite sure about the groupings, either, but let us go with what we have. I thank noble Lords who have spoken, and the noble Viscount, Lord Camrose, for his amendments. I hope I am able to provide some reassurance for him on the points he raised.
As I said when considering the previous group, the Bill does not expand the definition of scientific research. The reasonableness test, along with clarifying the requirement for researchers to have a lawful basis, will significantly reduce the misuse of the existing definition. The amendment seeks to reduce the potential for misuse of the definition of scientific research by commercial companies using AI by requiring researchers working for a commercial company to submit their research to an ethics committee. As I said on the previous group, making it a mandatory requirement for all research may impede studies in areas that might have their own bespoke ethical procedures. This may well be the case in a whole range of different research areas, particularly in the university sector, and in sectors more widely. Some of this research may be very small to begin with but might grow in size. The idea that a small piece of start-up research has to be cleared by an ethics committee at an early stage is expecting too much and will put off a lot of the new innovations that might otherwise come forward.
Amendment 80 relates to Clause 71 and the reuse of personal data. This would put at risk valuable research that relies on data originally generated from diverse contexts, since the difference between the purposes may not always be compatible.
Turning to Amendment 67, I can reassure noble Lords that the concept of broad consent is not new. Clause 68 reproduces the text from the current UK GDPR recitals because the precise purpose of scientific research may become clear only during later analysis of the data. Obtaining broad consent for an area of research from the outset allows scientists to focus on potentially life-saving research. Clause 68 has important limitations. It cannot be used if the researcher already knows the specific purpose—an important safeguard that should not be removed. It also includes a requirement to give the data subject the choice to consent to only part of the research processing, if possible. Most importantly, the data subject can revoke their consent at any point. I hope this reassures the noble Viscount, Lord Camrose, and he feels content to withdraw his amendment on this basis.
I thank the noble Viscount, Lord Colville, and the noble Lord, Lord Clement-Jones, for their remarks and support, and the Minister for her helpful response. Just over 70% of scientific research in the UK is privately funded, 28% is taxpayer funded and around 1% comes through the charity sector. Perhaps the two most consequential scientific breakthroughs of the last five years, Covid vaccines and large language models, have come principally from private funding.
My Lords, I support these amendments in the names of the noble Lords, Lord Stevenson and Lord Clement-Jones. It is a pleasure to follow the second ex-Health Minister this afternoon. In many ways, the arguments are just the same for health data as they are for all data. It is just that, understandably, it is at the sharpest end of this debate. Probably the most important point for everybody to realise, although it is espoused so often, is that there is no such thing as NHS data. It is a collection of the data of every citizen in this country, and it matters. Public trust matters significantly for all data but for health data in particular, because it goes so close to our identity—our very being.
Yet we know how to do public trust in this country. We know how to engage and have had significant success in public engagement decades ago. What we could do now with human-led technology-supported public engagement could be on such a positive and transformational scale. But, so far, there has been so little on this front. Let us not talk of NHS data; let us always come back to the fundamental principle encapsulated in this group of amendments and across so many of our discussions on the Bill. Does the Minister agree that it is about not NHS data but our data—our decisions—and, through that, if we get it right, our human-led digital futures?
Many thanks to all noble Lords who have proposed and supported these amendments. I will speak to just a few of them.
Amendment 70 looks to mitigate the lowering of the consent threshold for scientific research. As I have set out on previous groups, I too have concerns about that consent threshold. However, for me the issue is more with the definition of scientific research than with the consent threshold, so I am not yet confident that the amendment is the right way to achieve those desirable aims.
Amendment 71 would require that no NHS personal data can be made available for scientific research without the explicit consent of the patient. I thank the noble Lords, Lord Stevenson of Balmacara and Lord Clement-Jones, for raising this because it is such an important matter. While we will discuss this in other groups, as the noble Baroness, Lady Kidron, points out, it is something we need to get right.
I regret to advise my noble friend Lord Holmes that I was going to start my next sentence with the words “Our NHS data”, but I will not. The data previously referred to is a very significant and globally unique national asset, comprising many decades of population-wide, cradle-to-grave medical data. No equivalent at anything like the same scale or richness exists anywhere, which makes it incredibly valuable. I thank my noble friend Lord Kamall for stressing this point with, as ever, the help of Jimi Hendrix.
However, that data is valuable only to the extent that it can be safely exploited for research and development purposes. The data can collectively help us develop new medicines or improve the administration and productivity of the NHS, but we need to allow it to do so properly. I am concerned that this amendment, if enacted, would create too high an operational and administrative barrier to the safe exploitation of this data. I have no interest in compromising on the safety, but we have to find a more efficient and effective way of doing it.
Amendments 79, 81 and 131 all look to clarify that the definition of consent to be used is in line with the definition in Article 4.11 of the UK GDPR:
“‘consent’ of the data subject means any freely given, specific, informed and unambiguous indication of the data subject’s wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her”.
This amendment would continue the use of a definition that is well understood. However, paragraph 3(a) of new Article 8A appears sufficient, in that the purpose for which a data subject consents is “specified, explicit and legitimate”.
Finally, with respect to Clause 77 stand part, I take the point and believe that we will be spending a lot of time on these matters going forward. But, on balance and for the time being, I feel that this clause needs to remain, as there must be clear rules on what information should be provided to data subjects. We should leave it in for now, although we will no doubt be looking to polish it considerably.
My Lords, I want to ask the Minister and the noble Lord, Lord Clement-Jones, in very general terms for their views on retrospectivity. Do they believe that the changes to data protection law in the Bill are intended to be applied to data already held at this time, or will the new regime apply only to personal data collected going forward from this point? I ask that specifically in respect of data pertaining to children, from whom sensitive data has already been collected. Will the forthcoming changes to data protection law apply to such data that controllers and processors already hold, or will they apply only to data collected in future?
I thank in particular the noble Lord, Lord Clement-Jones, who has clearly had his Weetabix this morning. I will comment on some of the many amendments tabled.
On Amendments 73, 75, 76, 77, 83 and 90, I agree it is concerning that the Secretary of State can amend such important legislation via secondary legislation. However, these amendments are subject to the affirmative procedure and, therefore, to parliamentary scrutiny. Since the DPDI Bill proposed the same, I have not changed my views; I remain content that this is the right level of oversight and that these changes do not need to be made via primary legislation.
As for Amendment 74, preventing personal health data from being considered a legitimate interest seems wise. It is best to err on the side of caution when it comes to sharing personal health data.
Amendment 77 poses an interesting suggestion, allowing businesses affiliated by contract to be treated in the same way as large businesses that handle data from multiple companies in a group. This would certainly be beneficial for SMEs collaborating on a larger project. However, each such business may have different data protection structures and terms of use. Therefore, while this idea certainly has merit, I am a little concerned that it may benefit from some refining to ensure that the data flows between businesses in a way to which the data subject has consented.
On Amendment 78A and Schedule 4 standing part, there are many good, legitimate interest reasons why data must be quickly shared and processed, many of which are set out in Schedule 4: for example, national security, emergencies, crimes and safeguarding. This schedule should therefore be included in the Bill to set out the details on these important areas of legitimate interest processing. Amendment 84 feels rather like the central theme of all our deliberations thus far today, so I will listen with great interest, as ever, to the Minister’s response.
I have some concerns about Amendment 85, especially the use of the word “publicly”. The information that may be processed for the purposes of safeguarding vulnerable individuals is likely to be deeply sensitive and should not be publicly available. Following on from this point, I am curious to hear the Minister’s response to Amendment 86. It certainly seems logical that provisions should be in place so that individuals can regain control of their personal data should the reason for their vulnerability be resolved. As for the remaining stand part notices in this group, I do not feel that these schedules should be removed because they set out important detail on which we will come to rely.
My Lords, I think we sometimes forget, because the results are often so spectacular, the hard work that has had to happen over the years to get us to where we are, particularly in relation to the Online Safety Act. It is well exemplified by the previous speaker. He put his finger on the right spot in saying that we all owe considerable respect for the work of the noble Baroness, Lady Kidron, and others. I helped a little along the way. It is extraordinary to feel that so much of this could be washed away if the Bill goes forward in its present form. I give notice that I intend to work with my colleagues on this issue because this Bill is in serious need of revision. These amendments are part of that and may need to be amplified in later stages.
I managed to sign only two of the amendments in this group. I am sorry that I did not sign the others, because they are also important. I apologise to the noble Lord, Lord Clement-Jones, for not spotting them early enough to be able to do so. I will speak to the ones I have signed, Amendments 88 and 135. I hope that the Minister will give us some hope that we will be able to see some movement on this.
The noble Lord, Lord Russell, mentioned the way in which the wording on page 113 seems not only to miss the point but to devalue the possibility of seeing protections for children well placed in the legislation. New Clause 120B(e), which talks of
“the fact that children may be less aware of the risks and consequences associated with processing of personal data and of their rights in relation to such processing”,
almost says it all for me. I do not understand how that could possibly have got through the process by which this came forward, but it seems to speak to a lack of communication between parts of government that I hoped this new Government, with their energy, would have been able to overcome. It speaks to the fact that we need to keep an eye on both sides of the equation: what is happening in the online safety world and how data that is under the control of others, not necessarily those same companies, will be processed in support or otherwise of those who might wish to behave in an improper or illegal way towards children.
At the very least, what is in these amendments needs to be brought into the Bill. In fact, other additions may need to be made. I shall certainly keep my eye on it.
My Lords, I thank the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, for bringing forward amendments in what is a profoundly important group. For all that data is a cornerstone of innovation and development, as we have often argued in this Committee, we cannot lose sight of our responsibility to safeguard the rights and welfare of our children.
I start by speaking to two amendments tabled in my name.
Amendment 91 seeks to change
“the definition of request by data subjects to data controllers”
that can be declined or
“for which a fee can be charged from ‘manifestly unfounded or excessive’ to ‘vexatious or excessive’”.
I am sure that many of us will remember, without a great deal of fondness, our debates on these terms in the DPDI Bill. When we debated this issue at that time, it was, rather to my regret, often presented as a way to reduce protections and make it easier to decline or charge a fee for a subject access request. In fact, the purpose was to try to filter out cynical or time-wasting requests, such as attempts to bypass legal due process or to bombard organisations with vast quantities of essentially meaningless access requests. Such requests are not unfounded but they are harmful; by reducing them, we would give organisations more time and capacity to respond to well-founded requests. I realise that I am probably on a loser on this one but let me encourage noble Lords one last time to reconsider their objections and take a walk on the vexatious side.
Amendment 97 would ensure that
“AI companies who process data not directly obtained from data subjects are required to provide information to data subjects where possible. Without this amendment, data subjects may not know their data is being held”.
If a subject does not even know that their data is being held, they cannot enforce their data rights.
Amendment 99 follows on from that point, seeking to ensure that AI companies using large datasets cannot avoid providing information to data subjects on the basis that their datasets are too large. Again, if a subject does not know that their data is being held, they cannot enforce their rights. Therefore, it is really important that companies cannot avoid telling individuals about their personal data and the way in which it is being used because of sheer weight of information. These organisations are specialists in such processing of huge volumes of data, of course, so I struggle to accept that this would be too technically demanding for them.
Let me make just a few comments on other amendments tabled by noble Lords. Under Amendment 107, the Secretary of State would have
“to publish guidance within six months of the Act’s passing to clarify what constitutes ‘reasonable and proportionate’ in protection of personal data”.
I feel that this information should be published at the same time as this Bill comes into effect. It serves no purpose to have six months of uncertainty.
I do not believe that Amendment 125 is necessary. The degree to which the Government wish to align—or not—with the EU is surely a matter for the Government and their priorities.
Finally, I was struck by the interesting point that the noble and learned Lord, Lord Thomas, made when he deplored the Bill’s incomprehensibility. I have extremely high levels of personal sympathy with that view. To me, the Bill is the source code. There is a challenge in making it comprehensible and communicating it in a much more accessible way once it goes live. Perhaps the Minister can give some thought to how that implementation phase could include strong elements of communication. While that does not make the Bill any easier to understand for us, it might help the public at large.
My Lords, the problem is that I have a 10-minute speech and there are five minutes left before Hansard leaves us, so is it sensible to draw stumps at this point? I have not counted how many amendments I have, but I also wish to speak to the amendment by the noble and learned Lord, Lord Thomas. I would have thought it sensible to break at this point.
Grand Committee
My Lords, I start by reflecting on the strangeness of the situation—to me, anyway. Here we all are again, in slightly different seats but with a largely similar Bill. As I said at Second Reading, we welcome this important Bill; it is absolutely crucial to get our data economy right. We have a number of amendments to the Bill, a great many of which are probing. The overall theme of our amendments is how to make the Bill maximally effective at the important job that it sets out to do.
The terminology of data law is well understood. Lawmakers, lawyers, businesses and data subjects are all to some extent familiar with the terminology. A “controller” means
“the natural or legal person, public authority, agency or other body which, alone or jointly with others, determines the purposes and means of the processing of personal data”.
A “processor” means
“a natural or legal person, public authority, agency or other body which processes personal data on behalf of the controller”.
We are all familiar with those terms.
In this Bill, new terms are introduced, named “data holder” and “trader”. A data holder, in relation to customer data or business data of a trader, is the trader, or
“a person who, in the course of a business, processes the data”.
How is that materially different from a processor? A trader is described as a person who supplies or provides
“goods, services or digital content”
in the course of business, whether personally, through someone acting in the trader’s name, or on the trader’s behalf. Again, I ask how that is different from a controller.
While I grant that this may seem a very small point in a very large Bill, already data regulations are relatively poorly understood and difficult to follow. Therefore, surely there is no real need to make them more complex by introducing overlapping terms just for this one section of the Bill. As I explained in our explanatory note, this is a probing amendment, and I hope the Minister will be able to explain why these terms are materially different from the existing terms, why they are necessary and so on. If so, I would of course be happy to withdraw my amendment. I beg to move.
Just to follow on from that, I very much support my noble friend’s words. The only reason I can see why you would introduce new definitions is that there are new responsibilities that are different, and you would want people to be aware of the new rules that have been placed on them. I will be interested to hear the Minister’s answer. If that is the case, we can set that out and understand whether the differences are so big that you need a whole new category, as my noble friend said.
Having run lots of small businesses myself, I am aware that, with every new definition that you add, you add a whole new set of rules and complications. As a business owner, how am I going to find out what applies to me and how I am to be responsible? The terms trader, controller, data holder and processor all sound fairly similar, so how will I understand what applies to me and what does not? To the other point that my noble friend made, the more confusing it gets, the less likelihood there is that people will understand the process.
First, let me say what a pleasure it is to be back on this old ground again, although with slightly different functions this time round. I very much support what the noble Viscount, Lord Camrose, said. We want to get the wording of this Bill right and to have a robust Bill; that is absolutely in our interests. We are on the same territory here. I thank the noble Viscount and other noble Lords for expressing their interest.
On Amendments 1 and 2, the Government consider the terms used in Part 1, as outlined in Clause 1, necessary to frame the persons and the data to which a scheme will apply. The noble Lord, Lord Clement-Jones, mentioned the powers. I assure him that the powers in Part 1 sit on top of the Data Protection Act. They are not there instead of it; they are another layer on top of it, and they provide additional rights over and above what already exists.
In relation to the specific questions from the noble Viscount, Lord Camrose, and the noble Lord, Lord Markham, smart data schemes require suppliers or providers of goods, services or digital content to provide data. They are referred to as “traders” in accordance with recent consumer legislation, including the Consumer Rights Act 2015. The term “data holder” ensures that the requirements may also be imposed on any third party that might hold the data on the trader’s behalf. That is why these additional terminologies have been included: it is based on existing good legislation. I hope noble Lords will recognise why this is necessary and that this explains the rationale for these terms. These terms are independent of terms in data protection legislation; they have a different scope and that is why separate terms are necessary. I hope that, on that basis, the noble Viscount will withdraw his amendment.
I thank the Minister for that explanation. I see the point she makes that, in existing legislation, these terms are used. I wonder whether there is anything we can do better to explain the terms. There seems to be significant overlap between processors, holders, owners and traders. The more we can do to clarify absolutely, with great rigour, what those terms mean, the more we will bring clarity and simplicity to this necessarily complex body of law.
I thank the Minister for explaining the rationale. I am satisfied that, although it may not be the most elegant outcome, for the time being, in the absence of a change to the 2015 Act that she references, we will probably have to grin and bear it. I beg leave to withdraw the amendment.
My Lords, Amendments 3, 4 and 20 seek to probe the Government’s position on the roles of the Secretary of State and the Treasury. Amendment 6 seeks to probe whether the Treasury or the Secretary of State shall have precedence when making regulations under this Bill.
Clarity over decision-making powers is critical to good governance, in particular over who has final decision rights and in what circumstances. Throughout Part 1 of the Bill, the Secretary of State and the Treasury are both given regulation-making powers, often on the same matter. Our concern is that having two separate Ministers and two departments responsible for making the same regulations is likely to cause problems. What happens if and when the departments have a difference of opinion on what these regulations should contain or achieve? Who is the senior partner in the relationship? When it comes to putting statute on paper, who has the final say, the Secretary of State or the Treasury?
All the amendments are probing and, at this point, simply seek greater clarification from the Government. If the Minister can explain why two departments are jointly responsible for the same regulations, why this is necessary and a good idea, and what provisions will be in place to avoid legislative confusion, I will be happy not to press the amendments.
The amendments in group 2 cover smart data and relate to the Secretary of State and the Treasury. Apart from the financial services sector clauses, most of the powers in Part 1, as well as the statutory spending authority in Clause 13, are imposed on the Secretary of State and the Treasury. That is the point that the noble Viscount made. These allow the relevant government departments to make smart data regulations. Powers are conferred on the Treasury as the department responsible for financial services, given the Government’s commitment to open banking and open finance. There is no precedence between the Secretary of State and the Treasury when using these powers, as regulations are likely to be made by the department responsible for the sector to which the smart data scheme applies, following, as with other regulations, the appropriate cross-government write-round and collective agreement procedures. I add that interdepartmental discussions are overseen by the Smart Data Council, which will give advice on this issue.
The noble Viscount raises concerns relating to Clause 13. Just as regulations may be made by the relevant government department, it is most appropriate for financial assistance to be provided by the government department responsible for the smart data scheme in question. Clause 13 is intended to provide statutory authority for that assistance, as a matter of regularity. It is for these reasons that I urge the noble Viscount not to press these amendments. These are standard procedures where the Treasury is involved and that is why more than one department is referenced.
I thank the Minister for that explanation. I am pleased to hear that these are standard procedures. Will she put that in writing, in a letter to me, explaining and setting it out so that we have it on the record? It is really important to understand where the decisions break down and to have a single point of accountability for all such decisions and, if it cannot be in the Bill, it could at least be explained elsewhere. Otherwise, I am happy to proceed with the explanation that she has kindly given.
I thank my noble friends Lord Lucas and Lord Arbuthnot for their Amendments 5, 34, 48, 200 and 202. They and other noble Lords who have spoken have powerfully raised some crucial issues in these amendments.
Amendment 5 addresses a key gap, and I take on board what my noble friend Lord Markham said, in how we manage and use customer data in specific contexts. At its heart, it seeks to enable effective communication between organisations holding customer data and customers themselves. The ability to communicate directly with individuals in a specified manner is vital for various practical reasons, from regulatory compliance to research purposes.
One clear example of where this amendment would be crucial is in the context of the Student Loans Company. Through this amendment, the Secretary of State could require the SLC to communicate with students for important purposes, such as conducting research into the outcomes of courses funded by loans. For instance, by reaching out to students who have completed their courses, the SLC could gather valuable insights into how those qualifications have impacted on their employment prospects, income levels or career trajectories. This is the kind of research that could help shape future educational policies, ensuring that loan schemes are working as intended and that the investments made in students’ education are yielding tangible benefits. This, in turn, would allow for better decision-making on future student loans funding and educational opportunities.
Amendment 34 from my noble friend Lord Arbuthnot proposes a welcome addition to the existing clause, specifically aiming to ensure that public authorities responsible for ascertaining key personal information about individuals are reliable in their verification processes and provide clear, accurate metadata on that information. This amendment addresses the essential issue of trust and reliability in the digital verification process. We increasingly rely on digital systems to confirm identity, and for these systems to be effective, we have to make sure that the core information they are verifying is accurate and consistent. If individuals’ key identifying details—date of birth, place of birth and, as we heard very powerfully, sex at birth—are not consistently or accurately recorded across various official databases, it undermines the integrity of the digital verification process. It is important that we have consistency across the public authorities listed in this amendment. By assessing whether these bodies are accurately verifying and maintaining this data, we can ensure uniformity in the information they provide. This consistency is essential for establishing a reliable foundation for digital verification.
When we consider the range of public services that rely on personal identification information, from the NHS and His Majesty’s Revenue and Customs to the Home Office, they are all responsible for verifying identity in some capacity. The amendment would ensure that the data they are using is robust, accurate and standardised, creating smoother interactions for individuals seeking public services. It reduces the likelihood of discrepancies that delay or prevent access to public services.
Amendment 48 would introduce important protections for the privacy and integrity of personal information disclosed by public authorities. In our increasingly digital world, data privacy has become one of the most pressing concerns for individuals and for society. By requiring public authorities to attest to the accuracy, integrity and clarity of the data they disclose, the amendment would help to protect the privacy of individuals and ensure that their personal information was handled with the proper care and respect.
My noble friend Lord Lucas’s Amendment 200 would introduce a data dictionary. It would allow the Secretary of State to establish regulations defining key terms used in digital verification services, birth and death registers, and public data more generally. I heard clearly the powerful arguments about sex and gender, but I come at the issue of data dictionaries from the angle of the efficiency, effectiveness and reusability of the data that these systems generate. The more that we have a data dictionary defining the metadata, the more we will benefit from the data used, whichever of these bodies generates the data itself. I am supportive of the requirement to use a data dictionary to provide standardised definitions in order to avoid confusion and ensure that data used in government services is accurate, reliable and consistent. The use of the negative resolution procedure would ensure that Parliament had oversight while allowing for the efficient implementation of these definitions.
Amendment 202 would create a national register for school admissions rules and outcomes in England. This would be a crucial step towards increasing transparency and ensuring fairness in the school admissions process, which affects the lives of millions of families every year. We want to ensure that navigating the school admissions system is not an overly opaque or complex process for many parents. With different schools following different rules, criteria and procedures, it can, as my noble friend Lord Lucas pointed out, be difficult for families to know what to expect or how best to make informed decisions. The uncertainty can be especially challenging for those who are new to the system, those who face language barriers or those in areas where the schools’ rules are not readily accessible or clear.
For many parents, particularly those in areas with complex school systems or scarce school places, access to clear, consistent information can make all the difference. This amendment would allow parents to see exactly how the school admissions process works and whether they were likely to secure a place at their preferred school. By laying out the rules in advance, the system would ensure that parents could make better informed decisions about which schools to apply to, based on criteria such as proximity, siblings or academic performance.
We want to ensure that parents understand how decisions are made and whether schools are adhering to the rules fairly. By requiring all schools to publish their admissions rules and the outcomes of their admissions process, the amendment would introduce a level of accountability. I join other noble Lords in strongly supporting this amendment, as it would create a more effective and efficient school admissions system that works for everyone.
My Lords, we have had a good and wide-ranging discussion on all this. I will try to deal with the issues as they were raised.
I thank the noble Lord, Lord Lucas, for the proposed Amendment 5 to Clause 2. I am pleased to confirm that the powers under Clauses 2 and 4 can already be used to provide customer data to customers or third parties authorised by them, and for the publication or disclosure of wider data about the goods or services that the supplier provides. The powers provide flexibility as to when and how the data may be provided or published, which was in part the point that the noble Viscount, Lord Camrose, was making. The powers may also be used to require the collection and retention of specific data, including to require new data to be gathered by data holders so that this data may be made available to customers and third parties specified by regulations.
I note in particular the noble Lord’s interest in the potential uses of these powers for the Student Loans Company. It would be for the Department for Education to consider whether the use of the smart data powers in Part 1 of the Bill may be beneficial in the context of providing information about student loans and to consult appropriately if so, rather than to specify it at this stage in the Bill. I hope the noble Lord will consider those points and how it can best be pursued with that department in mind.
On Amendments 34, 48 and 200, the Government believe that recording, storing and sharing accurate data is essential to deliver services that meet citizens’ needs. Public sector data about sex and gender is collected based on user needs for data and any applicable legislation. As noble Lords have said, definitions and concepts of sex and gender differ.
Amendment 48 would require that any information shared must be accurate, trusted and accompanied by metadata. Depending on the noble Lord’s intentions here, this could either duplicate existing protections under data protection legislation or, potentially, conflict with them and other legal obligations.
The measures in Part 2 of the Bill are intended to secure the reliability of the process by which citizens verify their data. It is not intended to create new ways to determine a person’s sex or gender but rather to allow people to digitally verify the facts about themselves based on documents that already exist. It worries me that, if noble Lords pursued their arguments, we could end up with a passport saying one thing and a digital record saying something different. We have to go back to the original source documents, such as passports and birth certificates, and rely on them for accuracy, which would then feed into the digital record—otherwise, as I say, we could end up pointing in two different directions.
I reassure the noble Lord, Lord Arbuthnot, that my colleague, Minister Clark, is due to meet Sex Matters this week to discuss digital verification services. Obviously, I am happy to encourage that discussion. However, to prescribe where public authorities can usefully verify “sex at birth”, as noble Lords now propose, extends well beyond the scope of the measures in the Bill, so I ask them to reflect on that and whether this is the right place to pursue those issues.
In addition, the Government recently received the final report of the Sullivan review of data, statistics and research on sex and gender, which explores some of these matters in detail. These matters are more appropriately considered holistically—for example, in the context of that report—rather than by a piecemeal approach, which is what is being proposed here. We are currently considering our response to that report. I hope noble Lords will consider that point as they consider their amendments; this is already being debated and considered elsewhere.
Amendment 202 seeks to create a national register of individual school admissions arrangements and outcomes, which can be used to provide information to parents to help them understand their chances of securing a place at their local school. I agree with the noble Lord that choosing a school for their child is one of the most important decisions that a parent can make. That is why admissions authorities are required to publish admission arrangements on their schools’ websites. They must also provide information to enable local authorities to publish an annual admissions prospectus for parents, including admissions arrangements and outcomes for all state schools in their area.
I refer the noble Lord, Lord Lucas, to the School Information (England) Regulations 2008, which require admission authorities and local authorities to publish prescribed information relating to admissions. Those protections are already built into the legislation, and if a local authority is not complying with that, there are ways of pursuing it. We believe that the existing approach is proportionate, reflects the diversity of admissions arrangements and local circumstances, and is not overly burdensome on schools or local authorities, while still enabling parents to have the information they need about their local schools.
I hope that, for all the reasons I have outlined, noble Lords will be prepared not to press their amendments.
My Lords, I am delighted that the Government have chosen to take forward the smart data schemes from the DPDI Bill. The ability seamlessly to harness and use data is worth billions to the UK economy. However, data sharing and the profit that it generates must be balanced against proper oversight.
Let me start by offering strong support to my noble friend Lord Arbuthnot’s Amendment 7. Personally, I would greatly welcome a more sophisticated and widespread insurance market for cyber protections. Such a market would be based on openly shared data; the widespread publication of that data, as set out in the amendment, could help to bring this about.
I also support in principle Amendments 8 and 10 in the name of the noble Lord, Lord Clement-Jones, because, as I set out on the previous group, there is real and inherent value in interoperability. However, I wonder whether the noble Lord might reconsider the term “machine readable” and change it to something—I do not think that I have solved it—a bit more like “digitally interoperable”. I just worry that, in practice, everything is machine-readable today and the term might become obsolete. I am keen to hear the Minister’s response to his very interesting Amendment 31 on the compulsion of any person to provide data.
I turn to the amendments in my name. Amendment 16 would insert an appeals mechanism for a person who is charged a fee under subsection (1). It is quite reasonable that persons listed under subsection (2)—that is, data holders, decision-makers, interface bodies, enforcers and others with duties or powers under these regulations—may charge a fee for the purposes of meeting the expenses they incur, performing duties or exercising powers imposed by regulations made under this part. However, there should be an appeals mechanism so that, in the event that a person is charged an unreasonable fee, they have a means of recourse.
Amendment 17 is a probing amendment intended to explore the rate at which interest accrues on money owed to specific public authorities for unpaid levies. Given that this interest will be mandated by law, do the Government intend to monitor the levels and, if so, how?
Amendment 18 is a probing amendment designed to explore how the Government intend to deal with a situation when a person listed under subsection (2) of this clause believes they have been charged a levy wrongly. Again, it is reasonable that an appeals mechanism be created, and this would ensure that those who considered themselves to have been wrongly charged have a means of recourse.
Amendment 19 seeks clarification on how the Government envisage unpaid levies being recovered. I would be grateful if the Minister could set out some further detail on that matter.
Amendment 21 is a probing amendment. I am curious to know the maximum value of financial assistance that the Government would allow the Secretary of State or the Treasury to give to persons under Clause 13. I do not think it would be prudent for the Government to become a financial backstop for participants in smart data schemes, so on what basis is that maximum going to be calculated?
Amendment 22 follows on from those concerns and looks to ensure that there is parliamentary oversight of any assistance provided. I am most curious to hear the Minister’s comments on this matter.
Amendment 23 is a straightforward—I think—amendment to the wording. I feel that the phrase “reasonably possible” seems to open the door to almost limitless endeavours and therefore suggest replacing it with “reasonably practicable”.
On Amendment 25, easy access to the FCA’s policy regarding penalties and levies is important. That would allow oversight, not only by Parliament but by those who are directly or indirectly affected by decisions taken under this policy. I therefore believe the amendment is necessary, as a website is the most accessible location for that information. Furthermore, regular review is necessary to ensure that the policy is functioning and serving its purpose.
Amendments 26 and 27 return to the matter of an appeals process. I will not repeat myself too much, but it is important to be able to appeal penalties and to create a route by which individuals understand how they can go about doing so.
Amendment 28 would ensure that, when the Secretary of State and the Treasury review the regulations made under Part 1 of the Bill, they do so concurrently. This amendment would prevent separate reviews being conducted that may contradict each other or be published at different times; it would force the relevant departments to produce one review and to produce it together. This would be prudent. It would prevent the Government doing the same work twice, unnecessarily spending public money, and would prevent contradictory reviews, which may cause confusion and financial cost to the smart data scheme industry.
Lastly, Amendment 29, which would ensure that Section 10 of this part was subject to the affirmative procedure, would allow for parliamentary oversight of regulations made under this clause.
We are pleased that the Government have chosen to bring smart data schemes forward, but I hope the Minister can take my concerns on board and share with us some of the detail in her response.
My Lords, we have had a detailed discussion, and it may be that I will not be able to pick up all the points that noble Lords have raised. If I do not, I guarantee to write to people.
First, I want to pick up the issues raised by the noble Lord, Lord Arbuthnot, and the noble Baroness, Lady Neville-Jones, about cybersecurity and cyber resilience. This Government, like previous Governments, take this issue hugely seriously. It is built into all our thinking. The noble Lord, and the noble Baroness in particular, will know that the advice we get on all these issues is top class. The Government are already committed to producing a cybersecurity and resilience Bill within this Parliament. We have all these things in hand, and that will underpin a lot of the protections that we are going to have in this Bill and others. I agree with noble Lords that this is a hugely important issue.
I am pleased to confirm that Clause 3(7) allows the regulations to impose requirements on third-party recipients in relation to the processing of data, which will include security-related requirements. So it is already in the Bill, but I assure noble Lords that it will be underpinned, as I say, by other legislation that we are bringing forward.
In relation to Amendments 8 and 10, I am pleased to confirm that Clause 5(4) clarifies that regulations may make provision about the providing or publishing of business data and the format in which that must be provided. That may include relevant energy-related data. The noble Lord gave some very good examples about how useful those connections and that data could be; he was quite right to raise those issues.
Regarding Amendment 9, in the name of the noble Lord, Lord Clement-Jones, I am pleased to confirm that there is nothing to prevent regulations requiring the provision of business data to government departments, publicly owned bodies and local and regional authorities. This is possible through Clause 4(1)(b), which allows regulations to require provision of business data to a person of a specified description. I hope the noble Lord will look at those cross-references and be satisfied by them.
Noble Lords spoke about the importance of sensitive information in future smart data schemes. A smart data scheme about legal services is not currently under consideration. Having said that, the Government would have regard to the appropriateness of such a scheme and the nature of any data involved and would consult the sector and any other appropriate stakeholders if that was being considered. It is not at the top of our list of priorities, but the noble Lord might be able to persuade us that it would have some merit, and we could start a consultation based on that.
Amendments 16 to 22 consider fees and the safeguards applying to them, which were raised by the noble Viscount. Fees and levies, enabled by Clauses 11 and 12, are an essential mechanism to fund a smart data scheme. The Government consider that appropriate and proportionate statutory safeguards are already built in. For example, requirements in Clause 11(3) and Clause 12(2) circumscribe the expenses in relation to which fees or the levy may be charged, and the persons on whom they may be charged.
Capping the interest rate for unpaid money, which is one of the noble Viscount’s proposals, would leave a significant risk of circumstances in which it might be financially advantageous to pay the levy late. The Government anticipate that regulations would provide an appropriate mechanism to ensure payment of an amount that is reasonable in the context of the late payment concerned. Just as regulations may be made by the relevant government department, it is most appropriate for financial assistance to be provided by the government department responsible for the smart data scheme in question. Clause 13 is intended to provide statutory authority for that assistance as a matter of regularity.
Amendments 23 to 27 deal with the clauses relating to the FCA. Clause 15(3) is drafted to be consistent with the wording of established legislation which confers powers on the FCA, most notably the Financial Services and Markets Act 2000. Section 1B of that Act uses the same formulation, using the phrase
“so far as is reasonably possible”
in relation to the FCA’s general duties. This wording is established and well understood by both the FCA and the financial services sector as it applies to the FCA’s strategic and operational objectives. Any deviation from it could create uncertainty and inconsistency.
Amendment 24 would cause significant disruption to current data-sharing arrangements and fintech businesses. Reauthenticating this frequently with every data holder would add considerable friction to open banking services and greatly reduce the user experience—which was the point raised by the noble Lord, Lord Clement-Jones. For example, it is in the customer’s interest to give ongoing consent to a fintech app to provide them with real-time financial advice that might adapt to daily changes in their finances.
Many SMEs provide ongoing access to their bank accounts in order to receive efficient cloud accounting services. If they had to re-register frequently, that would undermine the basis and operability of some of those services. It could inhibit the adoption and viability of open banking, which would defeat one of the main purposes of the Bill.
My Lords, this sequence of amendments is concerned with the publication and availability of guidance. Decision-makers are individuals responsible for deciding whether a person has satisfied the conditions for authorisation to receive customer or business data. They may publish guidance on how they intend to exercise their functions. Given the nature of these responsibilities, these individuals are deciding who can receive information pertaining to individuals and businesses. The guidelines which set out how decisions are taken should be easily accessible and the best place for this is on their websites.
Following on from this point, Amendment 12 would require this guidance to be reviewed annually and any changes to be published, again on decision-makers’ websites, at least 28 days before coming into effect. This would ensure that the guidelines are fit for purpose and provide ample time for people affected by these changes to review them and act accordingly.
Amendments 13 and 14 seek to create similar requirements for enforcers—that is, a public authority authorised to carry out monitoring or enforcement of regulations under this part. Again, given the nature of these responsibilities, the guidelines should be easily accessible on the enforcer’s website and reviewed annually, with any changes published, again on their website, at least 28 days before coming into effect. This will, once again, ensure that the guidelines are fit for purpose and provide ample time for people affected by these changes to review them and act accordingly.
Finally, Amendment 15 would require the Secretary of State or the Treasury to provide guidance on who may be charged a fee under Clause 6(1) and to review it annually. Ensuring the regular review of guidelines will ensure their effectiveness, and the ready availability of guidelines will ensure that they are used and observed. I therefore believe that these amendments will be of benefit to the functioning of the Bill and should be given consideration by the Minister.
My Lords, I thank the noble Viscount, Lord Camrose, for those amendments. I will cover the final group of amendments to Part 1, dealing with smart data guidance.
On Amendments 11, 12, 13 and 14, which relate to the publishing of the guidelines, I am pleased to confirm that Clause 5(4) clarifies that regulations may make provision about the providing or publishing of business data. This includes the location where they should be published, including, as the noble Viscount suggests, the website of the responsible person.
Furthermore, Clause 21 clarifies that regulations may make provision about the form and manner in which things must be done. That provision can be used to establish appropriate processes around the sharing of information and guidance, including its regular update, publication and sharing with the relevant person.
Amendment 15 refers to the amount of fee charged and how it should be determined. The power is already broad enough to allow the information to be reviewed as and when necessary, but to mandate that the review must take place at least once a year may be a bit restrictive. For these reasons, I ask the noble Viscount not to press his amendments.
I thank the noble Lord for his answers. I understand what he says, although I would be grateful if either he or the noble Baroness, Lady Jones, could summarise those points in writing because I did not quite capture them all. If I understand correctly, all the concerns that we have raised are dealt with in other areas of the Bill, but if they could write to me then that would be great. I beg leave to withdraw the amendment.
In an act that I hope he is going to repeat throughout, the noble Lord, Lord Clement-Jones, has fully explained all the amendments that I want to support, so I put on record that I agree fully with all the points he made. I want to add just one or two other points. They are mainly in the form of questions for the Minister.
Some users are more vulnerable to harms than others, so Amendment 33 would insert a new subsection (2B), which mentions redress. What do the Government imagine for those who may be more vulnerable and how do they think they might use this system? Obviously, I am thinking about children, but there could be other categories of users, certainly the elderly.
That led me to wonder what consideration has been given to vulnerable users more generally and how that is being worked through. That led me to question exactly how this system is going to interact with the age-assurance work that the IC is doing as a result of the Online Safety Act and make sure that children are not forced into a position where they have to show their identity in order to prove their age or, indeed, cannot prove their identity because they have been deemed to have been dealt with elsewhere in another piece of legislation. Because, actually, children do open bank accounts and do have to have certain sorts of ID.
That led me to ask what in the framework prevents service providers giving more information than is required. I have read the Bill; someone said earlier that it is skeletal. From what we know, you can separate pieces of information, or attributes, from each other, but what is to prevent a service provider from failing to do so? This is absolutely crucial to the trust in and workings of this system, and it leads me to the inverse, Amendment 46, which asks how we can prevent this system being forced and thrust upon people. As the noble Lord, Lord Clement-Jones, set out, we need to make sure that people have the right not to use the system as well as the right to use it.
Finally, I absolutely agree with the noble Viscount, Lord Colville, and the amendment in the name of the noble Viscount, Lord Camrose: something this fundamental must come back to Parliament. With that, I strongly associate myself with the words of the noble Lord, Lord Clement-Jones, on all his amendments.
I thank noble Lords for their comments and contributions in what has been an absolutely fascinating debate. I have a couple of points to make.
I agree with the noble Lord, Lord Clement-Jones, on his Amendment 33, on ongoing monitoring, and his Amendment 50. Where we part company, I think, is on his Amendment 36. I feel that we will never agree about the effectiveness or otherwise of five-year strategies, particularly in the digital space. I simply do not buy that his amendment will have the desired effects that the noble Lord wants.
I do not necessarily agree with the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, that we should put extra burdens around the right to use non-digital methods. In my opinion, and I very much look forward to hearing from the Minister on this matter, the Act preserves that right quite well as it is. I look forward to the Government’s comments on that.
I strongly support the noble Viscount, Lord Colville, on his very important point about international standards. I had intended to sign his amendment but I am afraid that, for some administrative reason, that did not happen. I apologise for that, but I will sign it because I think that it is so important. In my opinion, not much of the Bill works in the absence of effective international collaboration around these matters. We are particularly going to run up against this issue when we start talking about ADM, AI and copyright issues. It is international standards that will allow us to enforce any of the provisions that we put in here. I am more agnostic on whether this will happen via W3C, the ITU or other international standards bodies, but we really must go forward with the principle that international standards are what will get us over the line here. I look forward to hearing the Minister’s confirmation of the importance, in the Government’s view, of such standards.
Let me turn to the amendments listed in my name. Amendment 37 would ensure parliamentary oversight of the DVS trust framework. Given the volume of sensitive data that these service providers will be handling, it is so important that Parliament can keep an eye on how the framework operates. I thank noble Lords for supporting this amendment.
Amendment 40 is a probing amendment. To that end, I look forward to hearing the Minister’s response. Accredited conformity assessment bodies are charged with assessing whether a service complies with the DVS framework. As such, they are giving a stamp of approval from which customers will draw a sense of security. Therefore, the independence of these accreditation bodies must be guaranteed. Failing to do so would allow the industry to regulate itself. Can the Minister set out how the Government will guarantee the independence of these accreditation bodies?
Amendment 49 is also a probing amendment. It is designed to explore the cybersecurity measures that the Government expect of digital verification services. Given the large volume of data that these services will be handling, it is essential that the Government demand substantial cybersecurity measures. This is a theme that we are going to come back to again and again; we heard about it earlier, and I think that we will come on to more of this. As these services become more useful and more powerful, they present a bigger attack surface that we have to defend, and I look forward to hearing how we will do that.
I thank the noble Lords, Lord Clement-Jones and Lord Markham, the noble Viscount, Lord Colville, and the noble Baroness, Lady Kidron, for raising these topics around digital verification services. As I explained at Second Reading, these digital verification services already exist. They are already out there making all sorts of claims for themselves. With the new trust framework, we are trying to provide some more statutory regulation of the way that they operate. It is important that we have this debate and that we get it right, but some of the things we are doing are still work in progress, which is why we do not always have all the detailed answers that noble Lords are searching for here and why some powers have been left to the Secretary of State.
I shall go from the top through the points that have been raised. Amendments 33 and 43, tabled by the noble Lord, Lord Clement-Jones, and Amendment 40 tabled by the noble Viscount, Lord Colville, would require the trust framework to include rules on monitoring compliance and redress mechanisms and would require the Secretary of State to ensure the independence of accredited conformity assessment bodies. The noble Baroness, Lady Kidron, asked questions akin to those regarding redress for the vulnerable, and I will write to her setting out a response to that in more detail.
On the issue of redress mechanisms in the round, the scope of the trust framework document is solely focused on the rules that providers of digital verification services are required to follow. It does not include matters of governance. Compliance is ensured via a robust certification process where services are assessed against the trust framework rules. They are assessed by independent conformity assessment bodies accredited by the United Kingdom Accreditation Service, so some oversight is already being built into this model.
The Bill contains powers for the Secretary of State to refuse applications to the DVS register or to remove providers where he is satisfied that the provider has failed to comply with the trust framework or if he considers it necessary in the interests of national security. These powers are intended as a safety net, for example, to account for situations where the Secretary of State might have access to intelligence sources that independent conformity assessment bodies cannot assess and therefore will not be able to react to, or it could be that a particular failure of the security of one of these trust marks comes to light very quickly, and we want to act very quickly against it. That is why the Secretary of State has those powers to be able to react quickly in what might be a national security situation or some other potential leak of important data and so on.
In addition, conformity assessment bodies carry out annual surveillance audits and can choose to conduct spot audits on certified providers, and they have the power to withdraw certification where non-conformities are found. Adding rules on compliance would cut across that independent certification process and would be outside the scope of the trust framework. Those independent certification processes already exist.
Amendments 33, 41, 42, 44 and 45 tabled by the noble Lord, Lord Clement-Jones, would in effect require the creation of an independent appeals body to adjudicate on the refusal of an application to the DVS register and the implementation of an investigatory process applicable to refusal and removal from the DVS register. The powers of the Secretary of State in this regard are not without safeguards. They may be exercised only in limited circumstances after the completion of an investigatory process and are subject to public law principles, for example, reasonableness. They may also be challenged by judicial review.
To go back to the point I was making, it might be something where we would need to move quickly. Rather than having a convoluted appeals process in the way that the noble Lord was talking about, I hope he understands the need sometimes for that flexibility. The creation and funding of an independent body to adjudicate such a limited power would therefore be inappropriate.
It would be reassuring if the Minister could share with us some of the meetings that the Secretary of State or Ministers are having with those bodies on the subject of these internationally shared technical standards.
I might need to write to the noble Viscount, but I am pretty sure that that is happening at an official level on a fairly regular basis. The noble Viscount raises an important point. I reassure him that those discussions are ongoing, and we have huge respect for those international organisations. I will put the detail of that in writing to him.
I turn to Amendment 37, tabled by the noble Viscount, Lord Camrose, which would require the DVS trust framework to be laid before Parliament. The trust framework contains auditable rules to be followed by registered providers of digital verification services. The rules, published in their third non-statutory iteration last week on GOV.UK, draw on and often signpost existing technical requirements, standards, best practice, guidance and legislation. It is a hugely technical document, and I am not sure that Parliament would make a great deal of sense of it if it was put forward in its current format. However, the Bill places consultation on a statutory footing, ensuring that it must take place when the trust framework is being prepared and reviewed.
Amendments 36 and 38, tabled by the noble Lord, Lord Clement-Jones, would create an obligation for the Secretary of State to reconsult and publish a five-year strategy on digital verification services. It is important to ensure that the Government have a coherent strategy for enabling the digital verification services market. That is why we have already consulted publicly on these measures, and we continue to work with experts. However, given the nascency of the digital identity market and the pace of those technological developments, as the noble Viscount, Lord Camrose, said, forecasting five years into the future is not practical at this stage. We will welcome scrutiny through the publication of the annual report, which we are committed to publishing, as required by Clause 53. This report will support transparency through the provision of information, including performance data regarding the operation of Part 2.
Amendment 39, also tabled by the noble Lord, Lord Clement-Jones, proposes to exclude certified public bodies from registering to provide digital verification services. We believe that such an exclusion could lead to unnecessary restrictions on the UK’s young digital verification market. The noble Lord mentioned the GOV.UK One Login programme, which is aligned with the standards of the trust framework but is a separate government programme which gives people a single sign-on service to access public services. It operates its services under different legal powers from those being proposed here. We do not accept that we need to exclude public bodies from the scrutiny that would otherwise take place.
Amendment 46 seeks to create a duty for organisations that require verification and use digital verification for that purpose to offer, where reasonably practicable, a non-digital route and ensure that individuals are made aware of both options for verification. I should stress here that the provision in the Bill relates to the provision of digital verification services, not requirements on businesses in general about how they conduct verification checks.
Ensuring digital inclusion is a priority for this Government, which is why we have set up the digital inclusion and skills unit within DSIT. Furthermore, there are already legislative protections in the Equality Act 2010 in respect of protected groups, and the Government will take action in the future if evidence emerges that people are being excluded from essential products and services by being unable to use digital routes for proving their identity or eligibility.
The Government will publish a code of practice for disclosure of information, subject to parliamentary review, highlighting best practice and relevant information to be considered when sharing information. As for Amendment 49, the Government intend to update this code only when required, so an annual review process would not be necessary. I stress to the Committee that digital verification services are not going to be mandatory. It is entirely voluntary for businesses to use them, just as it is up to individuals whether they use such a service. I think people feel that it is going to be imposed on them, and I would push back against that suggestion.
If the regulation-making power in Amendment 50 proposed by the noble Lord, Lord Clement-Jones, was used, it would place obligations on the Information Commissioner to monitor the volume of verification checks being made, using the permissive powers to disclose information created in the clause. The role of the commissioner is to regulate data protection in the UK, which already includes monitoring and promoting responsible data-sharing by public authorities. For the reasons set out above, I hope that noble Lords will feel comfortable in not pressing their amendments.
My Lords, Amendment 47 is in another slightly peculiar group, but we will persevere. It aims to bolster the cybersecurity framework for digital verification services providers. Needless to say, as we continue to advance in the digital age, it is vital that our online systems, especially those handling sensitive information, are protected against ever-evolving cyberthreats. As DVSs gain currency and their usage grows, the incentive for cyberattackers to target them and try to take advantage grows with it. They need to be protected.
The proposed amendment therefore mandates the creation and regular review of cybersecurity rules for all DVS providers. These rules are designed to ensure that services involved in verifying identities and other critical data maintain the highest standards of protection, resilience and trustworthiness, commensurate with their importance and the sensitivity of the data they handle.
We could hardly be more aware that we live in an increasingly digital world where almost every aspect of our lives is connected online. Digital verification services play a key role in this landscape, and that role is going to increase. They are used by individuals and organisations to confirm identities, authenticate transactions and verify data. These services underpin critical areas, such as banking, healthcare and public services, where security is paramount. However, as the cyberthreat landscape becomes more sophisticated, so does the need for robust security measures to protect these services. Hackers and malicious actors are continuously developing new ways to exploit vulnerabilities in digital systems. This puts personal data, business operations and even national security at risk.
A security breach in a digital verification system could have devastating consequences not only for the immediate victims but for the reputation and integrity of the service providers. That is why we on these Benches feel that the proposed amendment is absolutely critical. It would ensure that all DVS providers are held to a high, standardised set of cybersecurity practices. This would not only reduce the risk of cyberthreats but build greater public trust in the safety and reliability of those services and, therefore, enhance their uptake.
One of the key aspects of the amendment is the requirement for the cybersecurity rules to be reviewed annually. This is especially important in the context of the rapid evolution of the cyberthreats that we face. Technologies, attack methods and vulnerabilities are constantly changing, and what is secure today may not be secure tomorrow. By reviewing the cyber rules every year, we will ensure that they remain current and effective in protecting against the latest threats. I beg to move.
I support that. I completely agree with all the points that the noble Lord, Lord Clement-Jones, made on the previous groupings, but the one that we all agree is absolutely vital is the one just brought up by my noble friend. Coming from the private sector, I am all in favour of a market—I think that it is the right way to go—but standards within that are equally vital.
I come at this issue having had the misfortune of having to manage the cyberattack that we all recall happening against our diagnostic services in hospitals last summer. We found that the weakest link there was through the private sector supplier to that system, and it became clear that the health service—or cybersecurity, or whoever it was—had not done enough to make sure that those standards were set, published and adhered to effectively.
With that in mind, and trying to learn the lessons from it, I think that this clause is vital in terms of its intent, but it will be valuable only if it is updated on a frequent basis. In terms of everything that we have spoken about today, and on this issue in particular, I feel that that point is probably the most important. Although everything that we are trying to do is a massive advance in terms of trying to get the data economy to work even better, I cannot emphasise enough how worrying that attack on our hospitals last summer was at the time.
I thank both noble Lords for raising this; I absolutely concur with them on how important it is. In fact, I remember going to see the noble Viscount, Lord Camrose, when he was in his other role, to talk about exactly this issue: whether the digital verification services were going to be robust enough against cyberattacks.
I pray in aid the noble Lord, Lord Arbuthnot, and the noble Baroness, Lady Neville-Jones, who both felt that the new Cyber Security and Resilience Bill will provide some underpinning for all of this, because our Government take this issue very seriously. As the Committee can imagine, we get regular advice from the security services about what is going on and what we need to do to head it off. Yes, it is a difficult issue, but we are doing everything we can to make sure that our data is safe; that is fundamental.
Amendment 47 would require the Secretary of State to prepare and publish rules on cybersecurity for providers to follow. The existing trust framework includes rules on cybersecurity, against which organisations will be certified. Specifically, providers will be able to prove either that they meet the internationally recognised information security standards or that they have a security management system that matches the criteria set out in the trust framework.
I assure noble Lords that the Information Commissioner’s Office, the National Cyber Security Centre and other privacy stakeholders have contributed to the development of the trust framework. This includes meeting international best practice around encryption and cryptology techniques. I will happily write to noble Lords to reassure them further by detailing the range of protections already in place. Alternatively, if noble Lords here today would benefit from an official technical briefing on the trust framework, we would be delighted to set up such a meeting because it is important that we all feel content that this will be a robust system, for exactly the reasons that the noble Lord, Lord Markham, explained. We are absolutely on your Lordships’ side and on the case on all this; if it would be helpful to have a meeting, we will certainly do that.
I thank the Minister and my noble friend Lord Markham for those comprehensive and welcome comments. I would certainly like to take up the Minister’s offer of a technical briefing on the trust framework; that really is extremely important.
To go briefly off-piste, one sign that we are doing this properly will be the further development of an insurance marketplace for cybersecurity. It exists but is not very developed at the moment. As and when this information is regularly published and updated, we will see products becoming available that allow people to take insurance based on known risks around cybersecurity.
As I say, I take comfort from the Minister’s words and look forward to attending the tech briefing. When it comes, the cyber Bill will also play a serious role in this space and I look forward to seeing how, specifically, it will interact with DVS and the other services that we have been discussing and will continue to discuss. I beg leave to withdraw my amendment.
My Lords, I support these amendments and applaud the noble Lord, Lord Clement-Jones, for his temerity and for offering a variety of choices, making it even more difficult for my noble friend to resist it.
It has puzzled me for some time why the Government do not wish to see a firm line being taken about digital theft. Identity theft in any form must be the most heinous of crimes, particularly in today’s world. This question came up yesterday in an informal meeting about a Private Member’s Bill due up next Friday on the vexed question of the sharing of intimate images and how the Government are going to respond to it. We were sad to discover that there was no support among the Ministry of Justice officials who discussed the Bill with its promoter for seeing it progress any further.
At the heart of that Bill is the same question about what happens when one’s identity is taken and one’s whole career and personality are destroyed by those who take one’s private information and distort it in such a way that those who see it regard the subject as a different person, or as in some way involved in activities that the original person would never have engaged in. Yet we hear that the whole basis on which this digital network has been built up is a voluntary one, and the logic of that is that it would not be necessary to have the sort of amendments that are before us now.
I urge the Government to think very hard about this. There must be a break point here. Maybe the meeting that has been promised will help us, but there is a fundamental point about whether in the digital world we can rely on the same protections that we have in the real world—and, if not, why not?
My Lords, I will address the amendments proposed by the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron. I have nothing but the deepest respect for their diligence, and indeed wisdom, in scrutinising all three flavours of the Bill as it has come out, and for their commitment to strengthening the legislative framework against fraud and other misuse of digital systems. However, I have serious reservations about the necessity and proportionality of the amendments under consideration, although I look forward to further debates and I am certainly open to being convinced.
Amendments 51 and 52 would introduce criminal sanctions, including imprisonment, for the misuse of trust marks. While the protection of trust marks is vital for maintaining public confidence in digital systems, I am concerned that introducing custodial sentences for these offences risks overcriminalisation. The misuse of trust marks can and should be addressed through robust civil enforcement mechanisms. Turning every such transgression into a criminal matter would place unnecessary burdens on, frankly, an already strained justice system and risks disproportionately punishing individuals or small businesses for inadvertent breaches.
Furthermore, the amendment’s stipulation that proceedings could be brought only by or with the consent of the Director of Public Prosecutions or the Secretary of State is an important safeguard, yet it underscores the high level of discretion required to enforce these provisions effectively, highlighting the unsuitability of broad criminalisation in this context.
Amendment 53 seeks to expand the definition of identity documents under the Identity Documents Act 2010 to include digital identity documents. While the noble Lord, Lord Clement-Jones, makes a persuasive case, the proposal raises two concerns. First, it risks pre-emptively criminalising actions before a clear and universally understood framework for digital identity verification is in place. The technology and its standards are still evolving, and it might be premature to embed such a framework into criminal law. Secondly, there is a risk that this could have unintended consequences for innovation in the digital identity sector. Businesses and individuals navigating this nascent space could face disproportionate legal risks, which may hinder progress in a field critical to the UK’s digital economy.
Amendment 54 would introduce an offence of knowingly or recklessly providing false information in response to notices under Clause 51. I fully support holding individuals accountable for deliberate deception, but the proposed measure’s scope could lead to serious ambiguities. What constitutes recklessness in this context? Are we inadvertently creating a chilling effect where individuals or businesses may refrain from engaging with the system for fear of misinterpretation or error? These are questions that need to be addressed before such provisions are enshrined in law.
We must ensure that our legislative framework is fit for purpose, upholds the principles of justice and balances enforcement with fairness. The amendments proposed, while they clearly have exactly the right intentions, risk, I fear, undermining these principles. They introduce unnecessary criminal sanctions, create uncertainty in the digital identity space and could discourage good-faith engagement with the regulatory system. I therefore urge noble Lords to carefully consider the potential consequences of these amendments and, while expressing gratitude to the noble Lords for their work, I resist their inclusion in the Bill.
My Lords, of course we want to take trust seriously. I could not agree more that the whole set of proposals is predicated on that. Noble Lords have all made the point, in different ways, that if there is not that level of trust then people simply will not use the services and we will not be able to make progress. We absolutely understand the vital importance of all that. I thank all noble Lords for their contributions on this and I recognise their desire to ensure that fraudulent use of the trust mark is taken seriously, as set out in Amendments 51 and 52.
The trust mark is in the process of being registered as a trademark in the UK. As such, once that is done, the Secretary of State will be able to take appropriate legal action for misuse of it. Robust legal protections are also provided through Clause 50, through the trademark protections, and through other existing legislative provisions, such as the Consumer Protection from Unfair Trading Regulations 2008. There is already legislation that underpins the use of that trust mark. Additionally, each trust mark will have a unique number that allows users to check that it is genuine. These amendments would duplicate those existing protections.
In seeking to make the misuse of a digital identity a criminal offence, which Amendments 53 and 209 attempt to do, the noble Lord offered me several different ways of approaching this, so I will offer him some back. The behaviour he is targeting is already addressed in the Fraud Act 2006, the Computer Misuse Act 1990 and the Data Protection Act 2018. We would argue that it is already covered by existing legislation.
On the noble Lord’s point about the Identity Documents Act 2010, defining every instance of verification as an identity document within the scope of offences in that Act could create an unclear, complicated and duplicative process for the prosecution of digital identity theft. The provision of digital verification services does not always create one single comprehensive identity proof—I think this is the point that the noble Viscount, Lord Camrose, was making. People use it in different ways. It might be a yes/no check to ensure that a person is over 18, or it might be a digital verification services provider providing several derived credentials that can be used in different combinations for different use cases. We have to be flexible enough to deal with all of that, rather than tying the offence to a single fraudulent act. It would not be appropriate to add digital identity to the list of documents set out in the Identity Documents Act.
Amendment 54 would create an offence of supplying false information to the Secretary of State, but sanctions already exist in this situation, as the organisation can be removed from the DVS register via the power in Clause 41. Similarly, contractual arrangements between the Office for Digital Identities and Attributes and conformity assessment bodies require them to adhere to the principle of truthfulness and accuracy. To create a new offence would be disproportionate when safeguards already exist. I take on board the intent and aims of the noble Lord, Lord Clement-Jones, but argue that there are already sufficient protections in current law and in the way in which the Bill is drafted to provide the reassurance that he seeks. Therefore, I hope that he feels comfortable in not pressing his amendment.
My Lords, I am confident that, somewhere, there is a moral philosopher and legal scholar who can explain why this amendment is not part of the next group on NUAR but, in the meantime, my amendment addresses a fundamental issue. It would ensure that strict security measures are in place before any individual or organisation is allowed access to the sensitive information held on the National Underground Asset Register. The NUAR is a crucial tool for managing the UK’s underground infrastructure. It holds critical data about pipelines, cables and other assets that underpin vital services such as water, energy, telecommunications and transport.
This information, while essential for managing and maintaining infrastructure, is also a potential target for misuse. As such, ensuring the security of this data is not just important but vital for the safety and security of our nation. The information contained in the NUAR is sensitive. Its misuse could have disastrous consequences. If this data were to fall into the wrong hands, whether through criminal activities, cyberattacks or terrorism, it could be exploited to disrupt or damage critical infrastructure. I know that the Government take these risks seriously but this amendment seeks to address them further by ensuring that only those with a legitimate need, who have been properly vetted and who have met specific security requirements can access this data. We must ensure that the people accessing this register are trusted individuals or organisations that understand the gravity of handling this sensitive information and are fully aware of the risks involved.
The amendment would ensure that we have a framework for security—one that demands that the Secretary of State introduces clear, enforceable regulations specifying the security measures that must be in place before anyone can access the NUAR. These measures may include: background checks to ensure that those seeking access are trustworthy and legitimate; cybersecurity safeguards to prevent unauthorised digital access or breaches; physical security measures to protect the infrastructure where this information is stored; and clear guidelines on who should be allowed access and the conditions under which they can view this sensitive data.
The potential threats posed by unsecured access to the NUAR cannot be overstated. Criminals could exploit this information to target and disrupt key infrastructure systems. Terrorist organisations could use it to plan attacks on essential services, endangering lives and causing mass disruption. The stakes are incredibly high; I am sure that I do not need to convince noble Lords of that. In an era where digital and physical infrastructure are increasingly interconnected, the risks associated with unsecured access to information of the kind held in the NUAR are growing every day. This amendment would address this concern head on by requiring that we implement safeguards that are both thorough and resilient to these evolving threats. Of course, the cyber Bill is coming, but I wonder whether we need something NUAR-specific and, if so, whether we need it in this Bill. I beg to move.
I thank the noble Viscount for raising the issue of the National Underground Asset Register’s cybersecurity. As he said, Amendment 55 seeks to require more detail on the security measures in the regulations that will be applied to the accessing of NUAR data.
The noble Viscount is right: it is absolutely fundamental that NUAR data is protected, for all the reasons he outlined. It hosts extremely sensitive data. It is, of course, supported by a suite of sophisticated security measures, which ensure that access to data by a tightly prescribed set of users is proportionate. I hope that the noble Viscount understands that we do not necessarily want to spell out what all those security measures are at this point; he will know well enough the sorts of discussions and provisions that go on behind the scenes.
Security stakeholders, including the National Cyber Security Centre and the National Protective Security Authority, have been involved in NUAR’s development and are members of its security governance board, which is a specific governance board overseeing its protection. As I say, access to it occurs on a very tight basis. No one can just ask for access to the whole of the UK’s data on NUAR; it simply is not geared up to be operated in that way.
We are concerned that the blanket provision proposed in the amendment would lead to the publication of detailed security postures, exposing arrangements that are not public knowledge. It could also curtail the Government’s ability to adapt security measures when needed and, with support from security stakeholders, to accommodate changing circumstances—or, indeed, changing threats—that we become aware of. We absolutely understand why the noble Viscount wants that reassurance. I can assure him that it is absolutely the best security system we could possibly provide, and that it will be regularly scrutinised and updated; I really hope that the noble Viscount can take that assurance and withdraw his amendment.
I thank the Minister for that answer. Of course, I take the point that to publish the security arrangements is somehow to advertise them, but I am somehow not yet altogether reassured. I wonder whether there is something that we can push further as part of a belt-and-braces approach to the NUAR security arrangements. We have talked about cybersecurity a lot this afternoon. All of these things tend to create additional incentives towards cyberattacks—if anything, NUAR does so the most.
If it helps a little, I would be very happy to write to the noble Viscount on this matter.
Yes, that would be great. I thank the Minister. I beg leave to withdraw my amendment.
I thank the noble Lord, Lord Clement-Jones, for these amendments. Amendment 46 is about NUAR and the requirement to perform consultation first. I am not convinced that is necessary because it is already a requirement to consult under Clause 60 and, perhaps more pertinently, NUAR is an industry-led initiative. It came out of an industry meeting and has been led by industry throughout. Even with the requirement to consult, I am therefore not sure that much is going to come out of that consultation exercise.
In respect of other providers out there, LSBUD among them, when we were going through this exact debate in DPDI days, the offer I made—and I ask the Minister if she would consider doing the same—was to arrange a demonstration of NUAR to anyone who had not seen it. I have absolutely unshakeable confidence that anybody who sees NUAR in action will not want anything else. I am not a betting man, but—
For the record, the noble Viscount is getting a vigorous nod from the Minister.
We will see, but such a demonstration would certainly ease any perfectly reasonable concerns that might emerge. To put it in a more colourful way, this is Netflix in the age of Blockbuster Video.
The slightly different Amendments 193, 194 and 195 clarify that these information standards should explicitly apply to IT providers involved in the processing of data within primary as well as secondary care, and that the standards must extend to existing contracts with providers, not just new agreements formed after this Act. I understand the point of these amendments but I am slightly concerned about how the retroactivity would affect existing contractual agreements. I am also slightly concerned about the wish to hard-code certain conditions into rules that function best the more they are principles-based and the less they are specifically related to particular areas of technology. That said, I think I am persuadable on it, but I have not yet made that leap.
I am not going to say much except to try to persuade my noble friend. I am absolutely with the intent of what the noble Lord, Lord Clement-Jones, is trying to do here and I understand the massive benefits that can be gained from it.
My Lords, there is a great deal to be gained from digitising the registers of births, stillbirths and deaths. Not only does it reduce the number of physical documents that need to be maintained and kept secure but it means that people do not have to physically sign the register of births or deaths in the presence of a registrar. This will make people’s lives a great deal easier during those stressful periods of their lives.
However, digitising all this data—I am rather repeating arguments I made about NUAR and other things earlier—creates a much larger attack surface for people looking to steal personal data. This amendment explores how the Government will protect this data from malign actors. If the Minister could provide further detail on this, I would be most grateful.
This is a probing amendment and has been tabled in a constructive spirit. I know that we all want to harness the power of data and tech in this space and use it to benefit people’s lives but, particularly with this most personal of data, we have to take appropriate steps to keep it secure. Should there be a data breach, hackers would have access to an enormous quantity of personal data. Therefore, I suggest that, regardless of how much thought the Government have given this point up to now, the digitisation of these registers should not occur until substantial cybersecurity measures are in place. I look forward to the Minister’s comments.
On Amendment 57, legislation is already in place to ensure the security of electronic registers. Articles 25 and 32 of the UK General Data Protection Regulation impose duties on controllers of personal data to implement appropriate technical and organisational measures, including security measures, so this already applies.
The electronic system has been in place for births and deaths since 2009, and all events have been registered electronically since that date, in parallel with the paper registers and with no loss of data. What is happening with this legislation is that people do not have to keep paper records anymore; it is about the existing electronic system. The noble Lord will remember that it is up to registrars even so, but I think that the idea is that they will no longer have to keep the paper registers as well, which everybody felt was an unnecessary administrative burden.
Nevertheless, the system is subject to Home Office security regulations, and robust measures are in place to protect the data. There has been no loss of data or hacking of that data up to now. Obviously, we need to make sure that the security is kept up to date, but we think that it is a pretty robust system. It is the paper documents that are losing out here.
I thank the Minister. I take the point that this has been ongoing for a while and that, in fact, the security is better because there is less reliance on the paper documents. That said, I am encouraged by her answer and encouraged that the Government continue to anticipate this growing risk and act accordingly. On that basis, I withdraw the amendment.
My Lords, it occurred to me when the noble Lord was speaking that we had lost a valuable member of our Committee. This could not be the noble Lord, Lord Clement-Jones, who was speaking to us just then. It must have been some form of miasma or technical imposition. Maybe his identity has been stolen and not been replaced. Normally, the noble Lord would have arrived with a short but punchy speech that set out in full how the new scheme was to be run, by whom, at what price, what its extent would be and the changes that would result. The Liberal future it may have been, but it was always delightful to listen to. I am sad that all the noble Lord has asked for here is a modest request, which I am sure the noble Baroness will want to jump to and accept, to carry out a review—as if we did not have enough of those.
Seriously, I once used the service that we have been talking about when my father-in-law died, and I found it amazing. It was also one that I stumbled on and did not know about before it happened. Deaths did not happen often enough in my family to make me aware of it. But, like the noble Lord, Lord Clement-Jones, I felt that it should have done much more than what it did, although it was valuable for what it did. It also occurred to me, as life moved on and we produced children, that there would be a good service when introducing a new person—a service to tell you once about that, because the number of tough issues one has to deal with when children are born is also extraordinary and can be annoying, if you miss out on one—particularly with the schooling issues, which are more common these days than they were when my children were being born.
I endorse what was said, and regret that the amendment perhaps did not go further, but I hope that the Minister when she responds will have good news for us.
I thank the noble Lord, Lord Clement-Jones, for raising this, and the noble Lord, Lord Stevenson, for raising the possibility that we are in the presence of a digital avatar of the noble Lord, Lord Clement-Jones. It is a scary thought, indeed.
The amendment requires a review of the operation of the Tell Us Once programme, which seeks to provide a simpler mechanism for citizens to pass information regarding births and deaths to the Government. It considers whether the pioneering progress of Tell Us Once could be extended to non-public sector holders of data. When I read the amendment, I was more cynical than I am now, having heard what the noble Lord, Lord Clement-Jones, had to say. I look forward to hearing the Minister’s answers. I take the point from the noble Lord, Lord Stevenson, that we do not necessarily need another review—but now that I have heard about it, it feels a better suggestion than I thought it was when reading about it.
I worry that expanding this programme to non-public sector holders of data would be a substantial undertaking; it would surely require the Government to hold records of all the non-public sector organisations that have retained and processed an individual’s personal data. First, I am not sure that this would even be possible—or practicable, anyway. Secondly, I am not sure that it would end up being an acceptable level of state surveillance. I look forward to hearing the Minister’s response but I am on the fence on this one.
(1 month ago)
Lords Chamber

My Lords, let me start by repeating the thanks others have offered to the Minister for her ongoing engagement and openness, and to the Bill team for their—I hope ongoing—helpfulness.
Accessing and using data safely is a deeply technical legislative subject. It is, perhaps mysteriously, of interest to few but important to more or less everyone. Before I get started, I will review some of the themes we have been hearing about. Given the hour, I will not go into great detail about most of them, but I think it is worth playing some of them back.
The first thing that grabbed me, which a number of noble Lords brought up, was the concept of data as an asset. I believe the Minister used the phrase “data as DNA”, and that is exactly the right metaphor. Whether data is a sovereign asset or on the balance sheet of a private organisation, that is an incredibly important and helpful way to see it. A number of noble Lords brought this up, including the noble Baroness, Lady Kidron, and the noble Lords, Lord Knight and Lord Stevenson of Balmacara.
I was pleased that my noble friend Lord Lucas brought up the use of AI in hiring, if only because I have a particular bee in my bonnet about this. I have taken to writing far too many grumpy letters to the Financial Times about it. I look forward to engaging with him and others on that.
I was pleased to hear a number of noble Lords raise the issue of the burdens on small business and making sure that those burdens, in support of the crucial goal of protecting privacy, do not become disproportionate relative to the ability of small businesses to execute against them. The noble and learned Lord, Lord Thomas, the noble Lords, Lord Stevenson of Balmacara and Lord Bassam, and my noble friend Lord Markham brought that up very powerfully.
I have cheated by making an enormous group of themes, including ADM, AI and text and data mining—and then I have added Horizon on at the end. It is thematically perhaps a little ambitious, but we are getting into incredibly important areas for the well-being and prosperity of so many people. A great many noble Lords got into this very persuasively and compellingly, and I look forward to a great deal of discussion of those items as we go into Committee.
Needless to say, the importance of adequacy came up, particularly from the noble Lords, Lord Vaux and Lord Bassam, and the noble and learned Lord, Lord Thomas. There is a key question here: have we reduced the risk of loss of adequacy to as close to zero as we can reasonably get, while recognising that it is a decision that is essentially out of our sovereign hands?
A number of noble Lords brought up the very tricky matter of the definition of scientific research—among them the noble Viscount, Lord Colville, my noble friend Lord Bethell and the noble Lords, Lord Davies of Brixton and Lord Freyberg. This is a significant challenge to the effectiveness of the legislation. We all know what we are trying to achieve, but the skill and the art of writing it down is a considerable challenge.
My final theme, just because I so enjoyed the way in which it was expressed by the noble Lord, Lord Knight, is the rediscovery of the joys of a White Paper. That is such an important point—to have the sense of an overall strategy around data and technology as well as around the various Bills that came through in the previous Parliament and will, of course, continue to come now, as these technologies develop so rapidly.
My noble friend Lord Markham started by saying that we on these Benches absolutely welcome the Government’s choice to move forward with so many of the provisions originally set out in the previous Government’s DPDI Bill. That Bill was built around substantial consultation and approved by a range of stakeholders. We are particularly pleased to see the following provisions carried forward. One is the introduction of a national underground asset register. As many others have said, it will not only make construction and repairs more efficient but make them safer for construction workers. Another is giving Ofcom the ability, when notified by the coroner, to demand that online service providers retain data in the event of any child death. I notice the noble Baroness, Lady Kidron, nodding at that—and I am delighted that it remains.
On reforming and modernising the ICO, I absolutely take the point raised by some that this is an area that will take quite considerable questioning and investigation, but overall the thrust of the purpose of modernising that function is critical to the success of the Bill. We absolutely welcome the introduction of a centralised digital ID verification framework, recognising noble Lords’ concerns about it, of course, and allowing law enforcement bodies to make greater use of biometric data for counterterrorism purposes.
That said, there are provisions that were in the old DPDI Bill whose removal we regret, many of which we felt would have improved data protection and productivity by offering SMEs in particular greater agency to deal with non-high-risk data in less cumbersome ways while still retaining the highest protections for high-risk data. I very much welcome the views so well expressed by the noble and learned Lord, Lord Thomas of Cwmgiedd, on this matter. As my noble friend Lord Markham put it, this is about being wisely careful but not necessarily hyper-careful in every case. That is at least a way of expressing the necessary balance.
I regret, for example—the noble Lord, Lord Clement-Jones, possibly regrets this less than I do—that the Government have chosen to drop the “vexatious and excessive” standard for subject access requests in favour of “manifestly unfounded or excessive”. The term “vexatious” emerged from extensive consultation and would, among other things, have prevented the use of SARs to circumvent courts’ discovery processes. I am concerned that, by dropping this definition, the Government have missed an opportunity to prevent misuse of the deeply important subject access rights. I hope very much to hear from the Minister how the Government propose to address such practices.
In principle, we do not approve of the Government giving themselves the power to gain greater knowledge of citizens’ activities. Indeed, the Constitution Committee has made it clear that any legislation dealing with data protection must carefully balance the use of personal data by the state for the provision of services and for national security purposes against the right to a private life and freedom of expression. We on these Benches feel that, on the whole, the DPDI Bill maintained the right balance between those two opposing legislative forces. However, we worry that the DUA Bill, if used in conjunction with other powers that have been promised in the fraud, error and debt Bill, would tip too far in favour of government overreach.
Part 1 of the Bill, on customer and business data, contains many regulation-making powers. The noble Viscount, Lord Colville, my noble friend Lord Holmes and the noble Lord, Lord Russell, spoke powerfully about this, and I would like to express three concerns. First, the actual regulations affecting vast quantities of business and personal data are not specified in the Bill; they will be implemented through secondary legislation. Will the Minister give us some more information, when she stands up, about what these regulations may contain? This concern also extends to Part 2, on digital verification services, where in Clause 28,
“The Secretary of State must prepare and publish … rules concerning the provision of digital verification services”.
The Select Committee on the Constitution has suggested that this power should be subject to parliamentary scrutiny. I must say that I am minded to agree.
Secondly, throughout Part 1, regulation-making powers are delegated to both the Secretary of State and the Treasury. This raises several questions. Can the Secretary of State and the Treasury make regulations independently of one another? In the event of a disagreement between these government departments, who has the final say, and what are the mechanisms should they disagree? We would welcome some commentary and explanation from the Minister.
Thirdly, as the Select Committee on the Constitution has rightly pointed out, Clause 133 contains a Henry VIII power. It allows the Secretary of State, by regulations, to make consequential amendments to the provisions made by this Bill. This allows amendments to any
“enactment passed or made before the end of the Session in which this Act is passed”.
Why is this necessary?
The Bill introduces some exciting new terminology, namely “data holder” and “data trader”. Will the Minister tell the House what these terms mean and why they need to coexist alongside the existing terminology of “data processor” and “data controller”? I certainly feel that data legislation is quite complex enough without adding overlapping new terminology if we do not really need it.
I stress once again the concerns rightly raised by my noble friend Lord Markham about NUAR security. Are the Government satisfied that the operational protection of NUAR is sufficient to protect this valuable information from terrorist and criminal threats? More generally, additional cybersecurity measures must be implemented to protect personal data during this mass digitisation push. Will the Minister tell the House how these necessary security measures will be brought forward?
Finally, as I am sure all noble Lords will recall, the previous Government published a White Paper that set out five principles for AI. As a reminder, those were: safety, security and robustness; appropriate transparency and explainability; fairness; accountability and governance; and contestability and redress. I am minded to table an amendment to Clause 80, requiring those using AI in their automated decision-making process to have due regard for these five principles. I noted with interest that the noble Lord, Lord Stevenson of Balmacara, proposed something very similar but using the Bletchley principles. I am very keen to explore that further, on the grounds that it might be an interesting way of having principles-driven AI inserted into this critical Bill.
In conclusion, we on these Benches are broadly supportive of the Bill. We do, as I have set out, have a few concerns, which I hope the Minister will be willing to listen to.
(1 month, 3 weeks ago)
Lords Chamber

We are acutely aware of this issue. We know that there is a live ongoing argument about it and we are talking to our colleagues across government to find a way through, but we have not come to a settled view yet.
My Lords, catfishing is, of course, one of the misuses of technology in respect of which AI is rapidly enhancing both the attack and the defence. Does the Minister agree that the most effective, adaptive and future-proof defence against catfishing is actually personal awareness and resilience? If so, can the Minister provide a bit more of an update on the progress made in implementing this crucial media literacy strategy, which will be such an important part of defending us all against these attacks in future?
Ofcom published its latest vision of the media literacy strategy just a couple of months ago, so its implementation is very much in its infancy. The Government very much support it and we will work with Ofcom very closely to roll it out. So Ofcom has a comprehensive media literacy strategy on these issues, but as we all know, schools have to play their part as well: it has to be part of the curriculum. We need to make sure that children are kept safe in that way.
The noble Viscount referred to AI. The rules we have—the Online Safety Act and so on—are tech-neutral in the sense that, even if an image is AI generated, it would still fall foul of that Act; it does not matter whether it is real or someone has created it. Also, action should be taken by the social media companies to take down those images.
(1 month, 3 weeks ago)
Grand Committee

My Lords, I begin with a comment that I hope will not be taken badly by either my noble friend the Minister or the large number of civil servants who have been involved in this Bill over the years. Colleagues may recall that the Bill took seven years to pass through the various processes and procedures of Parliament, including initial Green Papers and White Papers and then scrutiny by the Joint Select Committee, of which my noble friend opposite was also a member, and it seems slightly surprising and a bit odd that we are dealing with what seems to be an administrative oversight so late in the process. I do not expect a serious response from the Minister on that, but I wanted to put on the record that we are still very much aware of the fact that legislation has its faults and sometimes needs to be corrected, and we should perhaps be humble in expecting that the material we finally agree in Parliament is indeed the last word on things.
Having said that, I think I follow the noble Lord, Lord Clement-Jones, on this point: the subsequent legal analysis, which has identified a potential gap in provision on this instrument, tries to tidy it up but, in doing so, has left me a bit confused. I simply ask the Minister to make it clear to me when she responds that I am reading it correctly. The worry that has been exposed by this subsequent legal analysis is about the sharing of information when Ofcom is using its powers to address issues with the companies with which it has an engagement. Indeed, the whole purpose of the Bill is to ensure that companies are taking their burden of making sure that the Bill works in practice. There may be a deficiency in terms of what the Secretary of State has separate powers to do, but my confusion is that the Explanatory Memorandum says:
“The Secretary of State has several key functions relating to the implementation of the framework under the”
Online Safety Act. It is obviously sensible, therefore, that the sharing of information that Ofcom gathers is available for that. But is that all the powers of the Secretary of State or only the powers of the Secretary of State in relation to the Online Safety Act? The Explanatory Memorandum says:
“If Ofcom were not able to share business information relating to these areas”—
that is, the areas directly affected by the Online Safety Act—
“there is a risk that implementation and review of the framework could be delayed or ineffective”.
I accept the general point, but, to pick up the point made by the noble Lord, Lord Clement-Jones, is this an open invitation for Ofcom to share information that does not relate to its powers in relation to the Online Safety Act with the Secretary of State and, therefore, something for the Secretary of State to take on as a result of a slightly uncertain way of doing it? Are there any restrictions on this power as set out in that paper? I could mention other points where it comes up, but I think my point is made.
The noble Lord, Lord Clement-Jones, also touched on the point that this is a power for Ofcom to share with the Secretary of State responsible for Ofcom, which is fair enough, but, as the Explanatory Memorandum points out:
“There are also certain functions relating to definitions conferred on Scottish and Welsh Ministers and Northern Ireland departments”—
presumably now Ministers—which may also be “relevant persons” under the Act, but we are not given much on that, except that
“these are unlikely to require business information for their exercise”.
I would like a bit more assurance on that. Again, that might be something for which the department is not prepared and I am quite happy to receive a letter on it, but my recollection from the discussions on the Online Safety Bill in this area, particularly in relation to Gaelic, was that there were quite a lot of powers that only Scottish Ministers would be able to exercise, and therefore it is quite possible that business activities which would not be UK-wide in their generality and therefore apropos of the Secretary of State might well be available to Ofcom to share with Scottish Ministers. If it is possible to get some generic points about where that is actually expected to fall, rather than simply saying that it is unlikely to require business information, I would be more satisfied with that.
My Lords, I thank the Minister for setting out this instrument so clearly. It certainly seems to make the necessary relatively simple adjustments to fill an important gap that has been identified. Although I have some questions, I will keep my remarks fairly brief.
I will reflect on the growing importance of both the Online Safety Act and the duty we have placed on Ofcom’s shoulders. The points made by the noble Lord, Lord Clement-Jones, about the long-standing consequential nature of the creation of Ofcom and the Communications Act were well made in this respect. The necessary complexity and scope of the work of Ofcom, as our online regulator, has far outgrown what I imagine was foreseeable at the time of its creation. We have given it the tasks of developing and enforcing safety standards, as well as issuing guidance and codes of practice that digital services must follow to comply with the Act. Its role includes risk assessment, compliance, monitoring and enforcement, which can of course include issuing fines or mandating changes to how services operate. Its regulatory powers now allow it to respond to emerging online risks, helping to ensure that user-protection measures keep pace with changes in the digital landscape.
In recognising the daily growing risk of online dangers and the consequent burdens on Ofcom, we of course support any measures that bring clarity and simplicity. If left unaddressed, the identified gap here clearly could lead to regulatory inefficiencies and delays in crucial processes that depend on accurate and up-to-date information. For example, setting appropriate fee thresholds for regulated entities requires detailed knowledge of platform compliance and associated risks, which would be challenging to achieve without full data access. During post-implementation reviews, a lack of access to necessary business information could hamper the ability to assess whether the Act is effectively achieving its safety objectives or whether adjustments are needed.
That said, I have some questions, and I hope that, when she rises, the Minister will set out the Government’s thinking on them. My first question very much picks up on the point made—much better than I did—by the noble Lord, Lord Stevenson of Balmacara. It is important to ensure that this instrument does not grant unrestricted access to business information but, rather, limits sharing to specific instances where it is genuinely necessary for the Secretary of State to fulfil their duties under the Act. How will the Government ensure this?
Secondly, safeguards, such as data protection laws and confidentiality obligations under the Communications Act 2003, must be in place to guarantee that any shared information is handled responsibly and securely. Do the Government believe that sufficient safeguards are already in place?
Thirdly, in an environment of rapid technology change, how do the Government plan to keep online safety regulation resilient and adaptive? I look forward to hearing the Government’s views on these questions, but, as I say, we completely welcome any measure that increases clarity and simplicity and makes it easier for Ofcom to be effective.
I thank noble Lords for their valuable contributions to this debate. It goes without saying that the Government are committed to the effective implementation of the Online Safety Act. It is critical that we remove any barriers to that, as we are doing with this statutory instrument.
As noble Lords said—the noble Viscount, Lord Camrose, stressed this—the Online Safety Act has taken on a growing significance in the breadth and depth of its reach. It is very much seen as an important vehicle for delivering the change that the whole of society wants now. It is important that we get this piece of legislation right. For that purpose, this statutory instrument will ensure that Ofcom can co-operate and share online safety information with the Secretary of State where it is appropriate to do so, as was intended during the Act’s development.
On specific questions, all three noble Lords who spoke asked whether the examples given were exclusive or whether there are other areas where powers might be given to the Secretary of State. The examples given are the two areas that are integral to implementation. We have not at this stage identified any further areas. The change made by this instrument allows sharing only for the purposes of fulfilling the Secretary of State’s functions under the Online Safety Act—it goes no broader than that. I think that answers the question asked by the noble Viscount, Lord Camrose, about whether this meant unlimited access—I assure him that that is not the purpose of this SI.
My noble friend Lord Stevenson asked whether this relates only to the powers under the OSA. Yes, the instrument allows Ofcom to share information it has collected from businesses only for the purposes of fulfilling the Secretary of State’s functions under the Act.
On the question of devolution, the powers of Scottish, Northern Ireland and Welsh Ministers primarily relate to the power to define the educational establishments for the purpose of Schedule 1 exemptions. There are also some consultation provisions where these Ministers must be consulted, but that is the limit of the powers that those Ministers would have.
I am conscious that I have not answered all the questions asked by the noble Viscount, Lord Camrose, because I could not write that quickly—but I assure him that my officials have made a note of them and, if I have not covered those issues, I will write to him.
I hope that noble Lords agree with me on the importance of implementing the Online Safety Act and ensuring that it can become fully operational as soon as possible. I commend these regulations to the Committee.
(1 month, 3 weeks ago)
Grand Committee

My Lords, I started my discussion on the previous instrument on a slightly negative note. I want to change gear completely now and say how nice it is to see the first of the SIs relating to the Online Safety Act come forward. I welcome that.
Having said that, may I inquire what the Government’s intention is in relation to the Parkinson rule? I think I am correct in saying that we wish to see in place an informal but constant process by the Government when they bring forward legislation under the Online Safety Act, which would be offered to the standing committees so that they could comment and make advice available to Ministers before the Secretary of State finally approved any such legislation. This would primarily be concerned with the codes of practice, but this is exactly the sort of issue, well exemplified by the noble Baroness, Lady Owen, where there is still some concern about the previous Government’s approach to this Bill.
If I recall, this rule was in one of the later amendments brought in towards the end of the process. Rather unlike the earlier stuff, which was seven years in the making, this was rushed through in rather less than seven weeks as we got to the end of discussions on the Online Safety Bill. To get the deal that we all, across the political parties, hoped would happen, and so that the country would benefit from the best possible Act we could get out of the process, there were a number of quite late changes, including the question about deepfake issues, which was not given quite the scrutiny that it could have had. Of course, we are now receiving discussion and debate on those issues, and it is important that we understand them and the process that the Government will take to try to resolve them.
This question of having consent was hotly debated by those who led on it during the time the Bill was before your Lordships’ House. I felt the arguments very clearly came out in favour of those who argued that the question of consent, as mentioned by the noble Lord, Lord Clement-Jones, really is not relevant to this. The offence is caused by the circulation of material, and the Act should contain powers sufficient for the Secretary of State to be satisfied that Ofcom, in exercising its regulatory functions, has the powers to take down this material where it is illegal.
There are two issues tied up in that. I think all of us who have spoken in this debate are concerned that we have not really got to the end of the discussion on this, and we need to have more. Whether through the Private Member’s Bill that we will hear about in December or not, the Government need to get action on that. They need to consult widely with the committees, both in the Commons and here, to get the best advice. It may well be that we need further debate and discussion in this House to do so.
Having said that, the intention to clarify what exactly is legal lies at the heart of the Online Safety Act. The Act will not work and benefit the country if we go back to the question of legal but harmful. The acid test for how the material is to be treated by those who provide services to this country has to be whether it is legal. If it is illegal, it must be taken down, and there must be powers and action specifically for that to happen. It is unfortunate that, if material is not illegal, it is a matter not for the Government or Parliament but for the companies to ensure that their terms of service allow people to make judgments about whether they put material on their platforms. I hope that still remains the Government’s position. I look forward to hearing the Minister’s response.
My Lords, I shall also start on a positive note and welcome the ongoing focus on online safety. We all aim to make this the safest country in the world in which to be online. The Online Safety Act is the cornerstone of how all of us will continue to pursue this crucial goal. The Act imposed clear legal responsibilities on social media platforms and tech companies, requiring them actively to monitor and manage the content they host. They are required swiftly to remove illegal content and to take proactive measures to prevent harmful material reaching minors. This reflects the deep commitment that we all share to safeguarding children from the dangers of cyberbullying, explicit content and other online threats.
We must also take particular account of the disproportionate harm that women and girls face online. The trends regarding the online abuse and exploitation that disproportionately affect female users are deeply concerning. Addressing these specific challenges is essential if we are to create a truly safe online environment for everyone.
With respect to the Government’s proposal to make the sharing of intimate images without consent a priority offence under the Online Safety Act, this initiative will require social media companies promptly to remove such content from their platforms. It aims to curb the rise in abuse that has been described as “intolerable”—I think rightly—by the Secretary of State. The intent behind this measure is to prevent generations becoming “desensitised” to the devastating effects of online abuse.
Although this appears to signal a strong stance against online harm, it raises the question of what this designation truly accomplishes in practical terms. I am grateful to the Minister for setting this out so clearly. I am not sure that I altogether followed the differences between the old offences and the new ones. Sharing intimate images without consent is already illegal under current laws. Can we therefore not say that the real issue lies not in the absence of legal provision but in the lack of effective enforcement of existing regulation? We must ensure that any changes we make do not merely add layers of complexity but genuinely strengthen the protections available to victims and improve the responsiveness of platforms in removing harmful content.
With these thoughts in mind, I offer five questions. I apologise for their number; the Minister is welcome to respond in writing as necessary, but I would welcome her views whether now or later. First, why is it necessary to add the sharing of intimate images to the list of priority offences if such acts are already illegal under existing legislation and, specifically, what additional protections or outcomes are expected? The Minister gave some explanation of this, but I would welcome digging a little deeper into it.
Secondly, where consent is used as a defence against the charge of sharing intimate images, what are the Government’s thoughts on how to protect victims from intrusive cross-examination over details of their sexual history?
Thirdly, with respect to nudification technology, the previous Government argued that any photoreal image was covered by “intimate image abuse”—the noble Lord, Lord Clement-Jones, touched on this issue well. Is there any merit in looking at that again?
Fourthly, I am keen to hear the Government’s views on my noble friend Lady Owen’s Private Member’s Bill on nudification. We look forward to debating that in December.
Fifthly, and lastly, what role can or should parents and educators play in supporting the Act’s objectives? How will the Government engage these groups to promote online safety awareness?
My Lords, I thank noble Lords for their contributions to this debate. This is, as I think all noble Lords who have spoken recognise, a really important issue. It is important that we get this legislation right. We believe that updating the priority offences list with a new intimate image abuse offence is the correct, proportionate and evidence-led approach to tackle this type of content, and that it will provide stronger protections for online users. This update will bring us closer to achieving the commitment made in the Government’s manifesto to strengthening the protection for women and girls online.
I will try to cover all the questions asked. My noble friend Lord Stevenson and the noble Baroness, Lady Owen, asked whether we will review the Act and whether the Act is enough. Our immediate focus is on getting the Online Safety Act implemented quickly and effectively. It was designed to tackle illegal content and protect children; we want those protections in place as soon as possible. Having said that, it is right that the Government continually assess the law’s ability to keep up, especially when technology is moving so fast. We will of course look at how effective the protections are and build on the Online Safety Act, based on the evidence. However, our message to social media companies remains clear: “There is no need to wait. You can and should take immediate action to protect your users from these harms”.
The noble Baroness, Lady Owen, asked what further action we are taking against intimate image abuse and about the taking, rather than sharing, of intimate images. We are committed to tackling the threat of violence against women and girls in all its forms. We are considering what further legislative measures may be needed to strengthen the law on the taking of intimate images without consent and on intimate image abuse. This matter is very much on the Government’s agenda at the moment; I hope that we will be able to report some progress to the noble Baroness soon.
The noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Owen, asked whether creating and making intimate image deepfakes will be an offence. The Government’s manifesto included a commitment to banning the creation of sexually explicit deepfakes. This is a priority for the Government. DSIT is working with the Home Office and the Ministry of Justice to identify the most appropriate legislative vehicle for ensuring that those who create these images without consent face the appropriate punishment. The Government are considering options in this space to protect women and girls from malicious uses of these technologies. The new sharing intimate images offence, which will be added to the OSA priority list through this SI, explicitly includes—for the first time—wholly synthetic manufactured images, such as deepfakes, so they will be tackled under the Online Safety Act.
The noble Baroness, Lady Owen, asked about the material that is already out there and the possibility of a hash database to prevent those intimate images from continually being circulated. We are aware that the technology exists. Strengthening the intimate image abuse priorities under the Act is a necessary first step in tackling this, but we expect Ofcom to consider it in its final draft illegal content codes and guidance and to give more information about both the codes of practice and the further measures that would need to be developed to address this issue.
Several noble Lords—the noble Viscount, Lord Camrose, the noble Lord, Lord Clement-Jones, and my noble friend Lord Stevenson—asked for more details on the new offences. As I tried to set out in my opening statement, the Online Safety Act repeals the offence of disclosing private sexual photographs and films with the intent to cause distress—this comes under Section 33 of the Criminal Justice and Courts Act 2015 and is commonly known as the revenge porn offence—and replaces it with four new offences.
First, there is a base offence of sharing an intimate image without consent, which carries a maximum penalty of six months’ imprisonment. Secondly, there are two specific-intent offences—the first is sharing an intimate image with intent to cause alarm, humiliation or distress; the second is sharing an intimate image for the purpose of obtaining sexual gratification—each of which carries a maximum penalty of two years’ imprisonment to reflect the more serious culpability of someone who acts without consent and with an additional malign intent. Lastly, there is an offence of threatening to share an intimate image, with a maximum penalty of two years’ imprisonment. This offence applies regardless of whether the image is shared.
These offences capture images that show, or appear to show, a person who is nude, partially nude, engaged in toileting or doing something sexual. These offences include the sharing of manufactured or manipulated images, which are referred to as deepfakes. This recognises that sharing intimate images without the consent of the person they show or appear to show is sufficiently wrongful or harmful to warrant criminalisation.
The noble Viscount, Lord Camrose, asked what is so different about these new offences compared with those already in the Act. I stress that it is because they are being given priority status, which may not sound like much but confers considerable extra powers under the Act and places new obligations on platforms. There are thousands of things that Ofcom could address, but these offences now sit on the much smaller list that places very specific obligations on the platforms. Ofcom will monitor this and, as I said earlier, companies can be fined huge sums of money if they do not act, so there is a strong incentive on them to follow through on the priority list.
I hope that I have answered all the questions and that noble Lords agree with me on the importance of updating the priority offences in the Online Safety Act. The noble Viscount, Lord Camrose, asked about parents and made an important point. This is not just about an Act; it is about everybody highlighting the fact that these activities are intolerable and offensive not just to the individuals concerned but to everybody in society. Parents have a responsibility, as we all do, to ensure that media literacy is at the heart of the education we carry out formally in schools and informally within the home. The noble Viscount is absolutely right on that, and there is more that we could all do. I commend these regulations to the Committee.
(2 months, 1 week ago)
Lords Chamber
The noble Lord raises an important point. Where nudification apps and other material do not come under the remit of the Online Safety Act, we will look at other legislative tools to make sure that all new forms of technology—including AI and its implications for online images—are included in robust legislation, in whatever form it takes. Our priority is to implement the Online Safety Act, but we are also looking at what other tools might be necessary going forward. As the Secretary of State has said, this is an iterative process; the Online Safety Act is not the end of the game. We are looking at what further steps we need to take, and I hope the noble Lord will bear with us.
What is the Government’s assessment of the technical difficulties behind requiring pornography sites and others to implement age-verification services?