Grand Committee

My Lords, I, too, shall speak very briefly, which will save valuable minutes in which I can order my CyberUp Christmas mug.
Amendments 156A and 156B add to the definition of unauthorised access, so that it includes instances where a person accesses data in the reasonable knowledge that the controller would not consent if they knew about the access or the reason for it, and that person is not empowered to access the data by an enactment. Amendment 156B introduces defences to this new charge. Given the amount of valuable personal data held by controllers, as our lives have moved increasingly online—as many speakers in this debate have vividly brought out—there is absolutely clear merit not just in this idea but in the pace implied, which many noble Lords have called for. There is a need for real urgency here, and I look forward to hearing more detail from the Minister.
My Lords, I turn to Amendments 156A and 156B, tabled by the noble Lord, Lord Holmes. I understand the strength of feeling and the need to provide legal protections for legitimate cybersecurity activities. I agree with the noble Lord that the UK should have the right legislative framework to allow us to tackle the harms posed by cybercriminals. We have heard examples of some of those threats this afternoon.
I reassure the noble Lord that this Government are committed to ensuring that the Computer Misuse Act remains up to date and effective in tackling criminality. We will continue to work with the cybersecurity industry, the National Cyber Security Centre and law enforcement agencies to consider whether there are workable proposals on this. The noble Lord will know that this is a complex and ongoing issue being considered as part of the review of the Computer Misuse Act being carried out by the Home Office. We are considering improved defences by engaging extensively with the cybersecurity industry, law enforcement agencies, prosecutors and system owners. However, engagement to date has not produced a consensus on the issue, even within the industry, and that is holding us back at this moment—but we are absolutely determined to move forward with this and to reach a consensus on the way forward.
I think the noble Lord, Lord Clement-Jones, said in the previous debate that the amendments were premature, and here that is certainly the case. The specific amendments that the noble Lord has tabled are premature, because we need a stronger consensus on the way forward, notwithstanding all the good reasons that noble Lords have given for why it is important that we have updated legislation. With these concerns and reasons in mind, I hope that the noble Lord will feel able to withdraw his amendment.
My Lords, although I have no amendments in this group, I will comment on some of them. I might jump around the order, so please forgive me for that.
Amendment 197 would change Clause 123 so that the Secretary of State must, as soon as reasonably practicable and no later than 12 months after the Act is passed, make regulations requiring regulated services to provide information for the purposes of research into online safety. This is clearly sensible. It would ensure that valuable research into online safety may commence as soon as possible, which would benefit us all, as speakers have made abundantly clear. To that end, Amendment 198D, which would ensure that researcher access is enforceable in the same way as other requirements under the Online Safety Act, would ensure that researchers can access valuable information and carry out their beneficial research.
I am still left with some curiosity on some of these amendments, so I will indicate where I have specific questions to those who have tabled them and hope they will forgive me if I ask to have a word with them between now and Report, which would be very helpful. In that spirit, I turn to Amendment 198B, which would allow the Secretary of State to define the term “independent researcher”. I ask the noble Lord, Lord Clement-Jones, who tabled the amendment, whether he envisages the Secretary of State taking advice before making such regulations and, if so, from whom and through what mechanism. I recognise that it is a probing amendment, but I would be keen to understand more.
I am also keen to understand further from my noble friend Lord Bethell and the noble Baroness, Lady Kidron, why, under Amendment 198A, the Secretary of State would not be able to make regulations providing for independent research into the “enforcement of requirements” under these regulations. Again, I look forward to discussing that with them.
I have some concerns about Amendment 198, which would require service providers to give information pertaining to age, stage of development, gender, race, ethnicity, disability and sexuality to researchers. I understand the importance of this but my concern is that it would require the disclosure of special category data to those researchers. I express reservations, especially if the data pertains to children. Do we have the right safeguards in place to address the obviously heightened risks here?
Additionally, I have some concerns about the provisions suggested in Amendment 198E. Should we allow researchers from outside the United Kingdom to require access to information from regulated service providers? Could this result in data being transferred into jurisdictions where there are less stringent data protection laws?
My Lords, I thank noble Lords who have welcomed the provisions in the Bill. I very much appreciate that we have taken on board the concerns that were raised in the debates on the previous legislation. I thank the noble Baroness, Lady Kidron, and the noble Lords, Lord Bethell and Lord Clement-Jones, for their amendments.
I will speak first to Amendment 197, tabled by the noble Baroness, Lady Kidron, which would compel the Secretary of State to create a framework and to do so within 12 months of passage. I understand and share her desire to ensure that a framework allowing researchers access is put in place promptly. This is precisely why we brought forward this provision. I reassure her that the department will consult on the framework as soon as possible after the publication of Ofcom’s report.
Turning to Amendments 198 and 198B, tabled by the noble Baroness, Lady Kidron, and the noble Lord, Lord Clement-Jones, respectively, Clause 123 provides the Secretary of State with the power to make regulations relating to researchers’ access to data. I can reassure noble Lords that it does not limit the regulations to the non-exhaustive list of examples provided. I agree that fair and proportionate criteria for who is considered a researcher are critical to the success of the future framework. I reassure noble Lords that in the provision as currently written the Secretary of State can include in the design of the framework the specific requirements that a person must meet to be considered a researcher.
Turning to Amendments 198A and 198D, tabled by the noble Lord, Lord Bethell, while I am sympathetic to his desire to provide a future framework with the robust enforcement powers of the OSA, I assure him that as the provision is written, the Secretary of State can already use the existing enforcement powers of the OSA to support a future framework. Furthermore, should the evidence suggest that additional or different measures would be more effective and appropriate, this provision allows the Secretary of State the flexibility to introduce them.
Turning next to Amendments 198C and 198E, tabled by the noble Lord, Lord Bethell, I understand the spirit of these amendments and note the importance of this issue, given the global nature of the online world. It is entirely reasonable to allow researchers who are not based in the UK to utilise our researcher access framework, as long as the subject of their research is the experience of UK users online. I reassure him that the provisions as drafted already allow the Secretary of State to make regulations permitting non-UK-based researchers to use the framework where appropriate. We plan to use the evidence gathered through our own means and through Ofcom’s report to set out who will be eligible to use the framework in the secondary legislation.
Finally, turning to Amendment 198F, I am aware of the concern that researchers have encountered blockages to conducting research and I am sympathetic to the intentions behind the amendment. We must ensure that researchers can use the future framework without fear of legal action or other consequences. I am conscious that the noble Baroness, Lady Kidron, asked me a specific question about legal exemptions and I will write to her to make that answer much clearer. I reassure noble Lords that the Government are considering the specific issues that the noble Lord raises. For these reasons, I ask that the amendments not be pressed while the Government consider these issues further and I am of course happy to engage with noble Lords in the meantime.
My Lords, I thank the noble Viscount, Lord Camrose, for moving this amendment, which raises this important question about our genomics databases, and for the disturbing examples that he has drawn to our attention. He is right that the opportunities from harnessing genomic data come with very real risks. This is why the Government have continued the important work of the UK Biological Security Strategy of 2023, including by conducting a full risk assessment and providing updated guidance to reduce the risks from the misuse of sensitive data. We plan to brief the Joint Committee on the National Security Strategy on the findings of the risk assessment in the new year. Following that, I look forward to engaging with the noble Viscount on its outcome and on how we intend to take these issues forward. As he says, this is a vital issue, but in the meantime I hope he is prepared to withdraw his amendment.
I thank the Minister for her answer, and I very much accept her offer of engagement. I will make a few further brief comments about the importance of this amendment, as we go forward. I hope that other noble Lords will consider it carefully before Report.
I will set out a few reasons why I believe this amendment can benefit both the Bill and this country. The first is its scope. The amendment will allow the Secretary of State and the Information Commissioner to assess data security risks across the entirety of the genomic sector, covering consumers, businesses, citizens and researchers who may be partnering with state-linked genomics companies.
The second reason is urgency. DNA is regularly described as the “new gold” and it represents our most permanent identifier, revealing physical and mental characteristics, family medical history and susceptibility to diseases. Once it has been accessed, the damage from potential misuse cannot be reversed, and this places a premium on proactively scrutinising the potential risks to this data.
Thirdly, there are opportunities for global leadership. This amendment offers the UK an opportunity to take a world-leading role and become the first European country to take authoritative action to scrutinise data vulnerabilities in this area of critical technology. Scrutinising risks to UK genomic data security also provides a foundation to foster domestic genomics companies and solutions.
Fourthly, this amendment would align the UK with key security partners, particularly, as my noble friend Lord Bethell mentioned, the United States, which has already blacklisted certain genomics companies linked to China and taken steps to protect American citizens’ DNA from potential misuse.
The fifth and final reason is protection of citizens and consumers. This amendment would provide greater guidance and transparency to citizens and consumers whose DNA data is exposed to entities linked to systemic competitors. With all of that said, I thank noble Lords for their consideration and beg leave to withdraw my amendment.
My Lords, the current law does not sufficiently protect children from AI-driven CSAM because it is simply such a fast-moving issue. It is a sobering thought that, of all the many wonderful developments of AI that many of us have been predicting and speculating on for so long, CSAM is really driving the technology forward. What a depressing reflection that is.
Overall, AI is developing at an extraordinarily rapid pace and has come with a number of concerning consequences that are not all yet fully understood. However, it is understood that child sexual abuse is completely unacceptable in any and all contexts, and it is right that our law should be updated to reflect the dangers that have increased alongside AI development.
Amendment 203 seeks to create a specific offence for using personal data or digital information to create or facilitate the creation of computer-generated child sexual abuse material. Although legislation is in place to address possessing or distributing such horrendous material, we must prioritise the safety of children in this country and take the law a step further to prevent its creation. Our children must be kept safe and, subject to one reservation, which I will come to in a second, I support the amendment from the noble Baroness, Lady Kidron, to further protect them.
That reservation comes in proposed new subsection (1)(c), which includes in the offence the act of collating files that, when combined, enable the creation of sexual abuse material. This is too broad. A great deal of the collation of such material can be conducted by innocent people using innocent materials that are then corrupted or given more poisonous aspects by further training, fine-tuning or combination with other materials by more malign actors. I hope there is a way we can refine this proposed new paragraph on that basis.
Unfortunately, adults can also be the targets of individuals who use AI to digitally generate non-consensual explicit images or audio files of an individual, using their likeness and personal data. I am really pleased that my noble friend Lady Owen tabled Amendments 211G and 211H to create offences for these unacceptable, cruel acts. I support these amendments unambiguously.
My Lords, I thank the noble Baroness, Lady Kidron, for her Amendment 203. It goes without saying that the Government treat all child sexual abuse material with the utmost seriousness. I can therefore confirm to her and the Committee that the Government will bring forward legislative measures to address the issue in this Session and that the Home Office will make an announcement on this early in the new year.
On Amendments 211G and 211H, tabled by the noble Baroness, Lady Owen, the Government share concerns that more needs to be done to protect women from deepfake image abuse. This is why the Government committed in their manifesto to criminalise the creation of sexually explicit deepfake images of adults. I reassure the noble Baroness and the whole Committee that we will deliver on our manifesto commitment in this Session. The Government are fully committed to protecting the victims of tech-enabled sexual abuse. Tackling intimate audio would be a new area of law, but we continue to keep that legislation under review.
I also say to the noble Baroness that there is already a process under Section 153 of the Sentencing Act 2020 for the court to deprive a convicted offender of property, including images that have been used for the purpose of committing or facilitating any criminal offence. As well as images, that includes computers and mobile phones that the offender either used to commit intimate image offences or intended to use for that purpose in future. For those reasons and the reassurances I have given today, I hope that noble Lords will feel able to withdraw or not press their amendments.
Grand Committee

I thank noble Lords for their comments and contributions. I shall jump to Amendments 159 and 159A, one of which is in my name and both of which are concerned with cookie paywalls. I am not sure I can have properly understood the objection to cookie paywalls. Do they not simply offer users three choices: pay money and stay private; share personal data and read for free; or walk away? So many times, we have all complained about the fact that these websites harvest our data and now, for the first time, this approach sets a clear cash value on the data that they are harvesting and offers us the choice. The other day somebody sent me a link from the Sun. I had those choices. I did not want to pay the money or share my data, so I did not read the article. I feel this is a personal decision, supported by clear data, which it is up to the individual to take, not the Government. I do not think we should take away this choice.
Let me turn to some of the other amendments in this group. Amendment 161 in the name of my noble friend Lord Lucas is, if I may say so, a thoughtful amendment. It would allow pension providers to communicate information on their product. This may mean that the person who will benefit from that pension does not miss out on useful information that would benefit their saving for retirement. Given that pension providers already hold the saver’s personal data, it seems to be merely a question of whether this information is wanted; of course, if it is not, the saver can simply opt out.
Amendment 162 makes an important point: many charities rely on donations from the public. Perhaps we should consider bringing down the barriers to contacting people regarding fundraising activities. At the very least, I am personally not convinced that members of the public have different expectations around what kinds of organisation can and cannot contact them and in what circumstances, so I support any step that simplifies the—to my mind—rather arbitrary differences in the treatment of business and charity communications.
Amendment 104 certainly seems a reasonable addition to the list of what might constitute “unreasonable effort” if the information is already public. However, I have some concerns about Amendments 98 and 100 to 103. For Amendment 98, who would judge the impact on the individual? I suspect that the individual and the data controllers may have different opinions on this. In Amendment 100, the effort and cost of compliance are thorny issues that would surely be dictated by the nature of the data itself and the reason for providing it to data subjects. In short, I am concerned that the controllers’ view may be more subjective than we would want.
On Amendment 102, again, when it comes to providing information to them,
“the damage and distress to the data subjects”
is a phrase on which the subject and the controller will almost inevitably have differing opinions. How will these be balanced? Additionally, one might presume that information that is either damaging or distressing to the data subjects should not necessarily be withheld from them as it is likely to be extremely important.
My Lords, we have covered a range of issues in our debate on this grouping; nevertheless, I will try to address each of them in turn. I thank the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Harding, for their Amendments 95, 96, 98, 100, 102 to 104 and 106 regarding notification requirements.
First, with regard to the amendments in the name of the noble Baroness, Lady Harding, I say that although the Government support the use of public data sources, transparency is a key data protection principle. We do not agree that such use of personal data should remove or undermine the transparency requirements. The ICO considers that the use and sale of open electoral register data alone is likely not to require notification. However, when the data is combined with data from other sources, in order to build an extensive profile to be sold on for direct marketing, notification may be proportionate since the processing may go beyond the individual’s reasonable expectations. When individuals are not notified about processing, it makes it harder for them to exercise their data subject rights, such as the right to object.
Adding other factors to the list of what constitutes a “disproportionate effort” for notification is unnecessary given that the list is already non-exhaustive. The “disproportionate effort” exemption must be applied according to the safeguards of the wider data protection framework. According to the fairness principle, controllers should already account for whether the processing meets the reasonable expectations of a data subject. The data minimisation and purpose limitation principles also act as an important consideration for data controllers. Controllers should continue to assess on a case-by-case basis whether they meet the threshold for the existing exemptions to notify; if not, they should notify. I hope that this helps clarify our position on that.
The Secretary of State can help describe specific cases in the future but, on the point made by my noble friend Lord Knight, the ICO guidance will clarify some of that. There will be prior consultation with the ICO before that guidance is finalised, but if noble Lords are in any doubt about this, I am happy to write and confirm that in more detail.
Amendment 115 in the names of the noble Lords, Lord Clement-Jones, Lord Lucas and Lord Knight, and Amendment 123A in the name of the noble Lord, Lord Holmes, seek to ensure that individuals are provided with clear and accessible information about solely automated decision-making. The safeguards set out in Clause 80, alongside the wider data protection framework’s safeguards, such as the transparency principle, already achieve this purpose. The UK GDPR requires organisations to notify individuals about the existence of automated decision-making and provide meaningful information about the logic involved in a clear and accessible format. Individuals who have been subject to solely automated decisions must be provided with information about the decisions.
On Amendment 116 in the names of the noble Viscount, Lord Camrose, and the noble Lord, Lord Markham, I reassure noble Lords that Clause 69 already provides a definition of consent that applies to all processing under the law enforcement regime.
On Amendment 117 in the names of the noble Viscount, Lord Camrose, the noble Lord, Lord Markham, and my noble friend Lord Knight, I agree with them on the importance of protecting the sensitive personal data of children by law enforcement agencies, and there is extensive guidance on this issue. However, consent is rarely used as the basis for processing law enforcement data. Other law enforcement purposes, such as the prevention, detection and investigation of crime, are quite often used instead.
I will address Amendment 118 in the name of the noble Viscount, Lord Camrose, and Amendment 123B in the name of the noble Lord, Lord Holmes, together, as they focus on obtaining human intervention for a solely automated decision. I agree that human intervention should be carried out competently and by a person with the authority to correct a wrongful outcome. However, the Government believe that there is currently no need to specify the qualifications of human reviewers as the ICO’s existing guidance explains how requests for human review should be managed.
Does the Minister agree that the crux of this machinery is solely automated decision-making as a binary thing—it is or it is not—and, therefore, that the absolute key to it is making sure that the humans involved are suitably qualified and finding some way to do so, whether by writing a definition or publishing guidelines?
I thank the noble Lord, Lord Clement-Jones; let me consider it a marker for future discussion.
I thank the noble Lord, Lord Clement-Jones, for coming to my rescue there.
I turn to the Clause 81 stand part notice tabled by the noble Lord, Lord Clement-Jones, which would remove Clause 81 from the Bill. Section 62 of the Data Protection Act requires law enforcement agencies to record their processing activities, including their reasons for accessing and disclosing personal information. Entering a justification manually was intended to help detect unauthorised access. The noble Lord was right that the police do sometimes abuse their power; however, I agree with the noble Viscount, Lord Camrose, that the reality is that anyone accessing the system unlawfully is highly unlikely to record that, making this an ineffective safeguard.
Meanwhile, the position of the National Police Chiefs’ Council is that this change will not impede any investigation concerning the unlawful processing of personal data. Clause 81 does not remove the strong safeguards that ensure accountability for data use by law enforcement, including the requirement to record the time and date of access and, where possible, who has accessed the data; these are far more effective in monitoring potential data misuse. We would argue that the requirement to manually record a justification every time case information is accessed places a considerable burden on policing. I think the noble Lord himself said that we estimate that this clause may save approximately 1.5 million policing hours, equivalent to a saving in the region of £42.8 million a year.
Grand Committee

I start by thanking all noble Lords who spoke for their comments and fascinating contributions. We on these Benches share the concern of many noble Lords about the Bill allowing the use of data for research purposes, especially scientific research purposes.
Amendment 59 has, to my mind, the entirely right and important intention of preventing misuse of the scientific research exemption for data reuse by ensuring that the only purpose for which the reuse is permissible is scientific research. Clearly, there is merit in this idea, and I look forward to hearing the Minister give it due consideration.
However, there are two problems with the concept and definition of scientific research in the Bill overall, and, again, I very much look forward to hearing the Government’s view. First, I echo the important points raised by my noble friend Lord Markham. Almost nothing in research or, frankly, in life more broadly is done with only one intention. Even the most high-minded, curiosity-driven researcher will have at the back of their mind the possibility of commercialisation. Alongside protecting ourselves from the cynical misuse of science as a cover story for commercial pursuit, we have to be equally wary of creating law that pushes for the complete absence of the profit motive in research, because to the extent that we succeed in doing that, we will see less research. Secondly—the noble Viscount, Lord Colville, and the noble Lord, Lord Clement-Jones, made this point very powerfully—I am concerned that the broad definition of scientific research in the Bill might muddy the waters further. I worry that, if the terminology itself is not tightened, restricting the exemption might serve little purpose.
On Amendment 62, to which I have put my name, the same arguments very much apply. I accept that it is very challenging to find a form of words that both encourages research and innovation and does not do so at the expense of data protection. Again, I look forward to hearing the Government’s view. I am also pleased to have signed Amendment 63, which seeks to ensure that personal data can be reused only if doing so is in the public interest. Having listened carefully to some of the arguments, I feel that the public interest test may be more fertile ground than a kind of research motivation purity test to achieve that very difficult balance.
On Amendment 64, I share the curiosity to hear how the Minister defines research and statistical processes —again, not easy but I look forward to her response.
Amendment 65 aims to ensure that research seeking to use the scientific research exemption to obtaining consent meets the minimum levels of scientific rigour. The aim of the amendment is, needless to say, excellent. We should seek to avoid creating opportunities which would allow companies—especially but not uniquely AI labs—to cloak their commercial research as scientific, thus reducing the hoops they must jump through to reuse data in their research without explicit consent. However, Amendment 66, tabled in my name, which inserts the words:
“Research considered scientific research that is carried out as a commercial activity must be subject to the approval of an independent ethics committee”,
may be a more adaptive solution.
Many of these amendments show that we are all quite aligned in what we want but that it is really challenging to codify that in writing. Therefore, the use of an ethics committee to conduct these judgments may be the more agile, adaptive solution.
I confess that I am not sure I have fully understood the mechanism behind Amendments 68 and 69, but I of course look forward to the Minister’s response. I understand that they would essentially mean consent by failing to opt out. If so, I am not sure I could get behind that.
Amendment 130 would prevent the processing of personal data for research, archiving and statistical purposes if it permits the identification of a living individual. This is a sensible precaution. It would prevent the sharing of unnecessary or irrelevant information and protect people’s privacy in the event of a data breach.
Amendment 132 appears to uphold existing patient consent for the use of their data for research, archiving and statistical purposes. I just wonder whether this is necessary. Is that not already the case?
Finally, I turn to the Clause 85 stand part notice. I listened carefully to the noble Lord, Lord Clement-Jones, but I am not, I am afraid, at a point where I can support this. There need to be safeguards on the use of data for this purpose; I feel that Clause 85 is our way of having them.
My Lords, it is a great pleasure to be here this afternoon. I look forward to what I am sure will be some excellent debates.
We have a number of debates on scientific research; it is just the way the groupings have fallen. This is just one of several groupings that will, in different ways and from different directions, probe some of these issues. I look forward to drilling down into all the implications of scientific research in the round. I should say at the beginning—the noble Lord, Lord Markham, is absolutely right about this—that we have a fantastic history of and reputation for doing R&D and scientific research in this country. We are hugely respected throughout the world. We must be careful that we do not somehow begin to demonise some of those people by casting aspersions on a lot of the very good research that is taking place.
A number of noble Lords said that they are struggling to know what the definition of “scientific research” is. A lot of scientific research is curiosity driven; it does not necessarily have an obvious outcome. People start a piece of research, either in a university or on a commercial basis, and they do not quite know where it will lead them. Then—it may be 10 or 20 years later—we begin to realise that the outcome of their research has more applications than we had ever considered in the past. That is the wonderful thing about human knowledge: as we build and we learn, we find new applications for it. So I hope that whatever we decide and agree on in this Bill does not put a dampener on that great aspect of human knowledge and the drive for further exploration, which we have seen in the UK in life sciences in particular but also in other areas such as space exploration and quantum. Noble Lords could probably identify many more areas where we are increasingly getting a reputation for being at the global forefront of this thinking. We have to take the public with us, of course, and get the balance right, but I hope we do not lose sight of the prize we could have if we get the regulations and legislation right.
Let me turn to the specifics that have been raised today. Amendments 59 and 62 to 65 relate to scientific provisions, and the noble Lord, Lord Clement-Jones, the noble Viscount, Lord Colville, and others have commented on them. I should make it clear that this Bill is not expanding the meaning of “scientific research”. If anything, it is restricting it, because the reasonableness test that has been added to the legislation—along with clarification of the requirement for research to have a lawful basis—will constrain the misuse of the existing definition. The definition is tighter, and we have attempted to do that in order to make sure that some of the new developments and technologies coming on stream will fall clearly within the constraints we are putting forward in the Bill today.
Amendments 59 and 62 seek to prevent misuse of the exceptions for data reuse. I assure the noble Viscount, Lord Colville, that the existing provisions for research purposes already prevent the controller taking advantage of them for any other purpose they may have in mind. That is controlled.
I am not quite sure about the groupings, either, but let us go with what we have. I thank noble Lords who have spoken, and the noble Viscount, Lord Camrose, for his amendments. I hope I am able to provide some reassurance for him on the points he raised.
As I said when considering the previous group, the Bill does not expand the definition of scientific research. The reasonableness test, along with clarifying the requirement for researchers to have a lawful basis, will significantly reduce the misuse of the existing definition. The amendment seeks to reduce the potential for misuse of the definition of scientific research by commercial companies using AI by requiring scientific researchers for a commercial company to submit their research to an ethics committee. As I said on the previous group, making it a mandatory requirement for all research may impede studies in areas that might have their own bespoke ethical procedures. This may well be the case in a whole range of different research areas, particularly in the university sector, and in sectors more widely. Some of this research may be very small to begin with but might grow in size. The idea that a small piece of start-up research has to be cleared through a formal ethics process at an early stage is expecting too much and will put off a lot of the new innovations that might otherwise come forward.
Amendment 80 relates to Clause 71 and the reuse of personal data. This would put at risk valuable research that relies on data originally generated from diverse contexts, since the difference between the purposes may not always be compatible.
Turning to Amendment 67, I can reassure noble Lords that the concept of broad consent is not new. Clause 68 reproduces the text from the current UK GDPR recitals because the precise definition of scientific research may become clear only during later analysis of the data. Obtaining broad consent for an area of research from the outset allows scientists to focus on potentially life-saving research. Clause 68 has important limitations. It cannot be used if the researcher already knows the specific purpose—an important safeguard that should not be removed. It also includes a requirement to give the data subject the choice to consent to only part of the research processing, if possible. Most importantly, the data subject can revoke their consent at any point. I hope this reassures the noble Viscount, Lord Camrose, and he feels content to withdraw his amendment on this basis.
I thank the noble Viscount, Lord Colville, and the noble Lord, Lord Clement-Jones, for their remarks and support, and the Minister for her helpful response. Just over 70% of scientific research in the UK is privately funded, 28% is taxpayer funded and around 1% comes through the charity sector. Perhaps the two most consequential scientific breakthroughs of the last five years, Covid vaccines and large language models, have come principally from private funding.
Grand Committee
First, let me say what a pleasure it is to be back on this old ground again, although with slightly different functions this time round. I very much support what the noble Viscount, Lord Camrose, said. We want to get the wording of this Bill right and to have a robust Bill; that is absolutely in our interests. We are on the same territory here. I thank the noble Viscount and other noble Lords for expressing their interest.
On Amendments 1 and 2, the Government consider the terms used in Part 1, as outlined in Clause 1, necessary to frame the persons and the data to which a scheme will apply. The noble Lord, Lord Clement-Jones, mentioned the powers. I assure him that the powers in Part 1 sit on top of the Data Protection Act. They are not there instead of it; they are another layer on top of it, and they provide additional rights over and above what already exists.
In relation to the specific questions from the noble Viscount, Lord Camrose, and the noble Lord, Lord Markham, smart data schemes require suppliers or providers of goods, services or digital content to provide data. They are referred to as “traders” in accordance with recent consumer legislation, including the Consumer Rights Act 2015. The term “data holder” ensures that the requirements may also be imposed on any third party that might hold the data on the trader’s behalf. That is why these additional terms have been included: it is based on existing good legislation. I hope noble Lords will recognise why this is necessary and that this explains the rationale for these terms. These terms are independent of terms in data protection legislation; they have a different scope and that is why separate terms are necessary. I hope that, on that basis, the noble Viscount will withdraw his amendment.
I thank the Minister for that explanation. I see the point she makes that, in existing legislation, these terms are used. I wonder whether there is anything we can do better to explain the terms. There seems to be significant overlap between processors, holders, owners and traders. The more we can do to clarify absolutely, with great rigour, what those terms mean, the more we will bring clarity and simplicity to this necessarily complex body of law.
I thank the Minister for explaining the rationale. I am satisfied that, although it may not be the most elegant outcome, for the time being, in the absence of a change to the 2015 Act that she references, we will probably have to grin and bear it. I beg leave to withdraw the amendment.
My Lords, Amendments 3, 4 and 20 seek to probe the Government’s position on the roles of the Secretary of State and the Treasury. Amendment 6 seeks to probe whether the Treasury or the Secretary of State shall have precedence when making regulations under this Bill.
Clarity over decision-making powers is critical to good governance, in particular over who has final decision rights and in what circumstances. Throughout Part 1 of the Bill, the Secretary of State and the Treasury are both given regulation-making powers, often on the same matter. Our concern is that having two separate Ministers and two departments responsible for making the same regulations is likely to cause problems. What happens if and when the departments have a difference of opinion on what these regulations should contain or achieve? Who is the senior partner in the relationship? When it comes to putting statute on paper, who has the final say, the Secretary of State or the Treasury?
All the amendments are probing and, at this point, simply seek greater clarification from the Government. If the Minister can explain why two departments are jointly responsible for the same regulations, why this is necessary and a good idea, and what provisions will be in place to avoid legislative confusion, I will be happy not to press the amendments.
The amendments in group 2 cover smart data and relate to the Secretary of State and the Treasury. Apart from the financial services sector clauses, most of the powers in Part 1, as well as the statutory spending authority in Clause 13, are conferred on the Secretary of State and the Treasury. That is the point that the noble Viscount made. These allow the relevant government departments to make smart data regulations. Powers are conferred on the Treasury as the department responsible for financial services, given the Government’s commitment to open banking and open finance. There is no precedence between the Secretary of State and the Treasury when using these powers, as regulations are likely to be made by the department responsible for the sector to which the smart data scheme applies, following, as with other regulations, the appropriate cross-government write-round and collective agreement procedures. I add that interdepartmental discussions are overseen by the Smart Data Council, which will give advice on this issue.
The noble Viscount raises concerns relating to Clause 13. Just as regulations may be made by the relevant government department, it is most appropriate for financial assistance to be provided by the government department responsible for the smart data scheme in question. Clause 13 is intended to provide statutory authority for that assistance, as a matter of regularity. It is for these reasons that I urge the noble Viscount not to press these amendments. These are standard procedures where the Treasury is involved and that is why more than one department is referenced.
I thank the Minister for that explanation. I am pleased to hear that these are standard procedures. Will she put that in writing, in a letter to me, explaining and setting it out so that we have it on the record? It is really important to understand where the decisions break down and to have a single point of accountability for all such decisions and, if it cannot be in the Bill, it could at least be explained elsewhere. Otherwise, I am happy to proceed with the explanation that she has kindly given.
I thank my noble friends Lord Lucas and Lord Arbuthnot for their Amendments 5, 34, 48, 200 and 202. They and other noble Lords who have spoken have powerfully raised some crucial issues in these amendments.
Amendment 5 addresses a key gap, and I take on board what my noble friend Lord Markham said, in how we manage and use customer data in specific contexts. At its heart, it seeks to enable effective communication between organisations holding customer data and customers themselves. The ability to communicate directly with individuals in a specified manner is vital for various practical reasons, from regulatory compliance to research purposes.
One clear example of where this amendment would be crucial is in the context of the Student Loans Company. Through this amendment, the Secretary of State could require the SLC to communicate with students for important purposes, such as conducting research into the outcomes of courses funded by loans. For instance, by reaching out to students who have completed their courses, the SLC could gather valuable insights into how those qualifications have impacted on their employment prospects, income levels or career trajectories. This is the kind of research that could help shape future educational policies, ensuring that loan schemes are working as intended and that the investments made in students’ education are yielding tangible benefits. This, in turn, would allow for better decision-making on future student loans funding and educational opportunities.
Amendment 34 from my noble friend Lord Arbuthnot proposes a welcome addition to the existing clause, specifically aiming to ensure that public authorities responsible for ascertaining key personal information about individuals are reliable in their verification processes and provide clear, accurate metadata on that information. This amendment addresses the essential issue of trust and reliability in the digital verification process. We increasingly rely on digital systems to confirm identity, and for these systems to be effective, we have to make sure that the core information they are verifying is accurate and consistent. If individuals’ key identifying details—date of birth, place of birth and, as we heard very powerfully, sex at birth—are not consistently or accurately recorded across various official databases, it undermines the integrity of the digital verification process. It is important that we have consistency across the public authorities listed in this amendment. By assessing whether these bodies are accurately verifying and maintaining this data, we can ensure uniformity in the information they provide. This consistency is essential for establishing a reliable foundation for digital verification.
When we consider the range of public services that rely on personal identification information, from the NHS and His Majesty’s Revenue and Customs to the Home Office, they are all responsible for verifying identity in some capacity. The amendment would ensure that the data they are using is robust, accurate and standardised, creating smoother interactions for individuals seeking public services. It reduces the likelihood of discrepancies that delay or prevent access to public services.
Amendment 48 would introduce important protections for the privacy and integrity of personal information disclosed by public authorities. In our increasingly digital world, data privacy has become one of the most pressing concerns for individuals and for society. By requiring public authorities to attest to the accuracy, integrity and clarity of the data they disclose, the amendment would help to protect the privacy of individuals and ensure that their personal information was handled with the proper care and respect.
My noble friend Lord Lucas’s Amendment 200 would introduce a data dictionary. It would allow the Secretary of State to establish regulations defining key terms used in digital verification services, birth and death registers, and public data more generally. I heard clearly the powerful arguments about sex and gender, but I come at the issue of data dictionaries from the angle of the efficiency, effectiveness and reusability of the data that these systems generate. The more that we have a data dictionary defining the metadata, the more we will benefit from the data used, whichever of these bodies generates the data itself. I am supportive of the requirement to use a data dictionary to provide standardised definitions in order to avoid confusion and ensure that data used in government services is accurate, reliable and consistent. The use of the negative resolution procedure would ensure that Parliament had oversight while allowing for the efficient implementation of these definitions.
Amendment 202 would create a national register for school admissions rules and outcomes in England. This would be a crucial step towards increasing transparency and ensuring fairness in the school admissions process, which affects the lives of millions of families every year. We want to ensure that navigating the school admissions system is not an overly opaque or complex process for parents. With different schools following different rules, criteria and procedures, it can, as my noble friend Lord Lucas pointed out, be difficult for families to know what to expect or how best to make informed decisions. The uncertainty can be especially challenging for those who are new to the system, those who face language barriers or those in areas where the school’s rules are not readily accessible or clear.
For many parents, particularly those in areas with complex school systems or scarce school places, access to clear, consistent information can make all the difference. This amendment would allow parents to see exactly how the school admissions process works and whether they were likely to secure a place at their preferred school. By laying out the rules in advance, the system would ensure that parents could make better informed decisions about which schools to apply to, based on criteria such as proximity, siblings or academic performance.
We want to ensure that parents understand how decisions are made and whether schools are adhering to the rules fairly. By requiring all schools to publish their admissions rules and the outcomes of their admissions process, the amendment would introduce a level of accountability. I join other noble Lords in strongly supporting this amendment, as it would create a more effective and efficient school admissions system that works for everyone.
My Lords, we have had a good and wide-ranging discussion on all this. I will try to deal with the issues as they were raised.
I thank the noble Lord, Lord Lucas, for the proposed Amendment 5 to Clause 2. I am pleased to confirm that the powers under Clauses 2 and 4 can already be used to provide customer data to customers or third parties authorised by them, and for the publication or disclosure of wider data about the goods or services that the supplier provides. The powers provide flexibility as to when and how the data may be provided or published, which was in part the point that the noble Viscount, Lord Camrose, was making. The powers may also be used to require the collection and retention of specific data, including to require new data to be gathered by data holders so that this data may be made available to customers and third parties specified by regulations.
I note in particular the noble Lord’s interest in the potential uses of these powers for the Student Loans Company. It would be for the Department for Education to consider whether the use of the smart data powers in Part 1 of the Bill may be beneficial in the context of providing information about student loans and to consult appropriately if so, rather than to specify it at this stage in the Bill. I hope the noble Lord will consider those points and how it can best be pursued with that department in mind.
On Amendments 34, 48 and 200, the Government believe that recording, storing and sharing accurate data is essential to deliver services that meet citizens’ needs. Public sector data about sex and gender is collected based on user needs for data and any applicable legislation. As noble Lords have said, definitions and concepts of sex and gender differ.
Amendment 48 would require that any information shared must be accurate, trusted and accompanied by metadata. Depending on the noble Lord’s intentions here, this could either duplicate existing protections under data protection legislation or, potentially, conflict with them and other legal obligations.
The measures in Part 2 of the Bill are intended to secure the reliability of the process by which citizens verify their data. It is not intended to create new ways to determine a person’s sex or gender but rather to allow people to digitally verify the facts about themselves based on documents that already exist. It worries me that, if noble Lords pursued their arguments, we could end up with a passport saying one thing and a digital record saying something different. We have to go back to the original source documents, such as passports and birth certificates, and rely on them for accuracy, which would then feed into the digital record—otherwise, as I say, we could end up pointing in two different directions.
I reassure the noble Lord, Lord Arbuthnot, that my colleague, Minister Clark, is due to meet Sex Matters this week to discuss digital verification services. Obviously, I am happy to encourage that discussion. However, to prescribe where public authorities can usefully verify “sex at birth”, as noble Lords now propose, extends well beyond the scope of the measures in the Bill, so I ask them to reflect on that and whether this is the right place to pursue those issues.
In addition, the Government recently received the final report of the Sullivan review of data, statistics and research on sex and gender, which explores some of these matters in detail. These matters are more appropriately considered holistically—for example, in the context of that report—rather than by a piecemeal approach, which is what is being proposed here. We are currently considering our response to that report. I hope noble Lords will consider that point as they consider their amendments; this is already being debated and considered elsewhere.
Amendment 202 seeks to create a national register of individual school admissions arrangements and outcomes, which can be used to provide information to parents to help them understand their chances of securing a place at their local school. I agree with the noble Lord that choosing a school for their child is one of the most important decisions that a parent can make. That is why admissions authorities are required to publish admission arrangements on their schools’ websites. They must also provide information to enable local authorities to publish an annual admissions prospectus for parents, including admissions arrangements and outcomes for all state schools in their area.
I refer the noble Lord, Lord Lucas, to the School Information (England) Regulations 2008, which require admission authorities and local authorities to publish prescribed information relating to admissions. Those protections are already built into the legislation, and if a local authority is not complying with that, there are ways of pursuing it. We believe that the existing approach is proportionate, reflects the diversity of admissions arrangements and local circumstances, and is not overly burdensome on schools or local authorities, while still enabling parents to have the information they need about their local schools.
I hope that, for all the reasons I have outlined, noble Lords will be prepared not to press their amendments.
My Lords, I am delighted that the Government have chosen to take forward the smart data schemes from the DPDI Bill. The ability seamlessly to harness and use data is worth billions to the UK economy. However, data sharing and the profit that it generates must be balanced against proper oversight.
Let me start by offering strong support to my noble friend Lord Arbuthnot’s Amendment 7. Personally, I would greatly welcome a more sophisticated and widespread insurance market for cyber protections. Such a market would be based on openly shared data; the widespread publication of that data, as set out in the amendment, could help to bring this about.
I also support in principle Amendments 8 and 10 in the name of the noble Lord, Lord Clement-Jones, because, as I set out on the previous group, there is real and inherent value in interoperability. However, I wonder whether the noble Lord might reconsider the term “machine readable” and change it to something— I do not think that I have solved it—a bit more like “digitally interoperable”. I just worry that, in practice, everything is machine-readable today and the term might become obsolete. I am keen to hear the Minister’s response to his very interesting Amendment 31 on the compulsion of any person to provide data.
I turn to the amendments in my name. Amendment 16 would insert an appeals mechanism for a person who is charged a fee under subsection (1). It is quite reasonable that persons listed under subsection (2)—that is, data holders, decision-makers, interface bodies, enforcers and others with duties or powers under these regulations—may charge a fee for the purposes of meeting the expenses they incur, performing duties or exercising powers imposed by regulations made under this part. However, there should be an appeals mechanism so that, in the event that a person is charged an unreasonable fee, they have a means of recourse.
Amendment 17 is a probing amendment intended to explore the rate at which interest accrues on money owed to specific public authorities for unpaid levies. Given that this interest will be mandated by law, do the Government intend to monitor the levels and, if so, how?
Amendment 18 is a probing amendment designed to explore how the Government intend to deal with a situation when a person listed under subsection (2) of this clause believes they have been charged a levy wrongly. Again, it is reasonable that an appeals mechanism be created, and this would ensure that those who considered themselves to have been wrongly charged have a means of recourse.
Amendment 19 is looking for clarification on how the Government envisage unpaid levies being recovered. I would be grateful if the Minister could set out some further detail on that matter.
Amendment 21 is a probing amendment. I am curious to know the maximum value of financial assistance that the Government would allow the Secretary of State or the Treasury to give to persons under Clause 13. I do not think it would be prudent for the Government to become a financial backstop for participants in smart data schemes, so on what basis is that maximum going to be calculated?
Amendment 22 follows on from those concerns and looks to ensure that there is parliamentary oversight of any assistance provided. I am most curious to hear the Minister’s comments on this matter.
Amendment 23 is a straightforward—I think—amendment to the wording. I feel that the phrase “reasonably possible” seems to open the door to almost limitless endeavours and therefore suggest replacing it with “reasonably practicable”.
On Amendment 25, easy access to the FCA’s policy regarding penalties and levies is important. That would allow oversight, not only parliamentary but by those who are directly or indirectly affected by decisions taken under this policy. I therefore believe the amendment is necessary, as a website is the most accessible location for that information. Furthermore, regular review is necessary to ensure that the policy is functioning and serving its purpose.
Amendments 26 and 27 return to the matter of an appeals process. I will not repeat myself too much, but it is important to be able to appeal penalties and to create a route by which individuals understand how they can go about doing so.
Amendment 28 would ensure that, when the Secretary of State and the Treasury review the regulations made under Part 1 of the Bill, they do so concurrently. This amendment would prevent separate reviews being conducted that may contradict each other or be published at different times; it would force the relevant departments to produce one review and to produce it together. This would be prudent. It would prevent the Government doing the same work twice, unnecessarily spending public money, and would prevent contradictory reviews, which may cause confusion and financial costs in the smart data scheme industry.
Lastly, Amendment 29, which would ensure that Section 10 of this part was subject to the affirmative procedure, would allow for parliamentary oversight of regulations made under this clause.
We are pleased that the Government have chosen to bring smart data schemes forward, but I hope the Minister can take my concerns on board and share with us some of the detail in her response.
My Lords, we have had a detailed discussion, and it may be that I will not be able to pick up all the points that noble Lords have raised. If I do not, I guarantee to write to people.
First, I want to pick up the issues raised by the noble Lord, Lord Arbuthnot, and the noble Baroness, Lady Neville-Jones, about cybersecurity and cyber resilience. This Government, like previous Governments, take this issue hugely seriously. It is built into all our thinking. The noble Lord, and the noble Baroness in particular, will know that the advice we get on all these issues is top class. The Government are already committed to producing a cybersecurity and resilience Bill within this Parliament. We have all these things in hand, and that will underpin a lot of the protections that we are going to have in this Bill and others. I agree with noble Lords that this is a hugely important issue.
I am pleased to confirm that Clause 3(7) allows the regulations to impose requirements on third-party recipients in relation to the processing of data, which will include security-related requirements. So it is already in the Bill, but I assure noble Lords that it will be underpinned, as I say, by other legislation that we are bringing forward.
In relation to Amendments 8 and 10, I am pleased to confirm that Clause 5(4) clarifies that regulations may make provision about the providing or publishing of business data and the format in which that must be provided. That may include relevant energy-related data. The noble Lord gave some very good examples about how useful those connections and that data could be; he was quite right to raise those issues.
Regarding Amendment 9, in the name of the noble Lord, Lord Clement-Jones, I am pleased to confirm that there is nothing to prevent regulations requiring the provision of business data to government departments, publicly owned bodies and local and regional authorities. This is possible through Clause 4(1)(b), which allows regulations to require provision of business data to a person of a specified description. I hope the noble Lord will look at those cross-references and be satisfied by them.
Noble Lords spoke about the importance of sensitive information in future smart data schemes. A smart data scheme about legal services is not currently under consideration. Having said that, the Government would have regard to the appropriateness of such a scheme and the nature of any data involved and would consult the sector and any other appropriate stakeholders if that was being considered. It is not at the top of our list of priorities, but the noble Lord might be able to persuade us that it would have some merit, and we could start a consultation based on that.
Amendments 16 to 22 consider fees and the safeguards applying to them, which were raised by the noble Viscount. Fees and levies, enabled by Clauses 11 and 12, are an essential mechanism to fund a smart data scheme. The Government consider that appropriate and proportionate statutory safeguards are already built in. For example, requirements in Clause 11(3) and Clause 12(2) circumscribe the expenses in relation to which fees or the levy may be charged, and the persons on whom they may be charged.
Capping the interest rate for unpaid money, which is one of the noble Viscount’s proposals, would leave a significant risk of circumstances in which it might be financially advantageous to pay the levy late. The Government anticipate that regulations would provide an appropriate mechanism to ensure payment of an amount that is reasonable in the context of a late payment. Just as regulations may be made by the relevant government department, it is most appropriate for financial assistance to be provided by the government department responsible for the smart data scheme in question. Clause 13 is intended to provide statutory authority for that assistance as a matter of regularity.
Amendments 23 to 27 deal with the clauses relating to the FCA. Clause 15(3) is drafted to be consistent with the wording of established legislation which confers powers on the FCA, most notably the Financial Services and Markets Act 2000. Section 1B of that Act uses the same formulation, using the phrase
“so far as is reasonably possible”
in relation to the FCA’s general duties. This wording is established and well understood by both the FCA and the financial services sector as it applies to the FCA’s strategic and operational objectives. Any deviation from it could create uncertainty and inconsistency.
Amendment 24 would cause significant disruption to current data-sharing arrangements and fintech businesses. Reauthenticating this frequently with every data holder would add considerable friction to open banking services and greatly reduce the user experience—which was the point raised by the noble Lord, Lord Clement-Jones. For example, it is in the customer’s interest to give ongoing consent to a fintech app to provide them with real-time financial advice that might adapt to daily changes in their finances.
Many SMEs provide ongoing access to their bank accounts in order to receive efficient cloud accounting services. If they had to re-register frequently, that would undermine the basis and operability of some of those services. It could inhibit the adoption and viability of open banking, which would defeat one of the main purposes of the Bill.
I thank noble Lords for their comments and contributions in what has been an absolutely fascinating debate. I have a couple of points to make.
I agree with the noble Lord, Lord Clement-Jones, on his Amendment 33, on ongoing monitoring, and on his Amendment 50. Where we part company, I think, is on his Amendment 36. I feel that we will never agree about the effectiveness or otherwise of five-year strategies, particularly in the digital space. I simply do not buy that his amendment would have the effects that the noble Lord desires.
I do not necessarily agree with the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, that we should put extra burdens around the right to use non-digital methods. In my opinion, and I very much look forward to hearing from the Minister on this matter, the Act preserves that right quite well as it is. I look forward to the Government’s comments on that.
I strongly support the noble Viscount, Lord Colville, on his very important point about international standards. I had intended to sign his amendment but I am afraid that, for some administrative reason, that did not happen. I apologise for that, but I will sign it because I think that it is so important. In my opinion, not much of the Bill works in the absence of effective international collaboration on these matters. We are particularly going to run up against this issue when we start talking about ADM, AI and copyright. It is international standards that will allow us to enforce any of the provisions that we put in here. I am agnostic on whether this happens via W3C, the ITU or other international standards bodies, but we must proceed on the principle that international standards are what will get us over the line. I look forward to hearing the Minister’s confirmation of the importance, in the Government’s view, of such standards.
Let me turn to the amendments listed in my name. Amendment 37 would ensure parliamentary oversight of the DVS trust framework. Given the volume of sensitive data that these service providers will be handling, it is so important that Parliament can keep an eye on how the framework operates. I thank noble Lords for supporting this amendment.
Amendment 40 is a probing amendment. To that end, I look forward to hearing the Minister’s response. Accredited conformity assessment bodies are charged with assessing whether a service complies with the DVS framework. As such, they are giving a stamp of approval from which customers will draw a sense of security. Therefore, the independence of these accreditation bodies must be guaranteed. Failing to do so would allow the industry to regulate itself. Can the Minister set out how the Government will guarantee the independence of these accreditation bodies?
Amendment 49 is also a probing amendment. It is designed to explore the cybersecurity measures that the Government expect of digital verification services. Given the large volume of data that these services will be handling, it is essential that the Government demand substantial cybersecurity measures. This is a theme that we are going to come back to again and again; we heard about it earlier, and I think that we will come on to more of this. As these services become more useful and more powerful, they present a bigger attack surface that we have to defend, and I look forward to hearing how we will do that.
I thank the noble Lords, Lord Clement-Jones and Lord Markham, the noble Viscount, Lord Colville, and the noble Baroness, Lady Kidron, for raising these topics around digital verification services. As I explained at Second Reading, these digital verification services already exist. They are already out there making all sorts of claims for themselves. With the new trust framework, we are trying to provide some more statutory regulation of the way that they operate. It is important that we have this debate and that we get it right, but some of the things we are doing are still work in progress, which is why we do not always have all the detailed answers that noble Lords are searching for here and why some powers have been left to the Secretary of State.
I shall go from the top through the points that have been raised. Amendments 33 and 43, tabled by the noble Lord, Lord Clement-Jones, and Amendment 40, tabled by the noble Viscount, Lord Colville, would require the trust framework to include rules on monitoring compliance and redress mechanisms, and would require the Secretary of State to ensure the independence of accredited conformity assessment bodies. The noble Baroness, Lady Kidron, asked similar questions regarding redress for the vulnerable, and I will write to her setting out a response in more detail.
On the issue of redress mechanisms in the round, the scope of the trust framework document is solely focused on the rules that providers of digital verification services are required to follow. It does not include matters of governance. Compliance is ensured via a robust certification process where services are assessed against the trust framework rules. They are assessed by independent conformity assessment bodies accredited by the United Kingdom Accreditation Service, so some oversight is already being built into this model.
The Bill contains powers for the Secretary of State to refuse applications to the DVS register, or to remove providers, where he is satisfied that the provider has failed to comply with the trust framework or if he considers it necessary in the interests of national security. These powers are intended as a safety net: for example, to account for situations where the Secretary of State might have access to intelligence that independent conformity assessment bodies cannot assess and therefore cannot react to, or where a particular failure of the security of one of these trust marks comes to light very quickly and we want to act against it at once. That is why the Secretary of State has powers to react quickly in what might be a national security situation, or to some other potential leak of important data.
In addition, conformity assessment bodies carry out annual surveillance audits and can choose to conduct spot audits on certified providers, and they have the power to withdraw certification where non-conformities are found. Adding rules on compliance would cut across that independent certification process and would be outside the scope of the trust framework. Those independent certification processes already exist.
Amendments 33, 41, 42, 44 and 45 tabled by the noble Lord, Lord Clement-Jones, would in effect require the creation of an independent appeals body to adjudicate on the refusal of an application to the DVS register and the implementation of an investigatory process applicable to refusal and removal from the DVS register. The powers of the Secretary of State in this regard are not without safeguards. They may be exercised only in limited circumstances after the completion of an investigatory process and are subject to public law principles, for example, reasonableness. They may also be challenged by judicial review.
To go back to the point I was making, it might be something where we would need to move quickly. Rather than having a convoluted appeals process in the way that the noble Lord was talking about, I hope he understands the need sometimes for that flexibility. The creation and funding of an independent body to adjudicate such a limited power would therefore be inappropriate.
It would be reassuring if the Minister could share with us some of the meetings that the Secretary of State or Ministers are having with those bodies on the subject of these internationally shared technical standards.
I might need to write to the noble Viscount, but I am pretty sure that that is happening at an official level on a fairly regular basis. The noble Viscount raises an important point. I reassure him that those discussions are ongoing, and we have huge respect for those international organisations. I will put the detail of that in writing to him.
I turn to Amendment 37, tabled by the noble Viscount, Lord Camrose, which would require the DVS trust framework to be laid before Parliament. The trust framework contains auditable rules to be followed by registered providers of digital verification services. The rules, published in their third non-statutory iteration last week on GOV.UK, draw on and often signpost existing technical requirements, standards, best practice, guidance and legislation. It is a hugely technical document, and I am not sure that Parliament would make a great deal of sense of it if it were put forward in its current format. However, the Bill places consultation on a statutory footing, ensuring that it must take place when the trust framework is being prepared and reviewed.
Amendments 36 and 38, tabled by the noble Lord, Lord Clement-Jones, would create an obligation for the Secretary of State to reconsult and publish a five-year strategy on digital verification services. It is important to ensure that the Government have a coherent strategy for enabling the digital verification services market. That is why we have already consulted publicly on these measures, and we continue to work with experts. However, given the nascency of the digital identity market and the pace of those technological developments, as the noble Viscount, Lord Camrose, said, forecasting five years into the future is not practical at this stage. We will welcome scrutiny through the publication of the annual report, which we are committed to publishing, as required by Clause 53. This report will support transparency through the provision of information, including performance data regarding the operation of Part 2.
Amendment 39, also tabled by the noble Lord, Lord Clement-Jones, proposes to exclude certified public bodies from registering to provide digital verification services. We believe that such an exclusion could lead to unnecessary restrictions on the UK’s young digital verification market. The noble Lord mentioned the GOV.UK One Login programme, which is aligned with the standards of the trust framework but is a separate government programme giving people a single sign-on service to access public services. It operates under different legal powers from those being proposed here. We do not accept that we need to exclude public bodies from the scrutiny that would otherwise take place.
Amendment 46 seeks to create a duty for organisations that require verification and use digital verification for that purpose to offer, where reasonably practicable, a non-digital route and ensure that individuals are made aware of both options for verification. I should stress here that the provision in the Bill relates to the provision of digital verification services, not requirements on businesses in general about how they conduct verification checks.
Ensuring digital inclusion is a priority for this Government, which is why we have set up the digital inclusion and skills unit within DSIT. Furthermore, there are already legislative protections in the Equality Act 2010 in respect of protected groups, and the Government will take action in the future if evidence emerges that people are being excluded from essential products and services by being unable to use digital routes for proving their identity or eligibility.
The Government will publish a code of practice for the disclosure of information, subject to parliamentary review, highlighting best practice and the matters to be considered when sharing information. As for Amendment 49, the Government intend to update this code only when required, so an annual review process would not be necessary. I stress to the Committee that digital verification services are not going to be mandatory. It is entirely voluntary for businesses to use them, and it is up to individuals whether they use such a service or not. I think people are feeling that it is going to be imposed on them, and I would push back against that suggestion.
If the regulation-making power in Amendment 50, proposed by the noble Lord, Lord Clement-Jones, were used, it would place obligations on the Information Commissioner to monitor the volume of verification checks being made using the permissive powers to disclose information created in the clause. The role of the commissioner is to regulate data protection in the UK, which already includes monitoring and promoting responsible data-sharing by public authorities. For the reasons set out above, I hope that noble Lords will feel comfortable in not pressing their amendments.
I support that. I completely agree with all the points that the noble Lord, Lord Clement-Jones, made on the previous groupings, but the one that we all agree is absolutely vital is the one just brought up by my noble friend. Coming from the private sector, I am all in favour of a market—I think that it is the right way to go—but standards within that are equally vital.
I come at this issue having had the misfortune of having to manage the cyberattack that we all recall happening against our diagnostic services in hospitals last summer. We found that the weakest link there was through the private sector supplier to that system, and it became clear that the health service—or cybersecurity, or whoever it was—had not done enough to make sure that those standards were set, published and adhered to effectively.
With that in mind, and trying to learn the lessons from it, I think that this clause is vital in terms of its intent, but it will be valuable only if it is updated on a frequent basis. In terms of everything that we have spoken about today, and on this issue in particular, I feel that that point is probably the most important. Although everything that we are trying to do is a massive advance in terms of trying to get the data economy to work even better, I cannot emphasise enough how worrying that attack on our hospitals last summer was at the time.
I thank both noble Lords for raising this; I absolutely concur with them on how important it is. In fact, I remember going to see the noble Viscount, Lord Camrose, when he was in his other role, to talk about exactly this issue: whether the digital verification services were going to be robust enough against cyberattacks.
I pray in aid the noble Lord, Lord Arbuthnot, and the noble Baroness, Lady Neville-Jones, who both felt that the new Cyber Security and Resilience Bill will provide some underpinning for all of this, because our Government take this issue very seriously. As the Committee can imagine, we get regular advice from the security services about what is going on and what we need to do to head it off. Yes, it is a difficult issue, but we are doing everything we can to make sure that our data is safe; that is fundamental.
Amendment 47 would require the Secretary of State to prepare and publish rules on cybersecurity for providers to follow. The existing trust framework includes rules on cybersecurity, against which organisations will be certified. Specifically, providers will be able to prove either that they meet the internationally recognised information security standards or that they have a security management system that matches the criteria set out in the trust framework.
I assure noble Lords that the Information Commissioner’s Office, the National Cyber Security Centre and other privacy stakeholders have contributed to the development of the trust framework. This includes meeting international best practice on encryption and cryptographic techniques. I will happily write to noble Lords to reassure them further by detailing the range of protections already in place. Alternatively, if noble Lords here today would benefit from an official technical briefing on the trust framework, we would be delighted to set up such a meeting because it is important that we all feel content that this will be a robust system, for exactly the reasons that the noble Lord, Lord Markham, explained. We are absolutely on your Lordships’ side and on the case on all this; if it would be helpful to have a meeting, we will certainly do that.
I thank the Minister and my noble friend Lord Markham for those comprehensive and welcome comments. I would certainly like to take up the Minister’s offer of a technical briefing on the trust framework; that really is extremely important.
To go briefly off-piste, one sign that we are doing this properly will be the further development of an insurance marketplace for cybersecurity. It exists but is not very developed at the moment. As and when this information is regularly published and updated, we will see products becoming available that allow people to take insurance based on known risks around cybersecurity.
As I say, I take comfort from the Minister’s words and look forward to attending the tech briefing. When it comes, the cyber Bill will also play a serious role in this space and I look forward to seeing how, specifically, it will interact with DVS and the other services that we have been discussing and will continue to discuss. I beg leave to withdraw my amendment.
My Lords, I will address the amendments proposed by the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron. I have nothing but the deepest respect for their diligence, and indeed wisdom, in scrutinising all three flavours of the Bill as it has come out, and for their commitment to strengthening the legislative framework against fraud and other misuse of digital systems. However, I have serious reservations about the necessity and proportionality of the amendments under consideration, although I look forward to further debates and I am certainly open to being convinced.
Amendments 51 and 52 would introduce criminal sanctions, including imprisonment, for the misuse of trust marks. While the protection of trust marks is vital for maintaining public confidence in digital systems, I am concerned that introducing custodial sentences for these offences risks overcriminalisation. The misuse of trust marks can and should be addressed through robust civil enforcement mechanisms. Turning every such transgression into a criminal matter would place unnecessary burdens on, frankly, an already strained justice system and risks disproportionately punishing individuals or small businesses for inadvertent breaches.
Furthermore, the amendment’s stipulation that proceedings could be brought only by or with the consent of the Director of Public Prosecutions or the Secretary of State is an important safeguard, yet it underscores the high level of discretion required to enforce these provisions effectively, highlighting the unsuitability of broad criminalisation in this context.
Amendment 53 seeks to expand the definition of identity documents under the Identity Documents Act 2010 to include digital identity documents. While the noble Lord, Lord Clement-Jones, makes a persuasive case, the proposal raises two concerns. First, it risks pre-emptively criminalising actions before a clear and universally understood framework for digital identity verification is in place. The technology and its standards are still evolving, and it might be premature to embed such a framework into criminal law. Secondly, there is a risk that this could have unintended consequences for innovation in the digital identity sector. Businesses and individuals navigating this nascent space could face disproportionate legal risks, which may hinder progress in a field critical to the UK’s digital economy.
Amendment 54 would introduce an offence of knowingly or recklessly providing false information in response to notices under Clause 51. I fully support holding individuals accountable for deliberate deception, but the proposed measure’s scope could lead to serious ambiguities. What constitutes recklessness in this context? Are we inadvertently creating a chilling effect where individuals or businesses may refrain from engaging with the system for fear of misinterpretation or error? These are questions that need to be addressed before such provisions are enshrined in law.
We must ensure that our legislative framework is fit for purpose, upholds the principles of justice and balances enforcement with fairness. The amendments proposed, while they clearly have exactly the right intentions, risk, I fear, undermining these principles. They introduce unnecessary criminal sanctions, create uncertainty in the digital identity space and could discourage good-faith engagement with the regulatory system. I therefore urge noble Lords to carefully consider the potential consequences of these amendments and, while expressing gratitude to the noble Lords for their work, I resist their inclusion in the Bill.
My Lords, of course we want to take trust seriously. I could not agree more that the whole set of proposals is predicated on that. Noble Lords have all made the point, in different ways, that if there is not that level of trust then people simply will not use the services and we will not be able to make progress. We absolutely understand the vital importance of all that. I thank all noble Lords for their contributions on this and I recognise their desire to ensure that fraudulent use of the trust mark is taken seriously, as set out in Amendments 51 and 52.
The trust mark is in the process of being registered as a trademark in the UK. As such, once that is done, the Secretary of State will be able to take appropriate legal action for misuse of it. Robust legal protections are also provided through Clause 50, through the trademark protections, and through other existing legislative provisions, such as the Consumer Protection from Unfair Trading Regulations 2008. There is already legislation that underpins the use of that trust mark. Additionally, each trust mark will have a unique number that allows users to check that it is genuine. These amendments would duplicate those existing protections.
In seeking to make the misuse of a digital identity a criminal offence, which Amendments 53 and 209 attempt to do, the noble Lord offered me several different ways of approaching this, so I will offer him some back. The behaviour he is targeting is already addressed in the Fraud Act 2006, the Computer Misuse Act 1990 and the Data Protection Act 2018. We would argue that it is already covered by existing legislation.
On the noble Lord’s point about the Identity Documents Act 2010, defining every instance of verification as an identity document within the scope of the offences in that Act could create an unclear, complicated and duplicative process for the prosecution of digital identity theft. The provision of digital verification services does not always create one single, comprehensive identity proof—I think this is the point that the noble Viscount, Lord Camrose, was making. People use it in different ways. It might be a yes/no check to ensure that a person is over 18, or it might be a digital verification services provider issuing several derived credentials that can be used in different combinations for different use cases. We have to be flexible enough to deal with all of that, rather than with just a single form of fraudulent act. It would not be appropriate to add digital identity to the list of documents set out in the Identity Documents Act.
Amendment 54 would create an offence of supplying false information to the Secretary of State, but sanctions already exist in this situation, as the organisation can be removed from the DVS register via the power in Clause 41. Similarly, contractual arrangements between the Office for Digital Identities and Attributes and conformity assessment bodies require them to adhere to the principle of truthfulness and accuracy. To create a new offence would be disproportionate when safeguards already exist. I take on board the intent and aims of the noble Lord, Lord Clement-Jones, but argue that there are already sufficient protections in current law and in the way in which the Bill is drafted to provide the reassurance that he seeks. Therefore, I hope that he feels comfortable in not pressing his amendment.
My Lords, I am confident that, somewhere, there is a moral philosopher and legal scholar who can explain why this amendment is not part of the next group on NUAR but, in the meantime, my amendment addresses a fundamental issue. It would ensure that strict security measures are in place before any individual or organisation is allowed access to the sensitive information held on the National Underground Asset Register. The NUAR is a crucial tool for managing the UK’s underground infrastructure. It holds critical data about pipelines, cables and other assets that underpin vital services such as water, energy, telecommunications and transport.
This information, while essential for managing and maintaining infrastructure, is also a potential target for misuse. As such, ensuring the security of this data is not just important but vital for the safety and security of our nation. The information contained in the NUAR is sensitive. Its misuse could have disastrous consequences. If this data were to fall into the wrong hands, whether through criminal activities, cyberattacks or terrorism, it could be exploited to disrupt or damage critical infrastructure. I know that the Government take these risks seriously but this amendment seeks to address them further by ensuring that only those with a legitimate need, who have been properly vetted and who have met specific security requirements can access this data. We must ensure that the people accessing this register are trusted individuals or organisations that understand the gravity of handling this sensitive information and are fully aware of the risks involved.
The amendment would ensure that we have a framework for security—one that demands that the Secretary of State introduces clear, enforceable regulations specifying the security measures that must be in place before anyone can access the NUAR. These measures may include: background checks to ensure that those seeking access are trustworthy and legitimate; cybersecurity safeguards to prevent unauthorised digital access or breaches; physical security measures to protect the infrastructure where this information is stored; and clear guidelines on who should be allowed access and the conditions under which they can view this sensitive data.
The potential threats posed by unsecured access to the NUAR cannot be overstated. Criminals could exploit this information to target and disrupt key infrastructure systems. Terrorist organisations could use it to plan attacks on essential services, endangering lives and causing mass disruption. The stakes are incredibly high; I am sure that I do not need to convince noble Lords of that. In an era where digital and physical infrastructure are increasingly interconnected, the risks associated with unsecured access to information of the kind held in the NUAR are growing every day. This amendment would address this concern head on by requiring that we implement safeguards that are both thorough and resilient to these evolving threats. Of course, the cyber Bill is coming, but I wonder whether we need something NUAR-specific and, if so, whether we need it in this Bill. I beg to move.
I thank the noble Viscount for raising the issue of the National Underground Asset Register’s cybersecurity. As he said, Amendment 55 seeks to require more detail on the security measures in the regulations that will be applied to the accessing of NUAR data.
The noble Viscount is right: it is absolutely fundamental that NUAR data is protected, for all the reasons he outlined. It hosts extremely sensitive data. It is, of course, supported by a suite of sophisticated security measures, which ensure that access to the data by the tightly prescribed set of users is proportionate. I hope that the noble Viscount understands that we do not necessarily want to spell out what all those security measures are at this point; he will know well enough the sorts of discussions and provisions that go on behind the scenes.
Security stakeholders, including the National Cyber Security Centre and the National Protective Security Authority, have been involved in NUAR’s development and are members of its security governance board, which is a specific governance board overseeing its protection. As I say, access to it occurs on a very tight basis. No one can just ask for access to the whole of the UK’s data on NUAR; it simply is not geared up to be operated in that way.
We are concerned that the blanket provision proposed in the amendment would lead to the publication of detailed security postures, exposing arrangements that are not public knowledge. It could also curtail the Government’s ability to adapt security measures when needed and, with support from security stakeholders, to accommodate changing circumstances—or, indeed, changing threats—that we become aware of. We absolutely understand why the noble Viscount wants that reassurance. I can assure him that it is absolutely the best security system we could possibly provide, and that it will be regularly scrutinised and updated; I really hope that the noble Viscount can take that assurance and withdraw his amendment.
I thank the Minister for that answer. Of course, I take the point that to publish the security arrangements is somehow to advertise them, but I am not yet altogether reassured. I wonder whether there is something that we can push further as part of a belt-and-braces approach to the NUAR security arrangements. We have talked about cybersecurity a lot this afternoon. All of these things tend to create additional incentives towards cyberattacks—if anything, NUAR does so the most.
If it helps a little, I would be very happy to write to the noble Viscount on this matter.
Yes, that would be great. I thank the Minister. I beg leave to withdraw my amendment.
My Lords, there is a great deal to be gained from digitising the registers of births, stillbirths and deaths. Not only does it reduce the number of physical documents that need to be maintained and kept secure but it means that people do not have to physically sign the register of births or deaths in the presence of a registrar. This will make people’s lives a great deal easier during those stressful periods of their lives.
However, digitising all this data—I am rather repeating arguments I made about NUAR and other things earlier—creates a much larger attack surface for people looking to steal personal data. This amendment explores how the Government will protect this data from malign actors. If the Minister could provide further detail on this, I would be most grateful.
This is a probing amendment and has been tabled in a constructive spirit. I know that we all want to harness the power of data and tech in this space and use it to benefit people’s lives but, particularly with this most personal of data, we have to take appropriate steps to keep it secure. Should there be a data breach, hackers would have access to an enormous quantity of personal data. Therefore, I suggest that, regardless of how much thought the Government have given this point up to now, the digitisation of these registers should not occur until substantial cybersecurity measures are in place. I look forward to the Minister’s comments.
On Amendment 57, legislation is already in place to ensure the security of electronic registers. Articles 25 and 32 of the UK General Data Protection Regulation impose duties on controllers of personal data to implement appropriate technical and organisational measures, including security measures, so this already applies.
The electronic system for births and deaths has been in place since 2009, and all events have been registered electronically since that date, in parallel with the paper registers and with no loss of data. What is happening with this legislation is that people will not have to keep paper records any more; it concerns the existing electronic system. The noble Lord will remember that it remains a matter for registrars even so, but I think that the idea is that they will no longer have to keep the paper registers as well, which everybody felt was an unnecessary administrative burden.
Nevertheless, the system is subject to Home Office security regulations, and robust measures are in place to protect the data. There has been no loss of data or hacking of that data up to now. Obviously, we need to make sure that the security is kept up to date, but we think that it is a pretty robust system. It is the paper documents that are losing out here.
I thank the Minister. I take the point that this has been ongoing for a while and that, in fact, the security is better because there is less reliance on the paper documents. That said, I am encouraged by her answer and encouraged that the Government continue to anticipate this growing risk and act accordingly. On that basis, I withdraw the amendment.
(1 month, 3 weeks ago)
Lords Chamber

We are acutely aware of this issue. We know that there is a live ongoing argument about it and we are talking to our colleagues across government to find a way through, but we have not come to a settled view yet.
My Lords, catfishing is, of course, one of the misuses of technology in respect of which AI is rapidly enhancing both the attack and the defence. Does the Minister agree that the most effective, adaptive and future-proof defence against catfishing is actually personal awareness and resilience? If so, can the Minister provide a bit more of an update on the progress made in implementing this crucial media literacy strategy, which will be such an important part of defending us all against these attacks in future?
Ofcom published its latest vision of the media literacy strategy just a couple of months ago, so its implementation is very much in its infancy. The Government very much support it and we will work with Ofcom very closely to roll it out. So Ofcom has a comprehensive media literacy strategy on these issues, but as we all know, schools have to play their part as well: it has to be part of the curriculum. We need to make sure that children are kept safe in that way.
The noble Viscount referred to AI. The rules we have—the Online Safety Act and so on—are tech-neutral in the sense that, even if an image is AI generated, it would still fall foul of that Act; it does not matter whether it is real or someone has created it. Also, action should be taken by the social media companies to take down those images.
(1 month, 3 weeks ago)
Grand Committee

My Lords, I thank the Minister for setting out this instrument so clearly. It certainly seems to make the necessary relatively simple adjustments to fill an important gap that has been identified. Although I have some questions, I will keep my remarks fairly brief.
I will reflect on the growing importance of both the Online Safety Act and the duty we have placed on Ofcom’s shoulders. The points made by the noble Lord, Lord Clement-Jones, about the long-standing consequential nature of the creation of Ofcom and the Communications Act were well made in this respect. The necessary complexity and scope of the work of Ofcom, as our online regulator, has far outgrown what I imagine was foreseeable at the time of its creation. We have given it the tasks of developing and enforcing safety standards, as well as issuing guidance and codes of practice that digital services must follow to comply with the Act. Its role includes risk assessment, compliance, monitoring and enforcement, which can of course include issuing fines or mandating changes to how services operate. Its regulatory powers now allow it to respond to emerging online risks, helping to ensure that user-protection measures keep pace with changes in the digital landscape.
In recognising the daily growing risk of online dangers and the consequent burdens on Ofcom, we of course support any measures that bring clarity and simplicity. If left unaddressed, the identified gap here clearly could lead to regulatory inefficiencies and delays in crucial processes that depend on accurate and up-to-date information. For example, setting appropriate fee thresholds for regulated entities requires detailed knowledge of platform compliance and associated risks, which would be challenging to achieve without full data access. During post-implementation reviews, a lack of access to necessary business information could hamper the ability to assess whether the Act is effectively achieving its safety objectives or whether adjustments are needed.
That said, I have some questions, and I hope that, when she rises, the Minister will set out the Government’s thinking on them. My first question very much picks up on the point made—much better than I did—by the noble Lord, Lord Stevenson of Balmacara. It is important to ensure that this instrument does not grant unrestricted access to business information but, rather, limits sharing to specific instances where it is genuinely necessary for the Secretary of State to fulfil their duties under the Act. How will the Government ensure this?
Secondly, safeguards, such as data protection laws and confidentiality obligations under the Communications Act 2003, must be in place to guarantee that any shared information is handled responsibly and securely. Do the Government believe that sufficient safeguards are already in place?
Thirdly, in an environment of rapid technology change, how do the Government plan to keep online safety regulation resilient and adaptive? I look forward to hearing the Government’s views on these questions, but, as I say, we completely welcome any measure that increases clarity and simplicity and makes it easier for Ofcom to be effective.
I thank noble Lords for their valuable contributions to this debate. It goes without saying that the Government are committed to the effective implementation of the Online Safety Act. It is critical that we remove any barriers to that, as we are doing with this statutory instrument.
As noble Lords said—the noble Viscount, Lord Camrose, stressed this—the Online Safety Act has taken on a growing significance in the breadth and depth of its reach. It is very much seen as an important vehicle for delivering the change that the whole of society wants now. It is important that we get this piece of legislation right. For that purpose, this statutory instrument will ensure that Ofcom can co-operate and share online safety information with the Secretary of State where it is appropriate to do so, as was intended during the Act’s development.
On specific questions, all three noble Lords who spoke asked whether the examples given were exclusive or whether there are other areas where powers might be given to the Secretary of State. The examples given are the two areas that are integral to implementation. We have not at this stage identified any further areas. The instrument allows sharing only for the purposes of fulfilling the Secretary of State’s functions under the Online Safety Act; it goes no broader than that. I think that answers the question asked by the noble Viscount, Lord Camrose, about whether this meant unlimited access—I assure him that that is not the purpose of this SI.
My noble friend Lord Stevenson asked whether this relates only to the powers under the OSA. Yes, the instrument allows Ofcom to share information it has collected from businesses only for the purposes of fulfilling the Secretary of State’s functions under the Act.
On the question of devolution, the powers of Scottish, Northern Ireland and Welsh Ministers primarily relate to the power to define the educational establishments for the purpose of Schedule 1 exemptions. There are also some consultation provisions where these Ministers must be consulted, but that is the limit of the powers that those Ministers would have.
I am conscious that I have not answered all the questions asked by the noble Viscount, Lord Camrose, because I could not write that quickly—but I assure him that my officials have made a note of them and, if I have not covered those issues, I will write to him.
I hope that noble Lords agree with me on the importance of implementing the Online Safety Act and ensuring that it can become fully operational as soon as possible. I commend these regulations to the Committee.
(1 month, 3 weeks ago)
Grand Committee

My Lords, I shall also start on a positive note and welcome the ongoing focus on online safety. We all aim to make this the safest country in the world in which to be online. The Online Safety Act is the cornerstone of how all of us will continue to pursue this crucial goal. The Act imposed clear legal responsibilities on social media platforms and tech companies, requiring them actively to monitor and manage the content they host. They are required swiftly to remove illegal content and to take proactive measures to prevent harmful material reaching minors. This reflects the deep commitment that we all share to safeguarding children from the dangers of cyberbullying, explicit content and other online threats.
We must also take particular account of the disproportionate harm that women and girls face online. The trends regarding the online abuse and exploitation that disproportionately affect female users are deeply concerning. Addressing these specific challenges is essential if we are to create a truly safe online environment for everyone.
With respect to the Government’s proposed approach to making sharing intimate images without consent a priority offence under the Online Safety Act, this initiative will require social media companies promptly to remove such content from their platforms. This aims to curb the rise in abuse that has been described as “intolerable”—I think rightly—by the Secretary of State. The intent behind this measure is to prevent generations becoming “desensitised” to the devastating effects of online abuse.
Although this appears to signal a strong stance against online harm, it raises the question of what this designation truly accomplishes in practical terms. I am grateful to the Minister for setting this out so clearly. I am not entirely sure that I altogether followed the differences between the old offences and the new ones. Sharing intimate images without consent is already illegal under current laws. Therefore, can we not say that the real issue lies not in the absence of legal provision but in the lack of effective enforcement of existing regulation? We have to ensure that any changes we make do not merely add layers of complexity but genuinely strengthen the protections available to victims and improve the responsiveness of platforms in removing harmful content.
With these thoughts in mind, I offer five questions. I apologise; the Minister is welcome to write as necessary, but I welcome her views whether now or in writing. First, why is it necessary to add the sharing of intimate images to the list of priority offences if such acts are already illegal under existing legislation and, specifically, what additional protections or outcomes are expected? The Minister gave some explanation of this, but I would welcome digging a little deeper into that.
Secondly, where consent is used as a defence against the charge of sharing intimate images, what are the Government’s thoughts on how to protect victims from intrusive cross-examination over details of their sexual history?
Thirdly, with respect to nudification technology, the previous Government argued that any photoreal image was covered by “intimate image abuse”—the noble Lord, Lord Clement-Jones, touched on this issue well. Is there any merit in looking at that again?
Fourthly, I am keen to hear the Government’s views on my noble friend Lady Owen’s Private Member’s Bill on nudification. We look forward to debating that in December.
Fifthly, and lastly, what role can or should parents and educators play in supporting the Act’s objectives? How will the Government engage these groups to promote online safety awareness?
My Lords, I thank noble Lords for their contributions to this debate. This is, as I think all noble Lords who have spoken recognise, a really important issue. It is important that we get this legislation right. We believe that updating the priority offences list with a new intimate image abuse offence is the correct, proportionate and evidence-led approach to tackle this type of content, and that it will provide stronger protections for online users. This update will bring us closer to achieving the commitment made in the Government’s manifesto to strengthening the protection for women and girls online.
I will try to cover all the questions asked. My noble friend Lord Stevenson and the noble Baroness, Lady Owen, asked whether we will review the Act and whether the Act is enough. Our immediate focus is on getting the Online Safety Act implemented quickly and effectively. It was designed to tackle illegal content and protect children; we want those protections in place as soon as possible. Having said that, it is right that the Government continually assess the law’s ability to keep up, especially when technology is moving so fast. We will of course look at how effective the protections are and build on the Online Safety Act, based on the evidence. However, our message to social media companies remains clear: “There is no need to wait. You can and should take immediate action to protect your users from these harms”.
The noble Baroness, Lady Owen, asked what further action we are taking against intimate image abuse and about the taking, rather than sharing, of intimate images. We are committed to tackling the threat of violence against women and girls in all forms. We are considering what further legislative measures may be needed to strengthen the law on taking intimate images without consent and image abuse. This matter is very much on the Government’s agenda at the moment; I hope that we will be able to report some progress to the noble Baroness soon.
The noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Owen, asked whether creating and making intimate image deepfakes will be an offence. The Government’s manifesto included a commitment to banning the creation of sexually explicit deepfakes. This is a priority for the Government. DSIT is working with the Home Office and the Ministry of Justice to identify the most appropriate legislative vehicle for ensuring that those who create these images without consent face the appropriate punishment. The Government are considering options in this space to protect women and girls from malicious uses of these technologies. The new sharing intimate images offence, which will be added to the OSA priority list through this SI, explicitly includes—for the first time—wholly synthetic manufactured images, such as deepfakes, so they will be tackled under the Online Safety Act.
The noble Baroness, Lady Owen, asked about the material that is already there and the ability to have a hash database to prevent those intimate images continually being circulated. We are aware that the technology exists. Strengthening the intimate image abuse priorities under the Act is a necessary first step to tackling this, but we expect Ofcom to consider this in its final draft illegal content codes and guidance and to give more information about both the codes of practice and the further measures that would need to be developed to address this issue.
Several noble Lords—the noble Viscount, Lord Camrose, the noble Lord, Lord Clement-Jones, and my noble friend Lord Stevenson—asked for more details on the new offences. As I tried to set out in my opening statement, the Online Safety Act repeals the offence of disclosing private sexual photographs and films with the intent to cause distress—this comes under Section 33 of the Criminal Justice and Courts Act 2015 and is commonly known as the revenge porn offence—and replaces it with four new offences.
First, there is a base offence of sharing an intimate image without consent, which carries a maximum penalty of six months’ imprisonment. Secondly, there are two specific-intent offences—the first is sharing an intimate image with intent to cause alarm, humiliation or distress; the second is sharing an intimate image for the purpose of obtaining sexual gratification—each of which carries a maximum penalty of two years’ imprisonment to reflect the more serious culpability of someone who acts without consent and with an additional malign intent. Lastly, there is an offence of threatening to share an intimate image, with a maximum penalty of two years’ imprisonment. This offence applies regardless of whether the image is shared.
These offences capture images that show, or appear to show, a person who is nude, partially nude, engaged in toileting or doing something sexual. These offences include the sharing of manufactured or manipulated images, which are referred to as deepfakes. This recognises that sharing intimate images without the consent of the person they show or appear to show is sufficiently wrongful or harmful to warrant criminalisation.
The noble Viscount, Lord Camrose, asked what is so different about these new offences compared to those in the Act. I stress that it is because they are being given priority status, which does not sound like much but gives considerable extra powers under the Act. There will be new powers and new obligations on platforms. The key thing is that all those offences that already exist are being given priority status under the Online Safety Act. There are thousands of things that Ofcom could address, but this is now in the much smaller list of things that will place very specific obligations on the platforms. Ofcom will monitor this and, as I said earlier, companies can be fined huge sums of money if they do not act, so there is a huge obligation on them to follow through on the priority list.
I hope that I have answered all the questions and that noble Lords agree with me on the importance of updating the priority offences in the Online Safety Act. The noble Viscount, Lord Camrose, asked about parents and made an important point. This is not just about an Act; it is about everybody highlighting the fact that these activities are intolerable and offensive not just to the individuals concerned but to everybody in society. Parents have a responsibility, as we all do, to ensure that media literacy is at the heart of the education we carry out formally in schools and informally within the home. The noble Viscount is absolutely right on that, and there is more that we could all do. I commend these regulations to the Committee.
(2 months, 1 week ago)
Lords Chamber

The noble Lord raises an important point. Where nudification apps and other material do not come under the remit of the Online Safety Act, we will look at other legislative tools to make sure that all new forms of technology—including AI and its implications for online images—are included in robust legislation, in whatever form it takes. Our priority is to implement the Online Safety Act, but we are also looking at what other tools might be necessary going forward. As the Secretary of State has said, this is an iterative process; the Online Safety Act is not the end of the game. We are looking at what further steps we need to take, and I hope the noble Lord will bear with us.
What is the Government’s assessment of the technical difficulties behind requiring pornography sites and others to implement age-verification services?
(4 months, 3 weeks ago)
Lords Chamber

The noble Lord is right that there are issues around the risks he has spelled out. There are still problems with the accuracy of some AI systems. We are determined to push forward to protect people from those risks, while recognising the enormous benefits that come from introducing AI. The noble Lord will know, I am sure, that it has a number of positive benefits in areas such as the health service, diagnosing patients more quickly—for example, AI can detect up to 13% more breast cancers than humans can. So there are huge advantages, but we must make sure that whatever systems are in place are properly regulated and that the risks are factored into that. Again, that will be an issue we will debate in more detail when the draft legislation comes before us.
My Lords, let me start by warmly welcoming the Minister to her new, richly deserved Front-Bench post. I know that she will find the job fascinating. I suspect she will find it rather demanding as well, but I look forward to working with her.
I have noted with great interest the Government’s argument that more AI-specific regulation will encourage more investment in AI in the country. That would be most welcome, but what do the Government make of the enormous difference between AI investment to date in the UK versus in the countries of the European Union subject to the AI Act? In the same vein, what do the Government make of Meta’s announcement last week that it is pausing some of its AI training activities because of the cumbersome and not always very clear regulation that is part of the AI Act?
Again, I thank the noble Viscount for his good wishes and welcome him to his new role. He is right to raise the comparison and, while the EU has introduced comprehensive legislation, we instead want to bring forward highly targeted legislation that focuses on the safety risks posed by the most powerful models. We are of course committed to working closely with the EU on AI and we believe that co-ordinating with international partners—the EU, the US and other global allies—is critical to making sure that these measures are effective.