Grand Committee
I start by thanking all noble Lords who spoke for their comments and fascinating contributions. We on these Benches share the concern of many noble Lords about the Bill allowing the use of data for research purposes, especially scientific research purposes.
Amendment 59 has, to my mind, the entirely right and important intention of preventing misuse of the scientific research exemption for data reuse by ensuring that the only purpose for which the reuse is permissible is scientific research. Clearly, there is merit in this idea, and I look forward to hearing the Minister give it due consideration.
However, there are two problems with the concept and definition of scientific research in the Bill overall, and, again, I very much look forward to hearing the Government’s view. First, I echo the important points raised by my noble friend Lord Markham. Almost nothing in research or, frankly, life more broadly, is done with only one intention. Even the most high-minded, curiosity-driven researcher will have at the back of their mind the possibility of commercialisation. Alongside protecting ourselves from the cynical misuse of science as a cover story for commercial pursuit, we have to be equally wary of creating law that pushes for the complete absence of the profit motive in research, because to the extent that we succeed in doing that, we will see less research. Secondly—the noble Viscount, Lord Colville, and the noble Lord, Lord Clement-Jones, made this point very powerfully—I am concerned that the broad definition of scientific research in the Bill might muddy the waters further. I worry that, if the terminology itself is not tightened, restricting the exemption might serve little purpose.
On Amendment 62, to which I have put my name, the same arguments very much apply. I accept that it is very challenging to find a form of words that both encourages research and innovation and does not do so at the expense of data protection. Again, I look forward to hearing the Government’s view. I am also pleased to have signed Amendment 63, which seeks to ensure that personal data can be reused only if doing so is in the public interest. Having listened carefully to some of the arguments, I feel that the public interest test may be more fertile ground than a kind of research motivation purity test to achieve that very difficult balance.
On Amendment 64, I share the curiosity to hear how the Minister defines research and statistical processes —again, not easy but I look forward to her response.
Amendment 65 aims to ensure that research seeking to use the scientific research exemption from obtaining consent meets the minimum levels of scientific rigour. The aim of the amendment is, needless to say, excellent. We should seek to avoid creating opportunities which would allow companies—especially but not uniquely AI labs—to cloak their commercial research as scientific, thus reducing the hoops they must jump through to reuse data in their research without explicit consent. However, Amendment 66, tabled in my name, which inserts the words:
“Research considered scientific research that is carried out as a commercial activity must be subject to the approval of an independent ethics committee”,
may be a more adaptive solution.
Many of these amendments show that we are all quite aligned in what we want but that it is really challenging to codify that in writing. Therefore, the use of an ethics committee to conduct these judgments may be the more agile, adaptive solution.
I confess that I am not sure I have fully understood the mechanism behind Amendments 68 and 69, but I of course look forward to the Minister’s response. I understand that they would essentially mean consent by failing to opt out. If so, I am not sure I could get behind that.
Amendment 130 would prevent the processing of personal data for research, archiving and statistical purposes if it permits the identification of a living individual. This is a sensible precaution. It would prevent the sharing of unnecessary or irrelevant information and protect people’s privacy in the event of a data breach.
Amendment 132 appears to uphold existing patient consent for the use of their data for research, archiving and statistical purposes. I just wonder whether this is necessary. Is that not already the case?
Finally, I turn to the Clause 85 stand part notice. I listened carefully to the noble Lord, Lord Clement-Jones, but I am not, I am afraid, at a point where I can support this. There need to be safeguards on the use of data for this purpose; I feel that Clause 85 is our way of having them.
My Lords, it is a great pleasure to be here this afternoon. I look forward to what I am sure will be some excellent debates.
We have a number of debates on scientific research; it is just the way the groupings have fallen. This is just one of several groupings that will, in different ways and from different directions, probe some of these issues. I look forward to drilling down into all the implications of scientific research in the round. I should say at the beginning—the noble Lord, Lord Markham, is absolutely right about this—that we have a fantastic history of and reputation for doing R&D and scientific research in this country. We are hugely respected throughout the world. We must be careful that we do not somehow begin to demonise some of those people by casting aspersions on a lot of the very good research that is taking place.
A number of noble Lords said that they are struggling to know what the definition of “scientific research” is. A lot of scientific research is curiosity driven; it does not necessarily have an obvious outcome. People start a piece of research, either in a university or on a commercial basis, and they do not quite know where it will lead them. Then—it may be 10 or 20 years later—we begin to realise that the outcome of their research has more applications than we had ever considered in the past. That is the wonderful thing about human knowledge: as we build and we learn, we find new applications for it. So I hope that whatever we decide and agree on in this Bill does not put a dampener on that great aspect of human knowledge and the drive for further exploration, which we have seen in the UK in life sciences in particular but also in other areas such as space exploration and quantum. Noble Lords could probably identify many more areas where we are increasingly getting a reputation for being at the global forefront of this thinking. We have to take the public with us, of course, and get the balance right, but I hope we do not lose sight of the prize we could have if we get the regulations and legislation right.
Let me turn to the specifics that have been raised today. Amendments 59 and 62 to 65 relate to scientific provisions, and the noble Lord, Lord Clement-Jones, the noble Viscount, Lord Colville, and others have commented on them. I should make it clear that this Bill is not expanding the meaning of “scientific research”. If anything, it is restricting it, because the reasonableness test that has been added to the legislation—along with clarification of the requirement for research to have a lawful basis—will constrain the misuse of the existing definition. The definition is tighter, and we have attempted to do that in order to make sure that some of the new developments and technologies coming on stream will fall clearly within the constraints we are putting forward in the Bill today.
Amendments 59 and 62 seek to prevent misuse of the exceptions for data reuse. I assure the noble Viscount, Lord Colville, that the existing provisions for research purposes already prevent the controller taking advantage of them for any other purpose they may have in mind. That is controlled.
My Lords, I have to admit that I am slightly confused by the groupings at this point. It is very easy to have this debate in the medical space, to talk about the future of disease, fixing diseases and longevity, but my rather mundane questions have now gone unanswered twice. Perhaps the Minister will write to me about where the Government see scientific research on product development in some of these other spaces.
We will come back to the question of scraping and intellectual copyright, but I want to add my support to my noble friend Lord Freyberg’s amendment. I also want to add my voice to the question of the AI Bill that is coming. Data is fundamental to the AI infrastructure; data is infrastructure. I do not understand how we can have a data Bill that does not have one eye on AI, looking towards it, or how we are supposed to understand the intersection between the AI Bill and the data Bill if the Government are not more forthcoming about their intentions. At the moment, we are seeing a reduction in data protection that looks as though it is anticipating, or creating a runway for, certain sorts of companies.
Finally, I am sorry that the noble Lord is no longer in his place, but later amendments look at creating sovereign data assets around the NHS and so on, and I do not think that those of us who are arguing to make sure that it is not a free-for-all are unwilling to create, or are not interested in creating, ways in which the huge investment in the NHS and other datasets can be realised for UK plc. I do not want that to appear to be where we are starting just because we are unhappy about the roadway that Clause 67 appears to create.
Many thanks to the noble Lords who have spoken in this debate and to the noble Lord, Lord Freyberg, for his Amendment 60. Before I start, let me endorse and add my name to the request for something of a briefing about the AI Bill. I am concerned that we will put a lot of weight of expectation on that Bill. When it comes, if I understand this right, it will focus on the very largest AI labs and may not necessarily get to all the risks that we are talking about here.
Amendment 60 seeks to ensure that the Bill does not allow privately funded or commercial activities to be considered scientific research in order
“to avert the possibility that such ventures might benefit from exemptions in copyright law relating to data mining”.
This is a sensible, proportionate measure to achieve an important end, but I have some concerns about the underlying assumption, as it strikes me. There is a filtering criterion of whether or not the research is taxpayer funded; that feels like a slightly crude means of predicting the propensity to infringe copyright. I do not know where to take that so I shall leave it there for the moment.
Amendment 61 in my name would ensure that data companies cannot justify data scraping for AI training as scientific research. As many of us said in our debate on the previous group, as well as in our debate on this group, the definition of “scientific research” in the Bill is extremely broad. I very much take on board the Minister’s helpful response on that but, I must say, I continue to have some concerns about the breadth of the definition. The development of AI programs, funded privately and as part of a commercial enterprise, could be considered scientific, so I believe that this definition is far too broad, given that Article 8A(3), to be inserted by Clause 71(5), states:
“Processing of personal data for a new purpose is to be treated as processing in a manner compatible with the original purpose where … the processing is carried out … for the purposes of scientific research”.
Tightening up the definition of “scientific research” to exclude activities that are primarily commercial would prevent companies from creating a scientific pretence for research that is wholly driven by commercial gain rather than furthering our collective knowledge. I would argue that, if we wish to allow these companies to build and train AI—we must, or others will—we must put in proper safeguards for people’s data. Data subjects should have the right to consent to their data being used in such a manner.
Amendment 65A in the name of my noble friend Lord Holmes would also take steps to remedy this concern. I believe that this amendment would work well in tandem with Amendment 61. It makes it absolutely clear that we expect AI developers to obtain consent from data subjects before they use or reuse their data for training purposes. For now, though, I shall not press my amendment.
My Lords, I share the confusion of the noble Baroness, Lady Kidron, about the groupings. If we are not careful, we are going to keep returning to this issue again and again over four or five groups.
With the possible exception of the noble Lord, Lord Lucas, I think that we are all very much on the same page here. On the suggestion from the noble Viscount, Lord Colville, that we meet to discuss the precise issue of the definition of “scientific research”, this would be extremely helpful; the noble Baroness and I do not need to repeat the concerns.
I should declare an interest in two respects: first, my interests as regards AI, which are set out on the register; and, secondly—I very much took account of what the noble Viscount, Lord Camrose, and the noble Lord, Lord Markham, had to say—I chair the council of a university that has a strong health faculty. It does a great deal of health research and a lot of that research relies on NHS datasets.
This is not some sort of Luddism we are displaying here. This is caution about the expansion of the definition of scientific research, so that it does not turn into something else: that it does not deprive copyright holders of compensation, and that it does not allow personal data to be scraped off the internet without consent. There are very legitimate issues being addressed here, despite the fact that many of us believe that this valuable data should of course be used for the public benefit.
One of the key themes—this is perhaps where we come back on to the same page as the noble Lord, Lord Lucas—may be public benefit, which we need to reintroduce so that we really understand that scientific research for public benefit is the purpose we want this data used for.
I do not think I need to say much more: this issue is already permeating our discussions. It is interesting that we did not get on to it in a major way during the DPDI Bill, yet this time we have focused much more heavily on it. Clearly, in opposition, the noble Viscount has seen the light. What is not to like about that? Further discussion, not least of the amendment of the noble Baroness, Lady Kidron, further down the track will be extremely useful.
My Lords, Amendments 66, 67 and 80 in this group are all tabled in my name. Amendment 66 requires scientific research carried out for commercial purposes to
“be subject to the approval of an independent ethics committee”.
Commercial research is, perhaps counterintuitively, generally subjected to fewer ethical safeguards than research carried out purely for scientific endeavour by educational institutions. Given the current broad definition of scientific research in the Bill—I am sorry to repeat this—which includes research for commercial purposes, and the lower bar for obtaining consent for data reuse should the research be considered scientific, I think it would be fair to require more substantial ethical safeguards on such activities.
We do not want to create a scenario where unscrupulous tech developers use the Bill to harvest significant quantities of personal data under the guise of scientific endeavour to develop their products, without having to obtain consent from data subjects or even without them knowing. An independent ethics committee would be an excellent way to monitor scientific research that would be part of commercial activities, without capping data access for scientific research, which aims more purely to expand the horizon of our knowledge and benefit society. Let us be clear: commercial research makes a huge and critically important contribution to scientific research, but it is also surely fair to subject it to the same safeguards and scrutiny required of non-commercial scientific research.
Amendment 67 would ensure that data controllers cannot gain consent for research purposes that cannot be defined at the time of data collection. As the Bill stands, consent will be considered obtained for the purposes of scientific research if, at the time consent is sought, it is not possible to identify fully the purposes for which the personal data is to be processed. I fully understand that there needs to be some scope to take advantage of research opportunities that are not always foreseeable at the start of studies, particularly multi-year longitudinal studies, but which emerge as such studies continue. I am concerned, however, that the current provisions are a little too broad. In other words: is consent not actually being given at the start of the process for, effectively, any future purpose?
Amendment 80 would prevent the data reuse test being automatically passed if the reuse is for scientific purposes. Again, I have tabled this amendment due to my concerns that research which is part of commercial activities could be artificially classed as scientific, and that other clauses in the Bill would therefore allow too broad a scope for data harvesting. I beg to move.
My Lords, it seems very strange indeed that Amendment 66 is in a different group from group 1, which we have already discussed. Of course, I support Amendment 66 from the noble Viscount, Lord Camrose, but in response to my suggestion for a similar ethical threshold, the Minister said she was concerned that scientific research would find this to be too bureaucratic a hurdle. She and many of us here sat through debates on the Online Safety Bill, now an Act. I was also on the Communications Committee when it looked at digital regulations and came forward with one of the original reports on this. The dynamic and impetus which drove us to worry about this was the lack of ethics within the tech companies and social media. Why on earth would we want to unleash some of the most powerful companies in the world on reusing people’s data for scientific purposes if we were not going to have an ethical threshold involved in such an Act? It is important that we consider that extremely seriously.
I am not quite sure about the groupings, either, but let us go with what we have. I thank noble Lords who have spoken, and the noble Viscount, Lord Camrose, for his amendments. I hope I am able to provide some reassurance for him on the points he raised.
As I said when considering the previous group, the Bill does not expand the definition of scientific research. The reasonableness test, along with clarifying the requirement for researchers to have a lawful basis, will significantly reduce the misuse of the existing definition. The amendment seeks to reduce the potential for misuse of the definition of scientific research by commercial companies using AI, by requiring researchers working for a commercial company to submit their research to an ethics committee. As I said on the previous group, making it a mandatory requirement for all research may impede studies in areas that might have their own bespoke ethical procedures. This may well be the case in a whole range of different research areas, particularly in the university sector, and in sectors more widely. Some of this research may be very small to begin with but might grow in size. The idea that a small piece of start-up research has to be cleared by an ethics committee at an early stage is expecting too much and will put off a lot of the new innovations that might otherwise come forward.
Amendment 80 relates to Clause 71 and the reuse of personal data. This would put at risk valuable research that relies on data originally generated from diverse contexts, since the difference between the purposes may not always be compatible.
Turning to Amendment 67, I can reassure noble Lords that the concept of broad consent is not new. Clause 68 reproduces the text from the current UK GDPR recitals because the precise definition of scientific research may become clear only during later analysis of the data. Obtaining broad consent for an area of research from the outset allows scientists to focus on potentially life-saving research. Clause 68 has important limitations. It cannot be used if the researcher already knows the specific purpose—an important safeguard that should not be removed. It also includes a requirement to give the data subject the choice to consent to only part of the research processing, if possible. Most importantly, the data subject can revoke their consent at any point. I hope this reassures the noble Viscount, Lord Camrose, and he feels content to withdraw his amendment on this basis.
I thank the noble Viscount, Lord Colville, and the noble Lord, Lord Clement-Jones, for their remarks and support, and the Minister for her helpful response. Just over 70% of scientific research in the UK is privately funded, 28% is taxpayer funded and around 1% comes through the charity sector. Perhaps the two most consequential scientific breakthroughs of the last five years, Covid vaccines and large language models, have come principally from private funding.
My Lords, I support these amendments in the names of the noble Lords, Lord Stevenson and Lord Clement-Jones. It is a pleasure to follow the second ex-Health Minister this afternoon. In many ways, the arguments are just the same for health data as they are for all data. It is just that, understandably, it is at the sharpest end of this debate. Probably the most important point for everybody to realise, although it is espoused so often, is that there is no such thing as NHS data. It is a collection of the data of every citizen in this country, and it matters. Public trust matters significantly for all data but for health data in particular, because it goes so close to our identity—our very being.
Yet we know how to do public trust in this country. We know how to engage and have had significant success in public engagement decades ago. What we could do now with human-led technology-supported public engagement could be on such a positive and transformational scale. But, so far, there has been so little on this front. Let us not talk of NHS data; let us always come back to the fundamental principle encapsulated in this group of amendments and across so many of our discussions on the Bill. Does the Minister agree that it is about not NHS data but our data—our decisions—and, through that, if we get it right, our human-led digital futures?
Many thanks to all noble Lords who have proposed and supported these amendments. I will speak to just a few of them.
Amendment 70 looks to mitigate the lowering of the consent threshold for scientific research. As I have set out on previous groups, I too have concerns about that consent threshold. However, for me the issue is more with the definition of scientific research than with the consent threshold, so I am not yet confident that the amendment is the right way to achieve those desirable aims.
Amendment 71 would require that no NHS personal data can be made available for scientific research without the explicit consent of the patient. I thank the noble Lords, Lord Stevenson of Balmacara and Lord Clement-Jones, for raising this because it is such an important matter. While we will discuss this under other groupings, as the noble Baroness, Lady Kidron, points out, it is such an important thing and we need to get it right.
I regret to advise my noble friend Lord Holmes that I was going to start my next sentence with the words “Our NHS data”, but I will not. The data previously referred to is a very significant and globally unique national asset, comprising many decades of population-wide, cradle-to-grave medical data. No equivalent at anything like the same scale or richness exists anywhere, which makes it incredibly valuable. I thank my noble friend Lord Kamall for stressing this point with, as ever, the help of Jimi Hendrix.
However, that data is valuable only to the extent that it can be safely exploited for research and development purposes. The data can collectively help us develop new medicines or improve the administration and productivity of the NHS, but we need to allow it to do so properly. I am concerned that this amendment, if enacted, would create too high an operational and administrative barrier to the safe exploitation of this data. I have no interest in compromising on the safety, but we have to find a more efficient and effective way of doing it.
Amendments 79, 81 and 131 all look to clarify that the definition of consent to be used is in line with the definition in Article 4(11) of the UK GDPR:
“‘consent’ of the data subject means any freely given, specific, informed and unambiguous indication of the data subject’s wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her”.
This amendment would continue the use of a definition that is well understood. However, paragraph 3(a) of new Article 8A appears sufficient, in that the purpose for which a data subject consents is “specified, explicit and legitimate”.
Finally, with respect to Clause 77 stand part, I take the point and believe that we will be spending a lot of time on these matters going forward. But, on balance and for the time being, I feel that this clause needs to remain, as there must be clear rules on what information should be provided to data subjects. We should leave it in for now, although we will no doubt be looking to polish it considerably.
Lord Cameron of Lochiel (Con)
My Lords, I want to ask the Minister and the noble Lord, Lord Clement-Jones, in very general terms for their views on retrospectivity. Do they believe that the changes to data protection law in the Bill are intended to be applied to data already held at this time or will the new regime apply only to personal data collected going forwards from this point? I ask that specifically of data pertaining to children, from whom sensitive data has already been collected. Will the forthcoming changes to data protection law apply to such data that controllers and processors already hold, or will it apply only to data held going forward?
I thank in particular the noble Lord, Lord Clement-Jones, who has clearly had his Weetabix this morning. I will comment on some of the many amendments tabled.
On Amendments 73, 75, 76, 77, 83 and 90, I agree it is concerning that the Secretary of State can amend such important legislation via secondary legislation. However, these amendments are subject to the affirmative procedure and, therefore, to parliamentary scrutiny. Since the DPDI Bill proposed the same, I have not changed my views; I remain content that this is the right level of oversight and that these changes do not need to be made via primary legislation.
As for Amendment 74, preventing personal health data from being considered a legitimate interest seems wise. It is best to err on the side of caution when it comes to sharing personal health data.
Amendment 77 poses an interesting suggestion, allowing businesses affiliated by contract to be treated in the same way as large businesses that handle data from multiple companies in a group. This would certainly be beneficial for SMEs collaborating on a larger project. However, each such business may have different data protection structures and terms of use. Therefore, while this idea certainly has merit, I am a little concerned that it may benefit from some refining to ensure that the data flows between businesses in a way to which the data subject has consented.
On Amendment 78A and Schedule 4 standing part, there are many good, legitimate interest reasons why data must be quickly shared and processed, many of which are set out in Schedule 4: for example, national security, emergencies, crimes and safeguarding. This schedule should therefore be included in the Bill to set out the details on these important areas of legitimate interest processing. Amendment 84 feels rather like the central theme of all our deliberations thus far today, so I will listen with great interest, as ever, to the Minister’s response.
I have some concerns about Amendment 85, especially the use of the word “publicly”. The information that may be processed for the purposes of safeguarding vulnerable individuals is likely to be deeply sensitive and should not be publicly available. Following on from this point, I am curious to hear the Minister’s response to Amendment 86. It certainly seems logical that provisions should be in place so that individuals can regain control of their personal data should the reason for their vulnerability be resolved. As for the remaining stand part notices in this group, I do not feel that these schedules should be removed because they set out important detail on which we will come to rely.
My Lords, I think we sometimes forget, because the results are often so spectacular, the hard work that has had to happen over the years to get us to where we are, particularly in relation to the Online Safety Act. It is well exemplified by the previous speaker. He put his finger on the right spot in saying that we all owe considerable respect for the work of the noble Baroness, Lady Kidron, and others. I helped a little along the way. It is extraordinary to feel that so much of this could be washed away if the Bill goes forward in its present form. I give notice that I intend to work with my colleagues on this issue because this Bill is in serious need of revision. These amendments are part of that and may need to be amplified in later stages.
I managed to sign only two of the amendments in this group. I am sorry that I did not sign the others, because they are also important. I apologise to the noble Lord, Lord Clement-Jones, for not spotting them early enough to be able to do so. I will speak to the ones I have signed, Amendments 88 and 135. I hope that the Minister will give us some hope that we will be able to see some movement on this.
The noble Lord, Lord Russell, mentioned the way in which the wording on page 113 seems not only to miss the point but to devalue the possibility of seeing protections for children well placed in the legislation. New Clause 120B(e), which talks of
“the fact that children may be less aware of the risks and consequences associated with processing of personal data and of their rights in relation to such processing”,
almost says it all for me. I do not understand how that could possibly have got through the process by which this came forward, but it seems to speak to a lack of communication between parts of government that I hoped this new Government, with their energy, would have been able to overcome. It speaks to the fact that we need to keep an eye on both sides of the equation: what is happening in the online safety world and how data that is under the control of others, not necessarily those same companies, will be processed in support or otherwise of those who might wish to behave in an improper or illegal way towards children.
At the very least, what is in these amendments needs to be brought into the Bill. In fact, other additions may need to be made. I shall certainly keep my eye on it.
My Lords, I thank the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, for bringing forward amendments in what is a profoundly important group. For all that data is a cornerstone of innovation and development, as we have often argued in this Committee, we cannot lose sight of our responsibility to safeguard the rights and welfare of our children.
I start by speaking to two amendments tabled in my name.
Amendment 91 seeks to change
“the definition of request by data subjects to data controllers”
that can be declined or
“for which a fee can be charged from ‘manifestly unfounded or excessive’ to ‘vexatious or excessive’”.
I am sure that many of us will remember, without a great deal of fondness, our debates on these terms in the DPDI Bill. When we debated this issue at that time, it was, rather to my regret, often presented as a way to reduce protections and make it easier to decline or charge a fee for a subject access request. In fact, the purpose was to try to filter out cynical or time-wasting requests, such as attempts to bypass legal due process or to bombard organisations with vast quantities of essentially meaningless access requests. Such requests are not unfounded but they are harmful; by reducing them, we would give organisations more time and capacity to respond to well-founded requests. I realise that I am probably on a loser on this one but let me encourage noble Lords one last time to reconsider their objections and take a walk on the vexatious side.
Amendment 97 would ensure that
“AI companies who process data not directly obtained from data subjects are required to provide information to data subjects where possible. Without this amendment, data subjects may not know their data is being held”.
If a subject does not even know that their data is being held, they cannot enforce their data rights.
Amendment 99 follows on from that point, seeking to ensure that AI companies using large datasets cannot avoid providing information to data subjects on the basis that their datasets are too large. Again, if a subject does not know that their data is being held, they cannot enforce their rights. Therefore, it is really important that companies cannot avoid telling individuals about their personal data and the way in which it is being used because of sheer weight of information. These organisations are specialists in such processing of huge volumes of data, of course, so I struggle to accept that this would be too technically demanding for them.
Let me make just a few comments on other amendments tabled by noble Lords. Under Amendment 107, the Secretary of State would have
“to publish guidance within six months of the Act’s passing to clarify what constitutes ‘reasonable and proportionate’ in protection of personal data”.
I feel that this information should be published at the same time as this Bill comes into effect. It serves no purpose to have six months of uncertainty.
I do not believe that Amendment 125 is necessary. The degree to which the Government wish to align—or not—with the EU is surely a matter for the Government and their priorities.
Finally, I was struck by the interesting point that the noble and learned Lord, Lord Thomas, made when he deplored the Bill’s incomprehensibility. I have extremely high levels of personal sympathy with that view. To me, the Bill is the source code. There is a challenge in making it comprehensible and communicating it in a much more accessible way once it goes live. Perhaps the Minister can give some thought to how that implementation phase could include strong elements of communication. While that does not make the Bill any easier to understand for us, it might help the public at large.
My Lords, the problem is that I have a 10-minute speech and there are five minutes left before Hansard leaves us, so is it sensible to draw stumps at this point? I have not counted how many amendments I have, but I also wish to speak to the amendment by the noble and learned Lord, Lord Thomas. I would have thought it sensible to break at this point.
Grand Committee
My Lords, I start by reflecting on the strangeness of the situation—to me, anyway. Here we all are again, in slightly different seats but with a largely similar Bill. As I said at Second Reading, we welcome this important Bill; it is absolutely crucial to get our data economy right. We have a number of amendments to the Bill, a great many of which are probing. The overall theme of our amendments is how to make the Bill maximally effective at the important job that it sets out to do.
The terminology of data law is well understood. Lawmakers, lawyers, businesses and data subjects are all to some extent familiar with the terminology. A “controller” means
“the natural or legal person, public authority, agency or other body which, alone or jointly with others, determines the purposes and means of the processing of personal data”.
A “processor” means
“a natural or legal person, public authority, agency or other body which processes personal data on behalf of the controller”.
We are all familiar with those terms.
In this Bill, new terms are introduced, named “data holder” and “trader”. A data holder, in relation to customer data or business data of a trader, is the trader, or
“a person who, in the course of a business, processes the data”.
How is that materially different from a processor? A trader is described as a person who supplies or provides
“goods, services or digital content”
in the course of business, whether personally, through someone acting in the trader’s name, or on the trader’s behalf. Again, I ask how that is different from a controller.
While I grant that this may seem a very small point in a very large Bill, already data regulations are relatively poorly understood and difficult to follow. Therefore, surely there is no real need to make them more complex by introducing overlapping terms just for this one section of the Bill. As I explained in our explanatory note, this is a probing amendment, and I hope the Minister will be able to explain why these terms are materially different from the existing terms, why they are necessary and so on. If so, I would of course be happy to withdraw my amendment. I beg to move.
Just to follow on from that, I very much support my noble friend’s words. The only reason I can see why you would introduce new definitions is that there are new responsibilities that are different, and you would want people to be aware of the new rules that have been placed on them. I will be interested to hear the Minister’s answer. If that is the case, we can set that out and understand whether the differences are so big that you need a whole new category, as my noble friend said.
Having run lots of small businesses myself, I am aware that, with every new definition that you add, you add a whole new set of rules and complications. As a business owner, how am I going to find out what applies to me and how I am to be responsible? The terms trader, controller, data holder and processor all sound fairly similar, so how will I understand what applies to me and what does not? To the other point that my noble friend made, the more confusing it gets, the less likelihood there is that people will understand the process.
First, let me say what a pleasure it is to be back on this old ground again, although with slightly different functions this time round. I very much support what the noble Viscount, Lord Camrose, said. We want to get the wording of this Bill right and to have a robust Bill; that is absolutely in our interests. We are on the same territory here. I thank the noble Viscount and other noble Lords for expressing their interest.
On Amendments 1 and 2, the Government consider the terms used in Part 1, as outlined in Clause 1, necessary to frame the persons and the data to which a scheme will apply. The noble Lord, Lord Clement-Jones, mentioned the powers. I assure him that the powers in Part 1 sit on top of the Data Protection Act. They are not there instead of it; they are another layer on top of it, and they provide additional rights over and above what already exists.
In relation to the specific questions from the noble Viscount, Lord Camrose, and the noble Lord, Lord Markham, smart data schemes require suppliers or providers of goods, services or digital content to provide data. They are referred to as “traders” in accordance with recent consumer legislation, including the Consumer Rights Act 2015. The term “data holder” ensures that the requirements may also be imposed on any third party that might hold the data on the trader’s behalf. That is why these additional terminologies have been included: it is based on existing good legislation. I hope noble Lords will recognise why this is necessary and that this explains the rationale for these terms. These terms are independent of terms in data protection legislation; they have a different scope and that is why separate terms are necessary. I hope that, on that basis, the noble Viscount will withdraw his amendment.
I thank the Minister for that explanation. I see the point she makes that, in existing legislation, these terms are used. I wonder whether there is anything we can do better to explain the terms. There seems to be significant overlap between processors, holders, owners and traders. The more we can do to clarify absolutely, with great rigour, what those terms mean, the more we will bring clarity and simplicity to this necessarily complex body of law.
I thank the Minister for explaining the rationale. I am satisfied that, although it may not be the most elegant outcome, for the time being, in the absence of a change to the 2015 Act that she references, we will probably have to grin and bear it. I beg leave to withdraw the amendment.
My Lords, Amendments 3, 4 and 20 seek to probe the Government’s position on the roles of the Secretary of State and the Treasury. Amendment 6 seeks to probe whether the Treasury or the Secretary of State shall have precedence when making regulations under this Bill.
Clarity over decision-making powers is critical to good governance, in particular over who has final decision rights and in what circumstances. Throughout Part 1 of the Bill, the Secretary of State and the Treasury are both given regulation-making powers, often on the same matter. Our concern is that having two separate Ministers and two departments responsible for making the same regulations is likely to cause problems. What happens if and when the departments have a difference of opinion on what these regulations should contain or achieve? Who is the senior partner in the relationship? When it comes to putting statute on paper, who has the final say, the Secretary of State or the Treasury?
All the amendments are probing and, at this point, simply seek greater clarification from the Government. If the Minister can explain why two departments are jointly responsible for the same regulations, why this is necessary and a good idea, and what provisions will be in place to avoid legislative confusion, I will be happy not to press the amendments.
The amendments in group 2 cover smart data and relate to the Secretary of State and the Treasury. Apart from the financial services sector clauses, most of the powers in Part 1, as well as the statutory spending authority in Clause 13, are conferred on the Secretary of State and the Treasury. That is the point that the noble Viscount made. These allow the relevant government departments to make smart data regulations. Powers are conferred on the Treasury as the department responsible for financial services, given the Government’s commitment to open banking and open finance. There is no precedence between the Secretary of State and the Treasury when using these powers, as regulations are likely to be made by the department responsible for the sector to which the smart data scheme applies, following, as with other regulations, the appropriate cross-government write-round and collective agreement procedures. I add that interdepartmental discussions are overseen by the Smart Data Council, which will give advice on this issue.
The noble Viscount raises concerns relating to Clause 13. Just as regulations may be made by the relevant government department, it is most appropriate for financial assistance to be provided by the government department responsible for the smart data scheme in question. Clause 13 is intended to provide statutory authority for that assistance, as a matter of regularity. It is for these reasons that I urge the noble Viscount not to press these amendments. These are standard procedures where the Treasury is involved and that is why more than one department is referenced.
I thank the Minister for that explanation. I am pleased to hear that these are standard procedures. Will she put that in writing, in a letter to me, explaining and setting it out so that we have it on the record? It is really important to understand where the decisions break down and to have a single point of accountability for all such decisions and, if it cannot be in the Bill, it could at least be explained elsewhere. Otherwise, I am happy to proceed with the explanation that she has kindly given.
I thank my noble friends Lord Lucas and Lord Arbuthnot for their Amendments 5, 34, 48, 200 and 202. They and other noble Lords who have spoken have powerfully raised some crucial issues in these amendments.
Amendment 5 addresses a key gap, and I take on board what my noble friend Lord Markham said, in how we manage and use customer data in specific contexts. At its heart, it seeks to enable effective communication between organisations holding customer data and customers themselves. The ability to communicate directly with individuals in a specified manner is vital for various practical reasons, from regulatory compliance to research purposes.
One clear example of where this amendment would be crucial is in the context of the Student Loans Company. Through this amendment, the Secretary of State could require the SLC to communicate with students for important purposes, such as conducting research into the outcomes of courses funded by loans. For instance, by reaching out to students who have completed their courses, the SLC could gather valuable insights into how those qualifications have impacted on their employment prospects, income levels or career trajectories. This is the kind of research that could help shape future educational policies, ensuring that loan schemes are working as intended and that the investments made in students’ education are yielding tangible benefits. This, in turn, would allow for better decision-making on future student loans funding and educational opportunities.
Amendment 34 from my noble friend Lord Arbuthnot proposes a welcome addition to the existing clause, specifically aiming to ensure that public authorities responsible for ascertaining key personal information about individuals are reliable in their verification processes and provide clear, accurate metadata on that information. This amendment addresses the essential issue of trust and reliability in the digital verification process. We increasingly rely on digital systems to confirm identity, and for these systems to be effective, we have to make sure that the core information they are verifying is accurate and consistent. If individuals’ key identifying details—date of birth, place of birth and, as we heard very powerfully, sex at birth—are not consistently or accurately recorded across various official databases, it undermines the integrity of the digital verification process. It is important that we have consistency across the public authorities listed in this amendment. By assessing whether these bodies are accurately verifying and maintaining this data, we can ensure uniformity in the information they provide. This consistency is essential for establishing a reliable foundation for digital verification.
When we consider the range of public services that rely on personal identification information, from the NHS and His Majesty’s Revenue and Customs to the Home Office, they are all responsible for verifying identity in some capacity. The amendment would ensure that the data they are using is robust, accurate and standardised, creating smoother interactions for individuals seeking public services. It reduces the likelihood of discrepancies that delay or prevent access to public services.
Amendment 48 would introduce important protections for the privacy and integrity of personal information disclosed by public authorities. In our increasingly digital world, data privacy has become one of the most pressing concerns for individuals and for society. By requiring public authorities to attest to the accuracy, integrity and clarity of the data they disclose, the amendment would help to protect the privacy of individuals and ensure that their personal information was handled with the proper care and respect.
My noble friend Lord Lucas’s Amendment 200 would introduce a data dictionary. It would allow the Secretary of State to establish regulations defining key terms used in digital verification services, birth and death registers, and public data more generally. I heard clearly the powerful arguments about sex and gender, but I come at the issue of data dictionaries from the angle of the efficiency, effectiveness and reusability of the data that these systems generate. The more that we have a data dictionary defining the metadata, the more we will benefit from the data used, whichever of these bodies generates the data itself. I am supportive of the requirement to use a data dictionary to provide standardised definitions in order to avoid confusion and ensure that data used in government services is accurate, reliable and consistent. The use of the negative resolution procedure would ensure that Parliament had oversight while allowing for the efficient implementation of these definitions.
Amendment 202 would create a national register for school admissions rules and outcomes in England. This would be a crucial step towards increasing transparency and ensuring fairness in the school admissions process, which affects the lives of millions of families every year. We want to ensure that navigating the school admissions system is not overly opaque and too complex a process for many parents. With different schools following different rules, criteria and procedures, it can, as my noble friend, Lord Lucas, pointed out, be difficult for families to know what to expect or how best to make informed decisions. The uncertainty can be especially challenging for those who are new to the system, those who face language barriers or those in areas where the school’s rules are not readily accessible or clear.
For many parents, particularly those in areas with complex school systems or scarce school places, access to clear, consistent information can make all the difference. This amendment would allow parents to see exactly how the school admissions process works and whether they were likely to secure a place at their preferred school. By laying out the rules in advance, the system would ensure that parents could make better informed decisions about which schools to apply to, based on criteria such as proximity, siblings or academic performance.
We want to ensure that parents understand how decisions are made and whether schools are adhering to the rules fairly. By requiring all schools to publish their admissions rules and the outcomes of their admissions process, the amendment would introduce a level of accountability. I join other noble Lords in strongly supporting this amendment, as it would create a more effective and efficient school admissions system that works for everyone.
My Lords, we have had a good and wide-ranging discussion on all this. I will try to deal with the issues as they were raised.
I thank the noble Lord, Lord Lucas, for the proposed Amendment 5 to Clause 2. I am pleased to confirm that the powers under Clauses 2 and 4 can already be used to provide customer data to customers or third parties authorised by them, and for the publication or disclosure of wider data about the goods or services that the supplier provides. The powers provide flexibility as to when and how the data may be provided or published, which was in part the point that the noble Viscount, Lord Camrose, was making. The powers may also be used to require the collection and retention of specific data, including to require new data to be gathered by data holders so that this data may be made available to customers and third parties specified by regulations.
I note in particular the noble Lord’s interest in the potential uses of these powers for the Student Loans Company. It would be for the Department for Education to consider whether the use of the smart data powers in Part 1 of the Bill may be beneficial in the context of providing information about student loans and to consult appropriately if so, rather than to specify it at this stage in the Bill. I hope the noble Lord will consider those points and how it can best be pursued with that department in mind.
On Amendments 34, 48 and 200, the Government believe that recording, storing and sharing accurate data is essential to deliver services that meet citizens’ needs. Public sector data about sex and gender is collected based on user needs for data and any applicable legislation. As noble Lords have said, definitions and concepts of sex and gender differ.
Amendment 48 would require that any information shared must be accurate, trusted and accompanied by meta data. Depending on the noble Lord’s intentions here, this could either duplicate existing protections under data protection legislation or, potentially, conflict with them and other legal obligations.
The measures in Part 2 of the Bill are intended to secure the reliability of the process by which citizens verify their data. It is not intended to create new ways to determine a person’s sex or gender but rather to allow people to digitally verify the facts about themselves based on documents that already exist. It worries me that, if noble Lords pursued their arguments, we could end up with a passport saying one thing and a digital record saying something different. We have to go back to the original source documents, such as passports and birth certificates, and rely on them for accuracy, which would then feed into the digital record—otherwise, as I say, we could end up pointing in two different directions.
I reassure the noble Lord, Lord Arbuthnot, that my colleague, Minister Clark, is due to meet Sex Matters this week to discuss digital verification services. Obviously, I am happy to encourage that discussion. However, to prescribe where public authorities can usefully verify “sex at birth”, as noble Lords now propose, extends well beyond the scope of the measures in the Bill, so I ask them to reflect on that and whether this is the right place to pursue those issues.
In addition, the Government recently received the final report of the Sullivan review of data, statistics and research on sex and gender, which explores some of these matters in detail. These matters are more appropriately considered holistically—for example, in the context of that report—rather than by a piecemeal approach, which is what is being proposed here. We are currently considering our response to that report. I hope noble Lords will consider that point as they consider their amendments; this is already being debated and considered elsewhere.
Amendment 202 seeks to create a national register of individual school admissions arrangements and outcomes, which can be used to provide information to parents to help them understand their chances of securing a place at their local school. I agree with the noble Lord that choosing a school for their child is one of the most important decisions that a parent can make. That is why admissions authorities are required to publish admission arrangements on their schools’ websites. They must also provide information to enable local authorities to publish an annual admissions prospectus for parents, including admissions arrangements and outcomes for all state schools in their area.
I refer the noble Lord, Lord Lucas, to the School Information (England) Regulations 2008, which require admission authorities and local authorities to publish prescribed information relating to admissions. Those protections are already built into the legislation, and if a local authority is not complying with that, there are ways of pursuing it. We believe that the existing approach is proportionate, reflects the diversity of admissions arrangements and local circumstances, and is not overly burdensome on schools or local authorities, while still enabling parents to have the information they need about their local schools.
I hope that, for all the reasons I have outlined, noble Lords will be prepared not to press their amendments.
My Lords, I am delighted that the Government have chosen to take forward the smart data schemes from the DPDI Bill. The ability seamlessly to harness and use data is worth billions to the UK economy. However, data sharing and the profit that it generates must be balanced against proper oversight.
Let me start by offering strong support to my noble friend Lord Arbuthnot’s Amendment 7. Personally, I would greatly welcome a more sophisticated and widespread insurance market for cyber protections. Such a market would be based on openly shared data; the widespread publication of that data, as set out in the amendment, could help to bring this about.
I also support in principle Amendments 8 and 10 in the name of the noble Lord, Lord Clement-Jones, because, as I set out on the previous group, there is real and inherent value in interoperability. However, I wonder whether the noble Lord might reconsider the term “machine readable” and change it to something—I do not think that I have solved it—a bit more like “digitally interoperable”. I just worry that, in practice, everything is machine readable today and the term might become obsolete. I am keen to hear the Minister’s response to his very interesting Amendment 31 on the compulsion of any person to provide data.
I turn to the amendments in my name. Amendment 16 would insert an appeals mechanism for a person who is charged a fee under subsection (1). It is quite reasonable that persons listed under subsection (2)—that is, data holders, decision-makers, interface bodies, enforcers and others with duties or powers under these regulations—may charge a fee for the purposes of meeting the expenses they incur in performing duties or exercising powers imposed by regulations made under this part. However, there should be an appeals mechanism so that, in the event that a person is charged an unreasonable fee, they have a means of recourse.
Amendment 17 is a probing amendment intended to explore the rate at which interest accrues on money owed to specific public authorities for unpaid levies. Given that this interest will be mandated by law, do the Government intend to monitor the levels and, if so, how?
Amendment 18 is a probing amendment designed to explore how the Government intend to deal with a situation when a person listed under subsection (2) of this clause believes they have been charged a levy wrongly. Again, it is reasonable that an appeals mechanism be created, and this would ensure that those who considered themselves to have been wrongly charged have a means of recourse.
Amendment 19 seeks clarification on how the Government envisage unpaid levies being recovered. I would be grateful if the Minister could set out some further detail on that matter.
Amendment 21 is a probing amendment. I am curious to know the maximum value of financial assistance that the Government would allow the Secretary of State or the Treasury to give to persons under Clause 13. I do not think it would be prudent for the Government to become a financial backstop for participants in smart data schemes, so on what basis is that maximum going to be calculated?
Amendment 22 follows on from those concerns and looks to ensure that there is parliamentary oversight of any assistance provided. I am most curious to hear the Minister’s comments on this matter.
Amendment 23 is a straightforward—I think—amendment to the wording. I feel that the phrase “reasonably possible” seems to open the door to almost limitless endeavours and therefore suggest replacing it with “reasonably practicable”.
On Amendment 25, easy access to the FCA’s policy regarding penalties and levies is important. That would allow oversight not only by Parliament but by those who are directly or indirectly affected by decisions taken under this policy. I therefore believe the amendment is necessary, as a website is the most accessible location for that information. Furthermore, regular review is necessary to ensure that the policy is functioning and serving its purpose.
Amendments 26 and 27 return to the matter of an appeals process. I will not repeat myself too much, but it is important to be able to appeal penalties and to create a route by which individuals understand how they can go about doing so.
Amendment 28 would ensure that, when the Secretary of State and the Treasury review the regulations made under Part 1 of the Bill, they do so concurrently. This amendment would prevent separate reviews being conducted that may contradict each other or be published at different times; it would force the relevant departments to produce one review and to produce it together. This would be prudent. It would prevent the Government doing the same work twice and unnecessarily spending public money, and would prevent contradictory reviews, which may cause confusion and financial costs in the smart data scheme industry.
Lastly, Amendment 29, which would ensure that Section 10 of this part was subject to the affirmative procedure, would allow for parliamentary oversight of regulations made under this clause.
We are pleased that the Government have chosen to bring smart data schemes forward, but I hope the Minister can take my concerns on board and share with us some of the detail in her response.
My Lords, we have had a detailed discussion, and it may be that I will not be able to pick up all the points that noble Lords have raised. If I do not, I guarantee to write to people.
First, I want to pick up the issues raised by the noble Lord, Lord Arbuthnot, and the noble Baroness, Lady Neville-Jones, about cybersecurity and cyber resilience. This Government, like previous Governments, take this issue hugely seriously. It is built into all our thinking. The noble Lord, and the noble Baroness in particular, will know that the advice we get on all these issues is top class. The Government are already committed to producing a cybersecurity and resilience Bill within this Parliament. We have all these things in hand, and that will underpin a lot of the protections that we are going to have in this Bill and others. I agree with noble Lords that this is a hugely important issue.
I am pleased to confirm that Clause 3(7) allows the regulations to impose requirements on third-party recipients in relation to the processing of data, which will include security-related requirements. So it is already in the Bill, but I assure noble Lords that it will be underpinned, as I say, by other legislation that we are bringing forward.
In relation to Amendments 8 and 10, I am pleased to confirm that Clause 5(4) clarifies that regulations may make provision about the providing or publishing of business data and the format in which that must be provided. That may include relevant energy-related data. The noble Lord gave some very good examples about how useful those connections and that data could be; he was quite right to raise those issues.
Regarding Amendment 9, in the name of the noble Lord, Lord Clement-Jones, I am pleased to confirm that there is nothing to prevent regulations requiring the provision of business data to government departments, publicly owned bodies and local and regional authorities. This is possible through Clause 4(1)(b), which allows regulations to require provision of business data to a person of a specified description. I hope the noble Lord will look at those cross-references and be satisfied by them.
Noble Lords spoke about the importance of sensitive information in future smart data schemes. A smart data scheme about legal services is not currently under consideration. Having said that, the Government would have regard to the appropriateness of such a scheme and the nature of any data involved and would consult the sector and any other appropriate stakeholders if that was being considered. It is not at the top of our list of priorities, but the noble Lord might be able to persuade us that it would have some merit, and we could start a consultation based on that.
Amendments 16 to 22 consider fees and the safeguards applying to them, which were raised by the noble Viscount. Fees and levies, enabled by Clauses 11 and 12, are an essential mechanism to fund a smart data scheme. The Government consider that appropriate and proportionate statutory safeguards are already built in. For example, requirements in Clause 11(3) and Clause 12(2) circumscribe the expenses in relation to which fees or the levy may be charged, and the persons on whom they may be charged.
Capping the interest rate for unpaid money, which is one of the noble Viscount’s proposals, would leave a significant risk of circumstances in which it might be financially advantageous to pay the levy late. The Government anticipate that regulations would provide an appropriate mechanism to ensure that any amount payable in the event of late payment is reasonable. Just as regulations may be made by the relevant government department, it is most appropriate for financial assistance to be provided by the government department responsible for the smart data scheme in question. Clause 13 is intended to provide statutory authority for that assistance as a matter of regularity.
Amendments 23 to 27 deal with the clauses relating to the FCA. Clause 15(3) is drafted to be consistent with the wording of established legislation which confers powers on the FCA, most notably the Financial Services and Markets Act 2000. Section 1B of that Act uses the same formulation, using the phrase
“so far as is reasonably possible”
in relation to the FCA’s general duties. This wording is established and well understood by both the FCA and the financial services sector as it applies to the FCA’s strategic and operational objectives. Any deviation from it could create uncertainty and inconsistency.
Amendment 24 would cause significant disruption to current data-sharing arrangements and fintech businesses. Reauthenticating this frequently with every data holder would add considerable friction to open banking services and greatly degrade the user experience—which was the point raised by the noble Lord, Lord Clement-Jones. For example, it is in the customer’s interest to give ongoing consent to a fintech app to provide them with real-time financial advice that might adapt to daily changes in their finances.
Many SMEs provide ongoing access to their bank accounts in order to receive efficient cloud accounting services. If they had to re-register frequently, that would undermine the basis and operability of some of those services. It could inhibit the adoption and viability of open banking, which would defeat one of the main purposes of the Bill.
My Lords, this sequence of amendments is concerned with the publication and availability of guidance. Decision-makers are individuals responsible for deciding if a person has satisfied the conditions for authorisation to receive customer or business data. They may publish guidance on how they intend to exercise their functions. Given the nature of these responsibilities, these individuals are deciding who can receive information pertaining to individuals and businesses. The guidelines which set out how decisions are taken should be easily accessible and the best place for this is on their websites.
Following on from this point, Amendment 12 would require this guidance to be reviewed annually and any changes to be published, again on decision-makers’ websites, at least 28 days before coming into effect. This would ensure that the guidelines are fit for purpose and provide ample time for people affected by these changes to review them and act accordingly.
Amendments 13 and 14 seek to create similar requirements for enforcers—that is, a public authority authorised to carry out monitoring or enforcement of regulations under this part. Again, given the nature of these responsibilities, the guidelines should be easily accessible on the enforcer’s website and reviewed annually, with any changes published, again on their website, at least 28 days before coming into effect. This will, once again, ensure that the guidelines are fit for purpose and provide ample time for people affected by these changes to review them and act accordingly.
Finally, Amendment 15 would require the Secretary of State or the Treasury to provide guidance on who may be charged a fee under Clause 6(1) and to review it annually. Ensuring the regular review of guidelines will ensure their effectiveness, and the ready availability of guidelines will ensure that they are used and observed. I therefore believe that these amendments will be of benefit to the functioning of the Bill and should be given consideration by the Minister.
My Lords, I thank the noble Viscount, Lord Camrose, for those amendments. I will cover the final group of amendments to Part 1, dealing with smart data guidance.
On Amendments 11, 12, 13 and 14, which relate to the publishing of the guidelines, I am pleased to confirm that Clause 5(4) clarifies that regulations may make provisions about the providing or publishing of business data. This includes the location where they should be published, including, as the noble Viscount suggests, the website of the responsible person.
Furthermore, Clause 21 clarifies that regulations may make provision about the form and manner in which things must be done. That provision can be used to establish appropriate processes around the sharing of information and guidance, including its regular update, publication and sharing with the relevant person.
Amendment 15 refers to the amount of fee charged and how it should be determined. The power is already broad enough to allow the information to be reviewed as and when necessary, but to mandate that the review must take place at least once a year may be a bit restrictive. For these reasons, I ask the noble Viscount not to press his amendments.
I thank the noble Lord for his answers. I understand what he says, although I would be grateful if either he or the noble Baroness, Lady Jones, could summarise those points in writing because I did not quite capture them all. If I understand correctly, all the concerns that we have raised are dealt with in other areas of the Bill, but if they could write to me then that would be great. I beg leave to withdraw the amendment.
In an act that I hope he is going to repeat throughout, the noble Lord, Lord Clement-Jones, has fully explained all the amendments that I want to support, so I put on record that I agree fully with all the points he made. I want to add just one or two other points. They are mainly in the form of questions for the Minister.
Some users are more vulnerable to harms than others, so Amendment 33 would insert a new subsection 2B which mentions redress. What do the Government imagine for those who may be more vulnerable and how do they think they might use this system? Obviously, I am thinking about children, but there could be other categories of users, certainly the elderly.
That led me to wonder what consideration has been given to vulnerable users more generally and how that is being worked through. That led me to question exactly how this system is going to interact with the age-assurance work that the IC is doing as a result of the Online Safety Act and make sure that children are not forced into a position where they have to show their identity in order to prove their age or, indeed, cannot prove their identity because they have been deemed to have been dealt with elsewhere in another piece of legislation. Because, actually, children do open bank accounts and do have to have certain sorts of ID.
That led me to ask what in the framework prevents service providers giving more information than is required. I have read the Bill; someone said earlier that it is skeletal. From what we know, you can separate pieces of information, or attributes, from each other, but what is to stop a service provider failing to do so? This is absolutely crucial to the trust in and workings of this system, and it leads me to the inverse, Amendment 46, which asks how we can prevent this system being forced and thrust upon people. As the noble Lord, Lord Clement-Jones, set out, we need to make sure that people have the right not to use the system as well as the right to use it.
Finally, I absolutely agree with the noble Viscount, Lord Colville, and the amendment in the name of the noble Viscount, Lord Camrose: something this fundamental must come back to Parliament. With that, I strongly associate myself with the words of the noble Lord, Lord Clement-Jones, on all his amendments.
I thank noble Lords for their comments and contributions in what has been an absolutely fascinating debate. I have a couple of points to make.
I agree with the noble Lord, Lord Clement-Jones, on his Amendment 33, on ongoing monitoring, and his Amendment 50. Where we part company, I think, is on his Amendment 36. I feel that we will never agree about the effectiveness or otherwise of five-year strategies, particularly in the digital space. I simply do not buy that his amendment will have the effects that the noble Lord wants.
I do not necessarily agree with the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, that we should put extra burdens around the right to use non-digital methods. In my opinion, and I very much look forward to hearing from the Minister on this matter, the Act preserves that right quite well as it is. I look forward to the Government’s comments on that.
I strongly support the noble Viscount, Lord Colville, on his very important point about international standards. I had intended to sign his amendment but I am afraid that, for some administrative reason, that did not happen. I apologise for that, but I will sign it because I think that it is so important. In my opinion, not much of the Bill works in the absence of effective international collaboration around these matters. This is so important. We are particularly going to run up against this issue when we start talking about ADM, AI and copyright issues. It is international standards that will allow us to enforce any of the provisions that we put in here, so they are so important. I am more agnostic on whether this will happen via W3C, the ITU or other international standards bodies, but we really must go forward with the principle that international standards are what will get us over the line here. I look forward to hearing the Minister’s confirmation of the importance, in the Government’s view, of such standards.
Let me turn to the amendments listed in my name. Amendment 37 would ensure parliamentary oversight of the DVS trust framework. Given the volume of sensitive data that these services providers will be handling, it is so important that Parliament can keep an eye on how the framework operates. I thank noble Lords for supporting this amendment.
Amendment 40 is a probing amendment. To that end, I look forward to hearing the Minister’s response. Accredited conformity assessment bodies are charged with assessing whether a service complies with the DVS framework. As such, they are giving a stamp of approval from which customers will draw a sense of security. Therefore, the independence of these accreditation bodies must be guaranteed. Failing to do so would allow the industry to regulate itself. Can the Minister set out how the Government will guarantee the independence of these accreditation bodies?
Amendment 49 is also a probing amendment. It is designed to explore the cybersecurity measures that the Government expect of digital verification services. Given the large volume of data that these services will be handling, it is essential that the Government demand substantial cybersecurity measures. This is a theme that we are going to come back to again and again; we heard about it earlier, and I think that we will come on to more of this. As these services become more useful and more powerful, they present a bigger attack surface that we have to defend, and I look forward to hearing how we will do that.
I thank the noble Lords, Lord Clement-Jones and Lord Markham, the noble Viscount, Lord Colville, and the noble Baroness, Lady Kidron, for raising these topics around digital verification services. As I explained at Second Reading, these digital verification services already exist. They are already out there making all sorts of claims for themselves. With the new trust framework, we are trying to provide some more statutory regulation of the way that they operate. It is important that we have this debate and that we get it right, but some of the things we are doing are still work in progress, which is why we do not always have all the detailed answers that noble Lords are searching for here and why some powers have been left to the Secretary of State.
I shall go from the top through the points that have been raised. Amendments 33 and 43, tabled by the noble Lord, Lord Clement-Jones, and Amendment 40 tabled by the noble Viscount, Lord Colville, would require the trust framework to include rules on monitoring compliance and redress mechanisms and would require the Secretary of State to ensure the independence of accredited conformity assessment bodies. The noble Baroness, Lady Kidron, asked questions akin to those regarding redress for the vulnerable, and I will write to her setting out a response to that in more detail.
On the issue of redress mechanisms in the round, the scope of the trust framework document is solely focused on the rules that providers of digital verification services are required to follow. It does not include matters of governance. Compliance is ensured via a robust certification process where services are assessed against the trust framework rules. They are assessed by independent conformity assessment bodies accredited by the United Kingdom Accreditation Service, so some oversight is already being built into this model.
The Bill contains powers for the Secretary of State to refuse applications to the DVS register or to remove providers where he is satisfied that the provider has failed to comply with the trust framework or if he considers it necessary in the interests of national security. These powers are intended as a safety net—for example, to account for situations where the Secretary of State might have access to intelligence sources that independent conformity assessment bodies cannot access and therefore cannot react to, or where a particular failure of the security of one of these trust marks comes to light very quickly and we want to act very quickly against it. That is why the Secretary of State has those powers to be able to react quickly in what might be a national security situation or some other potential leak of important data and so on.
In addition, conformity assessment bodies carry out annual surveillance audits and can choose to conduct spot audits on certified providers, and they have the power to withdraw certification where non-conformities are found. Adding rules on compliance would cut across that independent certification process and would be outside the scope of the trust framework. Those independent certification processes already exist.
Amendments 33, 41, 42, 44 and 45 tabled by the noble Lord, Lord Clement-Jones, would in effect require the creation of an independent appeals body to adjudicate on the refusal of an application to the DVS register and the implementation of an investigatory process applicable to refusal and removal from the DVS register. The powers of the Secretary of State in this regard are not without safeguards. They may be exercised only in limited circumstances after the completion of an investigatory process and are subject to public law principles, for example, reasonableness. They may also be challenged by judicial review.
To go back to the point I was making, it might be something where we would need to move quickly. Rather than having a convoluted appeals process in the way that the noble Lord was talking about, I hope he understands the need sometimes for that flexibility. The creation and funding of an independent body to adjudicate such a limited power would therefore be inappropriate.
It would be reassuring if the Minister could share with us some of the meetings that the Secretary of State or Ministers are having with those bodies on the subject of these internationally shared technical standards.
I might need to write to the noble Viscount, but I am pretty sure that that is happening at an official level on a fairly regular basis. The noble Viscount raises an important point. I reassure him that those discussions are ongoing, and we have huge respect for those international organisations. I will put the detail of that in writing to him.
I turn to Amendment 37, tabled by the noble Viscount, Lord Camrose, which would require the DVS trust framework to be laid before Parliament. The trust framework contains auditable rules to be followed by registered providers of digital verification services. The rules, published in their third non-statutory iteration last week on GOV.UK, draw on and often signpost existing technical requirements, standards, best practice, guidance and legislation. It is a hugely technical document, and I am not sure that Parliament would make a great deal of sense of it if it was put forward in its current format. However, the Bill places consultation on a statutory footing, ensuring that it must take place when the trust framework is being prepared and reviewed.
Amendments 36 and 38, tabled by the noble Lord, Lord Clement-Jones, would create an obligation for the Secretary of State to reconsult and publish a five-year strategy on digital verification services. It is important to ensure that the Government have a coherent strategy for enabling the digital verification services market. That is why we have already consulted publicly on these measures, and we continue to work with experts. However, given the nascency of the digital identity market and the pace of those technological developments, as the noble Viscount, Lord Camrose, said, forecasting five years into the future is not practical at this stage. We welcome scrutiny through the annual report, which we are committed to publishing, as required by Clause 53. This report will support transparency through the provision of information, including performance data regarding the operation of Part 2.
Amendment 39, also tabled by the noble Lord, Lord Clement-Jones, proposes to exclude certified public bodies from registering to provide digital verification services. We believe that such an exclusion could lead to unnecessary restrictions on the UK’s young digital verification market. The noble Lord mentioned the GOV.UK One Login programme, which is aligned with the standards of the trust framework but is a separate government programme which gives people a single sign-on service to access public services. It operates its services under different legal powers from those being proposed here. We do not accept that we need to exclude public bodies from the scrutiny that would otherwise take place.
Amendment 46 seeks to create a duty for organisations that require verification and use digital verification for that purpose to offer, where reasonably practicable, a non-digital route and ensure that individuals are made aware of both options for verification. I should stress here that the provision in the Bill relates to the provision of digital verification services, not requirements on businesses in general about how they conduct verification checks.
Ensuring digital inclusion is a priority for this Government, which is why we have set up the digital inclusion and skills unit within DSIT. Furthermore, there are already legislative protections in the Equality Act 2010 in respect of protected groups, and the Government will take action in the future if evidence emerges that people are being excluded from essential products and services by being unable to use digital routes for proving their identity or eligibility.
The Government will publish a code of practice for disclosure of information, subject to parliamentary review, highlighting best practice and relevant information to be considered when sharing information. As for Amendment 49, the Government intend to update this code only when required, so an annual review process would not be necessary. I stress to the Committee that digital verification services are not going to be mandatory. It is entirely voluntary for businesses to use them, so it is up to individuals whether they use that service or not. I think people are feeling that it is going to be imposed on them, and I would push back against that suggestion.
If the regulation-making power in Amendment 50 proposed by the noble Lord, Lord Clement-Jones, was used, it would place obligations on the Information Commissioner to monitor the volume of verification checks being made, using the permissive powers to disclose information created in the clause. The role of the commissioner is to regulate data protection in the UK, which already includes monitoring and promoting responsible data-sharing by public authorities. For the reasons set out above, I hope that noble Lords will feel comfortable in not pressing their amendments.
My Lords, Amendment 47 is in another slightly peculiar group, but we will persevere. It aims to bolster the cybersecurity framework for digital verification services providers. Needless to say, as we continue to advance in the digital age, it is vital that our online systems, especially those handling sensitive information, are protected against ever-evolving cyberthreats. As DVSs gain in currency and usage, the incentive for cyberattackers to target them and try to take advantage grows. They need to be protected.
The proposed amendment therefore mandates the creation and regular review of cybersecurity rules for all DVS providers. These rules are designed to ensure that services involved in verifying identities and other critical data maintain the highest standards of protection, resilience and trustworthiness, consonant with their importance and with the seriousness of any breach of that data.
We could hardly be more aware that we live in an increasingly digital world where almost every aspect of our lives is connected online. Digital verification services play a key role in this landscape, and that role is going to increase. They are used by individuals and organisations to confirm identities, authenticate transactions and verify data. These services underpin critical areas, such as banking, healthcare and public services, where security is paramount. However, as the cyberthreat landscape becomes more sophisticated, so does the need for robust security measures to protect these services. Hackers and malicious actors are continuously developing new ways to exploit vulnerabilities in digital systems. This puts personal data, business operations and even national security at risk.
A security breach in a digital verification system could have devastating consequences not only for the immediate victims but for the reputation and integrity of the service providers. That is why we on these Benches feel that the proposed amendment is absolutely critical. It would ensure that all DVS providers are held to a high, standardised set of cybersecurity practices. This would not only reduce the risk of cyberthreats but build greater public trust in the safety and reliability of those services and, therefore, enhance their uptake.
One of the key aspects of the amendment is the requirement for the cybersecurity rules to be reviewed annually. This is especially important in the context of the rapid evolution of the cyberthreats that we face. Technologies, attack methods and vulnerabilities are constantly changing, and what is secure today may not be secure tomorrow. By reviewing the cyber rules every year, we will ensure that they remain current and effective in protecting against the latest threats. I beg to move.
I support that. I completely agree with all the points that the noble Lord, Lord Clement-Jones, made on the previous groupings, but the one that we all agree is absolutely vital is the one just brought up by my noble friend. Coming from the private sector, I am all in favour of a market—I think that it is the right way to go—but standards within that are equally vital.
I come at this issue having had the misfortune of having to manage the cyberattack that we all recall happening against our diagnostic services in hospitals last summer. We found that the weakest link there was through the private sector supplier to that system, and it became clear that the health service—or cybersecurity, or whoever it was—had not done enough to make sure that those standards were set, published and adhered to effectively.
With that in mind, and trying to learn the lessons from it, I think that this clause is vital in terms of its intent, but it will be valuable only if it is updated on a frequent basis. In terms of everything that we have spoken about today, and on this issue in particular, I feel that that point is probably the most important. Although everything that we are trying to do is a massive advance in terms of trying to get the data economy to work even better, I cannot emphasise enough how worrying that attack on our hospitals last summer was at the time.
I thank both noble Lords for raising this; I absolutely concur with them on how important it is. In fact, I remember going to see the noble Viscount, Lord Camrose, when he was in his other role, to talk about exactly this issue: whether the digital verification services were going to be robust enough against cyberattacks.
I pray in aid the noble Lord, Lord Arbuthnot, and the noble Baroness, Lady Neville-Jones, who both felt that the new Cyber Security and Resilience Bill will provide some underpinning for all of this, because our Government take this issue very seriously. As the Committee can imagine, we get regular advice from the security services about what is going on and what we need to do to head it off. Yes, it is a difficult issue, but we are doing everything we can to make sure that our data is safe; that is fundamental.
Amendment 47 would require the Secretary of State to prepare and publish rules on cybersecurity for providers to follow. The existing trust framework includes rules on cybersecurity, against which organisations will be certified. Specifically, providers will be able to prove either that they meet the internationally recognised information security standards or that they have a security management system that matches the criteria set out in the trust framework.
I assure noble Lords that the Information Commissioner’s Office, the National Cyber Security Centre and other privacy stakeholders have contributed to the development of the trust framework. This includes meeting international best practice around encryption and cryptology techniques. I will happily write to noble Lords to reassure them further by detailing the range of protections already in place. Alternatively, if noble Lords here today would benefit from an official technical briefing on the trust framework, we would be delighted to set up such a meeting because it is important that we all feel content that this will be a robust system, for exactly the reasons that the noble Lord, Lord Markham, explained. We are absolutely on your Lordships’ side and on the case on all this; if it would be helpful to have a meeting, we will certainly do that.
I thank the Minister and my noble friend Lord Markham for those comprehensive and welcome comments. I would certainly like to take up the Minister’s offer of a technical briefing on the trust framework; that really is extremely important.
To go briefly off-piste, one sign that we are doing this properly will be the further development of an insurance marketplace for cybersecurity. It exists but is not very developed at the moment. As and when this information is regularly published and updated, we will see products becoming available that allow people to take insurance based on known risks around cybersecurity.
As I say, I take comfort from the Minister’s words and look forward to attending the tech briefing. When it comes, the cyber Bill will also play a serious role in this space and I look forward to seeing how, specifically, it will interact with DVS and the other services that we have been discussing and will continue to discuss. I beg leave to withdraw my amendment.
My Lords, I support these amendments and applaud the noble Lord, Lord Clement-Jones, for his temerity and for offering a variety of choices, making it even more difficult for my noble friend to resist it.
It has puzzled me for some time why the Government do not wish to see a firm line being taken about digital theft. Identity theft in any form must be the most heinous of crimes, particularly in today’s world. This question came up yesterday in an informal meeting about a Private Member’s Bill due up next Friday on the vexed question of the sharing of intimate images and how the Government are going to respond to it. We were sad to discover that there was no support among the Ministry of Justice officials who discussed the Bill with its promoter for seeing it progress any further.
At the heart of that Bill is the same question about what happens when one’s identity is taken and one’s whole career and personality are destroyed by those who take one’s private information and distort it in such a way that those who see it regard it as being a different person or in some way involved in activities that the original person would never have been involved in. Yet we hear that the whole basis on which this digital network has been built up is a voluntary one, and the logic of that is that it would not be necessary to have the sort of amendments that are before us now.
I urge the Government to think very hard about this. There must be a break point here. Maybe the meeting that has been promised will help us, but there is a fundamental point about whether in the digital world we can rely on the same protections that we have in the real world—and, if not, why not?
My Lords, I will address the amendments proposed by the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron. I have nothing but the deepest respect for their diligence, and indeed wisdom, in scrutinising all three flavours of the Bill as it has come out, and for their commitment to strengthening the legislative framework against fraud and other misuse of digital systems. However, I have serious reservations about the necessity and proportionality of the amendments under consideration, although I look forward to further debates and I am certainly open to being convinced.
Amendments 51 and 52 would introduce criminal sanctions, including imprisonment, for the misuse of trust marks. While the protection of trust marks is vital for maintaining public confidence in digital systems, I am concerned that introducing custodial sentences for these offences risks overcriminalisation. The misuse of trust marks can and should be addressed through robust civil enforcement mechanisms. Turning every such transgression into a criminal matter would place unnecessary burdens on, frankly, an already strained justice system and risks disproportionately punishing individuals or small businesses for inadvertent breaches.
Furthermore, the amendment’s stipulation that proceedings could be brought only by or with the consent of the Director of Public Prosecutions or the Secretary of State is an important safeguard, yet it underscores the high level of discretion required to enforce these provisions effectively, highlighting the unsuitability of broad criminalisation in this context.
Amendment 53 seeks to expand the definition of identity documents under the Identity Documents Act 2010 to include digital identity documents. While the noble Lord, Lord Clement-Jones, makes a persuasive case, the proposal raises two concerns. First, it risks pre-emptively criminalising actions before a clear and universally understood framework for digital identity verification is in place. The technology and its standards are still evolving, and it might be premature to embed such a framework into criminal law. Secondly, there is a risk that this could have unintended consequences for innovation in the digital identity sector. Businesses and individuals navigating this nascent space could face disproportionate legal risks, which may hinder progress in a field critical to the UK’s digital economy.
Amendment 54 would introduce an offence of knowingly or recklessly providing false information in response to notices under Clause 51. I fully support holding individuals accountable for deliberate deception, but the proposed measure’s scope could lead to serious ambiguities. What constitutes recklessness in this context? Are we inadvertently creating a chilling effect where individuals or businesses may refrain from engaging with the system for fear of misinterpretation or error? These are questions that need to be addressed before such provisions are enshrined in law.
We must ensure that our legislative framework is fit for purpose, upholds the principles of justice and balances enforcement with fairness. The amendments proposed, while they clearly have exactly the right intentions, risk, I fear, undermining these principles. They introduce unnecessary criminal sanctions, create uncertainty in the digital identity space and could discourage good-faith engagement with the regulatory system. I therefore urge noble Lords to carefully consider the potential consequences of these amendments and, while expressing gratitude to the noble Lords for their work, I resist their inclusion in the Bill.
My Lords, of course we want to take trust seriously. I could not agree more that the whole set of proposals is predicated on that. Noble Lords have all made the point, in different ways, that if there is not that level of trust then people simply will not use the services and we will not be able to make progress. We absolutely understand the vital importance of all that. I thank all noble Lords for their contributions on this and I recognise their desire to ensure that fraudulent use of the trust mark is taken seriously, as set out in Amendments 51 and 52.
The trust mark is in the process of being registered as a trademark in the UK. As such, once that is done, the Secretary of State will be able to take appropriate legal action for misuse of it. Robust legal protections are also provided through Clause 50, through the trademark protections, and through other existing legislative provisions, such as the Consumer Protection from Unfair Trading Regulations 2008. There is already legislation that underpins the use of that trust mark. Additionally, each trust mark will have a unique number that allows users to check that it is genuine. These amendments would duplicate those existing protections.
In seeking to make the misuse of a digital identity a criminal offence, which Amendments 53 and 209 attempt to do, the noble Lord offered me several different ways of approaching this, so I will offer him some back. The behaviour he is targeting is already addressed in the Fraud Act 2006, the Computer Misuse Act 1990 and the Data Protection Act 2018. We would argue that it is already covered by existing legislation.
On the noble Lord’s point about the Identity Documents Act 2010, defining every instance of verification as an identity document within the scope of offences in that Act could create an unclear, complicated and duplicative process for the prosecution of digital identity theft. The provision of digital verification services does not always create one single comprehensive identity proof—I think this is the point that the noble Viscount, Lord Camrose, was making. People use it in different ways. It might be a yes/no check to ensure that a person is over 18, or it might be a digital verification services provider providing several derived credentials that can be used in different combinations for different use cases. We have to be flexible enough to deal with that and not reduce it all to a single fraudulent act. It would not be appropriate to add digital identity to the list of documents set out in the Identity Documents Act.
Amendment 54 would create an offence of supplying false information to the Secretary of State, but sanctions already exist in this situation, as the organisation can be removed from the DVS register via the power in Clause 41. Similarly, contractual arrangements between the Office for Digital Identities and Attributes and conformity assessment bodies require them to adhere to the principle of truthfulness and accuracy. To create a new offence would be disproportionate when safeguards already exist. I take on board the intent and aims of the noble Lord, Lord Clement-Jones, but argue that there are already sufficient protections in current law and in the way in which the Bill is drafted to provide the reassurance that he seeks. Therefore, I hope that he feels comfortable in not pressing his amendment.
My Lords, I am confident that, somewhere, there is a moral philosopher and legal scholar who can explain why this amendment is not part of the next group on NUAR but, in the meantime, my amendment addresses a fundamental issue. It would ensure that strict security measures are in place before any individual or organisation is allowed access to the sensitive information held on the National Underground Asset Register. The NUAR is a crucial tool for managing the UK’s underground infrastructure. It holds critical data about pipelines, cables and other assets that underpin vital services such as water, energy, telecommunications and transport.
This information, while essential for managing and maintaining infrastructure, is also a potential target for misuse. As such, ensuring the security of this data is not just important but vital for the safety and security of our nation. The information contained in the NUAR is sensitive. Its misuse could have disastrous consequences. If this data were to fall into the wrong hands, whether through criminal activities, cyberattacks or terrorism, it could be exploited to disrupt or damage critical infrastructure. I know that the Government take these risks seriously but this amendment seeks to address them further by ensuring that only those with a legitimate need, who have been properly vetted and who have met specific security requirements can access this data. We must ensure that the people accessing this register are trusted individuals or organisations that understand the gravity of handling this sensitive information and are fully aware of the risks involved.
The amendment would ensure that we have a framework for security—one that demands that the Secretary of State introduces clear, enforceable regulations specifying the security measures that must be in place before anyone can access the NUAR. These measures may include: background checks to ensure that those seeking access are trustworthy and legitimate; cybersecurity safeguards to prevent unauthorised digital access or breaches; physical security measures to protect the infrastructure where this information is stored; and clear guidelines on who should be allowed access and the conditions under which they can view this sensitive data.
The potential threats posed by unsecured access to the NUAR cannot be overstated. Criminals could exploit this information to target and disrupt key infrastructure systems. Terrorist organisations could use it to plan attacks on essential services, endangering lives and causing mass disruption. The stakes are incredibly high; I am sure that I do not need to convince noble Lords of that. In an era where digital and physical infrastructure are increasingly interconnected, the risks associated with unsecured access to information of the kind held in the NUAR are growing every day. This amendment would address this concern head on by requiring that we implement safeguards that are both thorough and resilient to these evolving threats. Of course, the cyber Bill is coming, but I wonder whether we need something NUAR-specific and, if so, whether we need it in this Bill. I beg to move.
I thank the noble Viscount for raising the issue of the National Underground Asset Register’s cybersecurity. As he said, Amendment 55 seeks to require more detail on the security measures in the regulations that will be applied to the accessing of NUAR data.
The noble Viscount is right: it is absolutely fundamental that NUAR data is protected, for all the reasons he outlined. It hosts extremely sensitive data. It is, of course, supported by a suite of sophisticated security measures, which ensure that access to data by a tightly prescribed set of users is proportionate. I hope that the noble Viscount understands that we do not necessarily want to spell out what all those security measures are at this point; he will know well enough the sorts of discussions and provisions that go on behind the scenes.
Security stakeholders, including the National Cyber Security Centre and the National Protective Security Authority, have been involved in NUAR’s development and are members of its security governance board, which is a specific governance board overseeing its protection. As I say, access to it occurs on a very tight basis. No one can just ask for access to the whole of the UK’s data on NUAR; it simply is not geared up to be operated in that way.
We are concerned that the blanket provision proposed in the amendment would lead to the publication of detailed security postures, exposing arrangements that are not public knowledge. It could also curtail the Government’s ability to adapt security measures when needed and, with support from security stakeholders, to accommodate changing circumstances—or, indeed, changing threats—that we become aware of. We absolutely understand why the noble Viscount wants that reassurance. I can assure him that it is absolutely the best security system we could possibly provide, and that it will be regularly scrutinised and updated; I really hope that the noble Viscount can take that assurance and withdraw his amendment.
I thank the Minister for that answer. Of course, I take the point that to publish the security arrangements is somehow to advertise them, but I am somehow not yet altogether reassured. I wonder whether there is something that we can push further as part of a belt-and-braces approach to the NUAR security arrangements. We have talked about cybersecurity a lot this afternoon. All of these things tend to create additional incentives towards cyberattacks—if anything, NUAR does so the most.
If it helps a little, I would be very happy to write to the noble Viscount on this matter.
Yes, that would be great. I thank the Minister. I beg leave to withdraw my amendment.
I thank the noble Lord, Lord Clement-Jones, for these amendments. Amendment 46 is about NUAR and the requirement to perform consultation first. I am not convinced that is necessary because there is already a requirement to consult under Clause 60 and, perhaps more pertinently, NUAR is an industry-led initiative. It came out of an industry meeting and has been industry-led throughout. I am therefore not sure, even with the requirement to consult, that much is going to come out of that consultation exercise.
In respect of other providers out there, LSBUD among them, when we were going through this exact debate in DPDI days, the offer I made—and I ask the Minister if she would consider doing the same—was to arrange a demonstration of NUAR to anyone who had not seen it. I have absolutely unshakeable confidence that anybody who sees NUAR in action will not want anything else. I am not a betting man, but—
For the record, the noble Viscount is getting a vigorous nod from the Minister.
We will see, but such a demonstration would certainly ease any perfectly reasonable concerns that might emerge. To put it in a more colourful way, this is Netflix in the age of Blockbuster Video.
The slightly different Amendments 193, 194 and 195 clarify that these information standards should explicitly apply to IT providers involved in the processing of data within primary as well as secondary care, and that the standards must extend to existing contracts with providers, not just new agreements formed after this Act. I understand the point of these amendments but I am slightly concerned about how the retroactivity would affect existing contractual agreements. I am also slightly concerned about the wish to hard-code certain conditions into rules that function best the more they are principles-based and the less they are specifically related to particular areas of technology. That said, I think I am persuadable on it, but I have not yet made that leap.
I am not going to say much except to try to persuade my noble friend. I am absolutely with the intent of what the noble Lord, Lord Clement-Jones, is trying to do here and I understand the massive benefits that can be gained from it.
My Lords, there is a great deal to be gained from digitising the registers of births, stillbirths and deaths. Not only does it reduce the number of physical documents that need to be maintained and kept secure but it means that people do not have to physically sign the register of births or deaths in the presence of a registrar. This will make people’s lives a great deal easier during those stressful periods of their lives.
However, digitising all this data—I am rather repeating arguments I made about NUAR and other things earlier—creates a much larger attack surface for people looking to steal personal data. This amendment explores how the Government will protect this data from malign actors. If the Minister could provide further detail on this, I would be most grateful.
This is a probing amendment and has been tabled in a constructive spirit. I know that we all want to harness the power of data and tech in this space and use it to benefit people’s lives but, particularly with this most personal of data, we have to take appropriate steps to keep it secure. Should there be a data breach, hackers would have access to an enormous quantity of personal data. Therefore, I suggest that, regardless of how much thought the Government have given this point up to now, the digitisation of these registers should not occur until substantial cybersecurity measures are in place. I look forward to the Minister’s comments.
On Amendment 57, legislation is already in place to ensure the security of electronic registers. Articles 25 and 32 of the UK General Data Protection Regulation impose duties on controllers of personal data to implement appropriate technical and organisational measures, including security measures, so this already applies.
The electronic system has been in place for births and deaths since 2009, and all events have been registered electronically since that date, in parallel with the paper registers and with no loss of data. What is happening with this legislation is that people do not have to keep paper records anymore; it is about the existing electronic system. The noble Lord will remember that it is up to registrars even so, but I think that the idea is that they will no longer have to keep the paper registers as well, which everybody felt was an unnecessary administrative burden.
Nevertheless, the system is subject to Home Office security regulations, and robust measures are in place to protect the data. There has been no loss of data or hacking of that data up to now. Obviously, we need to make sure that the security is kept up to date, but we think that it is a pretty robust system. It is the paper documents that are losing out here.
I thank the Minister. I take the point that this has been ongoing for a while and that, in fact, the security is better because there is less reliance on the paper documents. That said, I am encouraged by her answer and encouraged that the Government continue to anticipate this growing risk and act accordingly. On that basis, I withdraw the amendment.
My Lords, it occurred to me when the noble Lord was speaking that we had lost a valuable member of our Committee. This could not be the noble Lord, Lord Clement-Jones, who was speaking to us just then. It must have been some form of miasma or technical imposition. Maybe his identity has been stolen and not been replaced. Normally, the noble Lord would have arrived with a short but punchy speech that set out in full how the new scheme was to be run, by whom, at what price, what its extent would be and the changes that would result. The Liberal future it may have been, but it was always delightful to listen to. I am sad that all the noble Lord has asked for here is a modest request, which I am sure the noble Baroness will want to jump to and accept, to carry out a review—as if we did not have enough of those.
Seriously, I once used the service that we have been talking about when my father-in-law died, and I found it amazing. It was also one that I stumbled on and did not know about before it happened. Deaths did not happen often enough in my family to make me aware of it. But, like the noble Lord, Lord Clement-Jones, I felt that it should have done much more than what it did, although it was valuable for what it did. It also occurred to me, as life moved on and we produced children, that there would be a good service when introducing a new person—a service to tell you once about that, because the number of tough issues one has to deal with when children are born is also extraordinary and can be annoying, if you miss out on one—particularly with the schooling issues, which are more common these days than they were when my children were being born.
I endorse what was said, and regret that the amendment perhaps did not go further, but I hope that the Minister when she responds will have good news for us.
I thank the noble Lord, Lord Clement-Jones, for raising this, and the noble Lord, Lord Stevenson, for raising the possibility that we are in the presence of a digital avatar of the noble Lord, Lord Clement-Jones. It is a scary thought, indeed.
The amendment requires a review of the operation of the Tell Us Once programme, which seeks to provide a simpler mechanism for citizens to pass information regarding births and deaths to the Government. It considers whether the pioneering progress of Tell Us Once could be extended to non-public sector holders of data. When I read the amendment, I was more cynical than I am now, having heard what the noble Lord, Lord Clement-Jones, had to say. I look forward to hearing the Minister’s answers. I take the point from the noble Lord, Lord Stevenson, that we do not necessarily need another review—but now that I have heard about it, it feels a better suggestion than I thought it was when reading about it.
I worry that expanding this programme to non-public sector holders of data would be a substantial undertaking; it would surely require the Government to hold records of all the non-public sector organisations that have retained and processed an individual’s personal data. First, I am not sure that this would even be possible—or practicable, anyway. Secondly, I am not sure that the level of state surveillance it would entail would be acceptable. I look forward to hearing the Minister’s response but I am on the fence on this one.
(1 year, 2 months ago)
Lords Chamber
My Lords, let me start by repeating the thanks others have offered to the Minister for her ongoing engagement and openness, and to the Bill team for their—I hope ongoing—helpfulness.
Accessing and using data safely is a deeply technical legislative subject. It is, perhaps mysteriously, of interest to few but important to more or less everyone. Before I get started, I will review some of the themes we have been hearing about. Given the hour, I will not go into great detail about most of them, but I think it is worth playing some of them back.
The first thing that grabbed me, which a number of noble Lords brought up, was the concept of data as an asset. I believe the Minister used the phrase “data as DNA”, and that is exactly the right metaphor. Whether data is a sovereign asset or on the balance sheet of a private organisation, that is an incredibly important and helpful way to see it. A number of noble Lords brought this up, including the noble Baroness, Lady Kidron, and the noble Lords, Lord Knight and Lord Stevenson of Balmacara.
I was pleased that my noble friend Lord Lucas brought up the use of AI in hiring, if only because I have a particular bee in my bonnet about this. I have taken to writing far too many grumpy letters to the Financial Times about it. I look forward to engaging with him and others on that.
I was pleased to hear a number of noble Lords raise the issue of the burdens on small business and making sure that those burdens, in support of the crucial goal of protecting privacy, do not become disproportionate relative to the ability of small businesses to execute against them. The noble and learned Lord, Lord Thomas, the noble Lords, Lord Stevenson of Balmacara and Lord Bassam, and my noble friend Lord Markham brought that up very powerfully.
I have cheated by making an enormous group of themes, including ADM, AI and text and data mining—and then I have added Horizon on at the end. It is thematically perhaps a little ambitious, but we are getting into incredibly important areas for the well-being and prosperity of so many people. A great many noble Lords got into this very persuasively and compellingly, and I look forward to a great deal of discussion of those items as we go into Committee.
Needless to say, the importance of adequacy came up, particularly from the noble Lords, Lord Vaux and Lord Bassam, and the noble and learned Lord, Lord Thomas. There is a key question here: have we reduced the risk of loss of adequacy to as close to zero as we can reasonably get, while recognising that it is a decision that is essentially out of our sovereign hands?
A number of noble Lords brought up the very tricky matter of the definition of scientific research—among them the noble Viscount, Lord Colville, my noble friend Lord Bethell and the noble Lords, Lord Davies of Brixton and Lord Freyberg. This is a significant challenge to the effectiveness of the legislation. We all know what we are trying to achieve, but the skill and the art of writing it down is a considerable challenge.
My final theme, just because I so enjoyed the way in which it was expressed by the noble Lord, Lord Knight, is the rediscovery of the joys of a White Paper. That is such an important point—to have the sense of an overall strategy around data and technology as well as around the various Bills that came through in the previous Parliament and will, of course, continue to come now, as these technologies develop so rapidly.
My noble friend Lord Markham started by saying that we on these Benches absolutely welcome the Government’s choice to move forward with so many of the provisions originally set out in the previous Government’s DPDI Bill. That Bill was built around substantial consultation and approved by a range of stakeholders. We are particularly pleased to see the following provisions carried forward. One is the introduction of a national underground asset register. As many others have said, it will not only make construction and repairs more efficient but make them safer for construction workers. Another is giving Ofcom the ability, when notified by the coroner, to demand that online service providers retain data in the event of any child death. I notice the noble Baroness, Lady Kidron, nodding at that—and I am delighted that it remains.
On reforming and modernising the ICO, I absolutely take the point raised by some that this is an area that will take quite considerable questioning and investigation, but overall the thrust of modernising that function is critical to the success of the Bill. We absolutely welcome the introduction of a centralised digital ID verification framework, recognising noble Lords’ concerns about it, of course, and the provisions allowing law enforcement bodies to make greater use of biometric data for counterterrorism purposes.
That said, there are provisions that were in the old DPDI Bill whose removal we regret, many of which we felt would have improved data protection and productivity by offering SMEs in particular greater agency to deal with non-high-risk data in less cumbersome ways while still retaining the highest protections for high-risk data. I very much welcome the views so well expressed by the noble and learned Lord, Lord Thomas of Cwmgiedd, on this matter. As my noble friend Lord Markham put it, this is about being wisely careful but not necessarily hyper-careful in every case. That is at least a way of expressing the necessary balance.
I regret, for example—the noble Lord, Lord Clement-Jones, possibly regrets this less than I do—that the Government have chosen to drop the “vexatious and excessive” standard for subject access requests and revert to “manifestly unfounded or excessive”. The term “vexatious” emerged from extensive consultation and would, among other things, have prevented the use of SARs to circumvent courts’ discovery processes. I am concerned that, by dropping this definition, the Government have missed an opportunity to prevent misuse of the deeply important subject access rights. I hope very much to hear from the Minister how the Government propose to address such practices.
In principle, we do not approve of the Government giving themselves the power to gain greater knowledge of citizens’ activities. Indeed, the Constitution Committee has made it clear that any legislation dealing with data protection must carefully balance the use of personal data by the state for the provision of services and for national security purposes against the right to a private life and freedom of expression. We on these Benches feel that, on the whole, the DPDI Bill maintained the right balance between those two opposing legislative forces. However, we worry that the DUA Bill, if used in conjunction with other powers that have been promised in the fraud, error and debt Bill, would tip too far in favour of government overreach.
Part 1 of the Bill, on customer and business data, contains many regulation-making powers. The noble Viscount, Lord Colville, my noble friend Lord Holmes and the noble Lord, Lord Russell, spoke powerfully about this, and I would like to express three concerns. First, the actual regulations affecting vast quantities of business and personal data are not specified in the Bill; they will be implemented through secondary legislation. Will the Minister give us some more information, when she stands up, about what these regulations may contain? This concern also extends to Part 2, on digital verification services, where in Clause 28,
“The Secretary of State must prepare and publish … rules concerning the provision of digital verification services”.
The Select Committee on the Constitution has suggested that this power should be subject to parliamentary scrutiny. I must say that I am minded to agree.
Secondly, throughout Part 1, regulation-making powers are delegated to both the Secretary of State and the Treasury. This raises several questions. Can the Secretary of State and the Treasury make regulations independently of one another? In the event of a disagreement between these government departments, who has the final say, and what mechanisms exist for resolving it? We would welcome some commentary and explanation from the Minister.
Thirdly, as the Select Committee on the Constitution has rightly pointed out, Clause 133 contains a Henry VIII power. It allows the Secretary of State, by regulations, to make consequential amendments to the provisions made by this Bill. This allows amendments to any
“enactment passed or made before the end of the Session in which this Act is passed”.
Why is this necessary?
The Bill introduces some exciting new terminology, namely “data holder” and data “trader”. Will the Minister tell the House what these terms mean and why they need to coexist alongside the existing terminology of “data processor” and “data controller”? I certainly feel that data legislation is quite complex enough without adding overlapping new terminology if we do not really need it.
I stress once again the concerns rightly raised by my noble friend Lord Markham about NUAR security. Are the Government satisfied that the operational protection of NUAR is sufficient to protect this valuable information from terrorist and criminal threats? More generally, additional cybersecurity measures must be implemented to protect personal data during this mass digitisation push. Will the Minister tell the House how these necessary security measures will be brought forward?
Finally, as I am sure all noble Lords will recall, the previous Government published a White Paper that set out five principles for AI. As a reminder, those were: safety, security and robustness; appropriate transparency and explainability; fairness; accountability and governance; and contestability and redress. I am minded to table an amendment to Clause 80, requiring those using AI in their automated decision-making process to have due regard for these five principles. I noted with interest that the noble Lord, Lord Stevenson of Balmacara, proposed something very similar but using the Bletchley principles. I am very keen to explore that further, on the grounds that it might be an interesting way of having principles-driven AI inserted into this critical Bill.
In conclusion, we on these Benches are broadly supportive of the Bill. We do, as I have set out, have a few concerns, which I hope the Minister will be willing to listen to.
(1 year, 3 months ago)
Lords Chamber
We are acutely aware of this issue. We know that there is a live ongoing argument about it and we are talking to our colleagues across government to find a way through, but we have not come to a settled view yet.
My Lords, catfishing is, of course, one of the misuses of technology in respect of which AI is rapidly enhancing both the attack and the defence. Does the Minister agree that the most effective, adaptive and future-proof defence against catfishing is actually personal awareness and resilience? If so, can the Minister provide a bit more of an update on the progress made in implementing this crucial media literacy strategy, which will be such an important part of defending us all against these attacks in future?
Ofcom published its latest vision of the media literacy strategy just a couple of months ago, so its implementation is very much in its infancy. The Government very much support it and we will work with Ofcom very closely to roll it out. So Ofcom has a comprehensive media literacy strategy on these issues, but as we all know, schools have to play their part as well: it has to be part of the curriculum. We need to make sure that children are kept safe in that way.
The noble Viscount referred to AI. The rules we have—the Online Safety Act and so on—are tech-neutral in the sense that, even if an image is AI generated, it would still fall foul of that Act; it does not matter whether it is real or someone has created it. Also, action should be taken by the social media companies to take down those images.
(1 year, 3 months ago)
Grand Committee
My Lords, I started my discussion on the previous instrument on a slightly negative note. I want to change gear completely now and say how nice it is to see the first of the SIs relating to the Online Safety Act come forward. I welcome that.
Having said that, may I inquire what the Government’s intention is in relation to the Parkinson rule? I think I am correct in saying that we wish to see in place an informal but constant process by the Government when they bring forward legislation under the Online Safety Act, which would be offered to the standing committees so that they could comment and make advice available to Ministers before the Secretary of State finally approved any such legislation. This would primarily be concerned with the codes of practice, but this is exactly the sort of issue, well exemplified by the noble Baroness, Lady Owen, where there is still some concern about the previous Government’s approach to this Bill.
If I recall correctly, this rule was in one of the later amendments brought in towards the end of the process. Rather unlike the earlier stuff, which was seven years in the making, this was rushed through in rather less than seven weeks as we got to the end of discussions on the Online Safety Bill. To get the deal that we all, across the political parties, hoped would happen, and so that the country would benefit from the best possible Act we could get out of the process, there were a number of quite late changes, including the question about deepfake issues, which was not given quite the scrutiny that it could have had. Of course, we are now seeing discussion and debate on those issues, and it is important that we understand them and the process that the Government will take to try to resolve them.
This question of having consent was hotly debated by those who led on it during the time the Bill was before your Lordships’ House. I felt the arguments very clearly came out in favour of those who argued that the question of consent, as mentioned by the noble Lord, Lord Clement-Jones, really is not relevant to this. The offence is caused by the circulation of material, and the Act should contain powers sufficient for the Secretary of State to be satisfied that Ofcom, in exercising its regulatory functions, has the powers to take down this material where it is illegal.
There are two issues tied up in that. I think all of us who have spoken in this debate are concerned that we have not really got to the end of the discussion on this, and we need to have more. Whether through the Private Member’s Bill that we will hear about in December or otherwise, the Government need to take action on that. They need to consult widely with the committees, both in the Commons and here, to get the best advice. It may well be that we need further debate and discussion in this House to do so.
Having said that, the intention to clarify what exactly is legal lies at the heart of the Online Safety Act. The Act will not work and benefit the country if we go back to the question of legal but harmful. The acid test for how the material is to be treated by those who provide services to this country has to be whether it is legal. If it is illegal, it must be taken down, and there must be powers and action specifically for that to happen. It is unfortunate that, if material is not illegal, it is a matter not for the Government or Parliament but for the companies to ensure that their terms of service allow people to make judgments about whether they put material on their platforms. I hope that still remains the Government’s position. I look forward to hearing the Minister’s response.
My Lords, I shall also start on a positive note and welcome the ongoing focus on online safety. We all aim to make this the safest country in the world in which to be online. The Online Safety Act is the cornerstone of how all of us will continue to pursue this crucial goal. The Act imposed clear legal responsibilities on social media platforms and tech companies, requiring them actively to monitor and manage the content they host. They are required swiftly to remove illegal content and to take proactive measures to prevent harmful material reaching minors. This reflects the deep commitment that we all share to safeguarding children from the dangers of cyberbullying, explicit content and other online threats.
We must also take particular account of the disproportionate harm that women and girls face online. The trends regarding the online abuse and exploitation that disproportionately affect female users are deeply concerning. Addressing these specific challenges is essential if we are to create a truly safe online environment for everyone.
With respect to the Government’s proposed approach to making sharing intimate images without consent a priority offence under the Online Safety Act, this initiative will require social media companies promptly to remove such content from their platforms. This aims to curb the rise in abuse that has been described as “intolerable”—I think rightly—by the Secretary of State. The intent behind this measure is to prevent generations becoming “desensitised” to the devastating effects of online abuse.
Although this appears to signal a strong stance against online harm, it raises the question of what this designation truly accomplishes in practical terms. I am grateful to the Minister for setting this out so clearly, though I am not sure that I altogether followed the differences between the old offences and the new ones. Sharing intimate images without consent is already illegal under current laws. Can we not say, therefore, that the real issue lies not in the absence of legal provision but in the lack of effective enforcement of existing regulation? We have to ensure that any changes we make do not merely add layers of complexity but genuinely strengthen the protections available to victims and improve the responsiveness of platforms in removing harmful content.
With these thoughts in mind, I offer five questions; I apologise for their number, and I welcome the Minister’s views, whether now or in writing. First, why is it necessary to add the sharing of intimate images to the list of priority offences if such acts are already illegal under existing legislation and, specifically, what additional protections or outcomes are expected? The Minister gave some explanation of this, but I would welcome digging a little deeper into that.
Secondly, where consent is used as a defence against the charge of sharing intimate images, what are the Government’s thoughts on how to protect victims from intrusive cross-examination over details of their sexual history?
Thirdly, with respect to nudification technology, the previous Government argued that any photoreal image was covered by “intimate image abuse”—the noble Lord, Lord Clement-Jones, touched on this issue well. Is there any merit in looking at that again?
Fourthly, I am keen to hear the Government’s views on my noble friend Lady Owen’s Private Member’s Bill on nudification. We look forward to debating that in December.
Fifthly, and lastly, what role can or should parents and educators play in supporting the Act’s objectives? How will the Government engage these groups to promote online safety awareness?
My Lords, I thank noble Lords for their contributions to this debate. This is, as I think all noble Lords who have spoken recognise, a really important issue. It is important that we get this legislation right. We believe that updating the priority offences list with a new intimate image abuse offence is the correct, proportionate and evidence-led approach to tackle this type of content, and that it will provide stronger protections for online users. This update will bring us closer to achieving the commitment made in the Government’s manifesto to strengthening the protection for women and girls online.
I will try to cover all the questions asked. My noble friend Lord Stevenson and the noble Baroness, Lady Owen, asked whether we will review the Act and whether the Act is enough. Our immediate focus is on getting the Online Safety Act implemented quickly and effectively. It was designed to tackle illegal content and protect children; we want those protections in place as soon as possible. Having said that, it is right that the Government continually assess the law’s ability to keep up, especially when technology is moving so fast. We will of course look at how effective the protections are and build on the Online Safety Act, based on the evidence. However, our message to social media companies remains clear: “There is no need to wait. You can and should take immediate action to protect your users from these harms”.
The noble Baroness, Lady Owen, asked what further action we are taking against intimate image abuse and about the taking, rather than the sharing, of intimate images. We are committed to tackling the threat of violence against women and girls in all its forms. We are considering what further legislative measures may be needed to strengthen the law on taking intimate images without consent and on intimate image abuse. This matter is very much on the Government’s agenda at the moment; I hope that we will be able to report some progress to the noble Baroness soon.
The noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Owen, asked whether creating and making intimate image deepfakes will be an offence. The Government’s manifesto included a commitment to banning the creation of sexually explicit deepfakes. This is a priority for the Government. DSIT is working with the Home Office and the Ministry of Justice to identify the most appropriate legislative vehicle for ensuring that those who create these images without consent face the appropriate punishment. The Government are considering options in this space to protect women and girls from malicious uses of these technologies. The new sharing intimate images offence, which will be added to the OSA priority list through this SI, explicitly includes—for the first time—wholly synthetic manufactured images, such as deepfakes, so they will be tackled under the Online Safety Act.
The noble Baroness, Lady Owen, asked about the material that is already out there and the possibility of using a hash database to prevent those intimate images continually being recirculated. We are aware that the technology exists. Strengthening the intimate image abuse provisions in the priority offences list is a necessary first step to tackling this, and we expect Ofcom to consider it in its final draft illegal content codes and guidance and to give more information about both the codes of practice and the further measures that would need to be developed to address this issue.
Several noble Lords—the noble Viscount, Lord Camrose, the noble Lord, Lord Clement-Jones, and my noble friend Lord Stevenson—asked for more details on the new offences. As I tried to set out in my opening statement, the Online Safety Act repeals the offence of disclosing private sexual photographs and films with the intent to cause distress—this comes under Section 33 of the Criminal Justice and Courts Act 2015 and is commonly known as the revenge porn offence—and replaces it with four new offences.
First, there is a base offence of sharing an intimate image without consent, which carries a maximum penalty of six months’ imprisonment. Secondly, there are two specific-intent offences—the first is sharing an intimate image with intent to cause alarm, humiliation or distress; the second is sharing an intimate image for the purpose of obtaining sexual gratification—each of which carries a maximum penalty of two years’ imprisonment to reflect the more serious culpability of someone who acts without consent and with an additional malign intent. Lastly, there is an offence of threatening to share an intimate image, with a maximum penalty of two years’ imprisonment. This offence applies regardless of whether the image is shared.
These offences capture images that show, or appear to show, a person who is nude, partially nude, engaged in toileting or doing something sexual. These offences include the sharing of manufactured or manipulated images, which are referred to as deepfakes. This recognises that sharing intimate images without the consent of the person they show or appear to show is sufficiently wrongful or harmful to warrant criminalisation.
The noble Viscount, Lord Camrose, asked what is so different about these new offences compared to those in the Act. I stress that it is because they are being given priority status, which may not sound like much but gives considerable extra powers under the Act. There will be new powers and new obligations on platforms. The key thing is that all those offences that already exist are being given priority status under the Online Safety Act. There are thousands of things that Ofcom could address, but this is now on the much smaller list of things that will place very specific obligations on the platforms. Ofcom will monitor this and, as I said earlier, companies can be fined huge sums of money if they do not act, so there is a huge obligation on them to follow through on the priority list.
I hope that I have answered all the questions and that noble Lords agree with me on the importance of updating the priority offences in the Online Safety Act. The noble Viscount, Lord Camrose, asked about parents and made an important point. This is not just about an Act; it is about everybody highlighting the fact that these activities are intolerable and offensive not just to the individuals concerned but to everybody in society, and parents have a responsibility, as we all do, to ensure that media literacy is at the heart of the education we carry out formally in schools and informally within the home. The noble Viscount is absolutely right on that, and there is more that we could all do. I commend these regulations to the Committee.
(1 year, 3 months ago)
Grand Committee
My Lords, I begin with a comment that I hope will not be taken badly by either my noble friend the Minister or the large number of civil servants who have been involved in this Bill over the years. Colleagues may recall that the Bill took seven years to pass through the various processes and procedures of Parliament, including initial Green Papers and White Papers and then scrutiny by the Joint Select Committee, of which my noble friend opposite was also a member, and it seems slightly surprising and a bit odd that we are dealing with what appears to be an administrative oversight so late in the process. I do not expect a serious response from the Minister on that, but I wanted to put on the record that we are still very much aware of the fact that legislation has its faults and sometimes needs to be corrected, and we should perhaps be humble in expecting that the material we finally agree in Parliament is indeed the last word on things.
Having said that, I think I follow the noble Lord, Lord Clement-Jones, on this point: the subsequent legal analysis has identified a potential gap in provision, which this instrument tries to tidy up, but in doing so it has left me a bit confused. I simply ask the Minister to confirm, when she responds, that I am reading it correctly. The worry that has been exposed by this subsequent legal analysis is about the sharing of information when Ofcom is using its powers to address issues with the companies with which it has an engagement. Indeed, the whole purpose of the Act is to ensure that companies take on their share of the burden of making sure that it works in practice. There may be a deficiency in terms of what the Secretary of State has separate powers to do, but my confusion is that the Explanatory Memorandum says:
“The Secretary of State has several key functions relating to the implementation of the framework under the”
Online Safety Act. It is obviously sensible, therefore, that the sharing of information that Ofcom gathers is available for that. But is that all the powers of the Secretary of State or only the powers of the Secretary of State in relation to the Online Safety Act? The Explanatory Memorandum says:
“If Ofcom were not able to share business information relating to these areas”—
that is, the areas directly affected by the Online Safety Act—
“there is a risk that implementation and review of the framework could be delayed or ineffective”.
I accept the general point, but, to pick up the point made by the noble Lord, Lord Clement-Jones, is this an open invitation for Ofcom to share with the Secretary of State information that does not relate to its powers under the Online Safety Act, and therefore something for the Secretary of State to take on through a slightly uncertain route? Are there any restrictions on this power as set out in that paper? I could mention other points where it comes up, but I think my point is made.
The noble Lord, Lord Clement-Jones, also touched on the point that this is a power for Ofcom to share with the Secretary of State responsible for Ofcom, which is fair enough, but, as the Explanatory Memorandum points out:
“There are also certain functions relating to definitions conferred on Scottish and Welsh Ministers and Northern Ireland departments”—
presumably now Ministers—which may also be “relevant persons” under the Act, but we are not given much on that, except that
“these are unlikely to require business information for their exercise”.
I would like a bit more assurance on that. Again, that might be something for which the department is not prepared, and I am quite happy to receive a letter on it, but my recollection from the discussions on the Online Safety Bill in this area, particularly in relation to Gaelic, was that there were quite a lot of powers that only Scottish Ministers would be able to exercise. It is therefore quite possible that information about business activities which are not UK-wide in their generality, and therefore not a matter for the Secretary of State, might well be available for Ofcom to share with Scottish Ministers. If it is possible to get some generic points about where that is actually expected to fall, rather than simply saying that it is unlikely to require business information, I would be more satisfied with that.
My Lords, I thank the Minister for setting out this instrument so clearly. It certainly seems to make the necessary relatively simple adjustments to fill an important gap that has been identified. Although I have some questions, I will keep my remarks fairly brief.
I will reflect on the growing importance of both the Online Safety Act and the duty we have placed on Ofcom’s shoulders. The points made by the noble Lord, Lord Clement-Jones, about the long-standing and consequential nature of the creation of Ofcom and the Communications Act were well made in this respect. The necessary complexity and scope of the work of Ofcom, as our online regulator, have far outgrown what I imagine was foreseeable at the time of its creation. We have given it the tasks of developing and enforcing safety standards, as well as issuing guidance and codes of practice that digital services must follow to comply with the Act. Its role includes risk assessment, compliance, monitoring and enforcement, which can of course include issuing fines or mandating changes to how services operate. Its regulatory powers now allow it to respond to emerging online risks, helping to ensure that user-protection measures keep pace with changes in the digital landscape.
In recognising the daily growing risk of online dangers and the consequent burdens on Ofcom, we of course support any measures that bring clarity and simplicity. If left unaddressed, the identified gap here clearly could lead to regulatory inefficiencies and delays in crucial processes that depend on accurate and up-to-date information. For example, setting appropriate fee thresholds for regulated entities requires detailed knowledge of platform compliance and associated risks, which would be challenging to achieve without full data access. During post-implementation reviews, a lack of access to necessary business information could hamper the ability to assess whether the Act is effectively achieving its safety objectives or whether adjustments are needed.
That said, I have some questions, and I hope that, when she rises, the Minister will set out the Government’s thinking on them. My first question very much picks up on the point made—much better than I did—by the noble Lord, Lord Stevenson of Balmacara. It is important to ensure that this instrument does not grant unrestricted access to business information but, rather, limits sharing to specific instances where it is genuinely necessary for the Secretary of State to fulfil their duties under the Act. How will the Government ensure this?
Secondly, safeguards, such as data protection laws and confidentiality obligations under the Communications Act 2003, must be in place to guarantee that any shared information is handled responsibly and securely. Do the Government believe that sufficient safeguards are already in place?
Thirdly, in an environment of rapid technology change, how do the Government plan to keep online safety regulation resilient and adaptive? I look forward to hearing the Government’s views on these questions, but, as I say, we completely welcome any measure that increases clarity and simplicity and makes it easier for Ofcom to be effective.
I thank noble Lords for their valuable contributions to this debate. It goes without saying that the Government are committed to the effective implementation of the Online Safety Act. It is critical that we remove any barriers to that, as we are doing with this statutory instrument.
As noble Lords said—the noble Viscount, Lord Camrose, stressed this—the Online Safety Act has taken on a growing significance in the breadth and depth of its reach. It is very much seen as an important vehicle for delivering the change that the whole of society wants now. It is important that we get this piece of legislation right. For that purpose, this statutory instrument will ensure that Ofcom can co-operate and share online safety information with the Secretary of State where it is appropriate to do so, as was intended during the Act’s development.
On specific questions, all three noble Lords who spoke asked whether the examples given were exclusive or whether there are other areas where powers might be given to the Secretary of State. The examples given are the two areas that are integral to implementation; we have not at this stage identified any further areas. The change made by the instrument allows sharing only for the purposes of fulfilling the Secretary of State’s functions under the Online Safety Act—it does not go any broader than that. I think that answers the question asked by the noble Viscount, Lord Camrose, about whether this meant unlimited access—I assure him that that is not the purpose of this SI.
My noble friend Lord Stevenson asked whether this relates only to the powers under the OSA. Yes, the instrument allows Ofcom to share information it has collected from businesses only for the purposes of fulfilling the Secretary of State’s functions under the Act.
On the question of devolution, the powers of Scottish, Northern Ireland and Welsh Ministers primarily relate to the power to define educational establishments for the purposes of the Schedule 1 exemptions. There are also some provisions under which these Ministers must be consulted, but that is the limit of the powers that those Ministers would have.
I am conscious that I have not answered all the questions asked by the noble Viscount, Lord Camrose, because I could not write that quickly—but I assure him that my officials have made a note of them and, if I have not covered those issues, I will write to him.
I hope that noble Lords agree with me on the importance of implementing the Online Safety Act and ensuring that it can become fully operational as soon as possible. I commend these regulations to the Committee.
(1 year, 4 months ago)
Lords Chamber
The noble Lord raises an important point. Where nudification apps and other material do not come under the remit of the Online Safety Act, we will look at other legislative tools to make sure that all new forms of technology—including AI and its implications for online images—are included in robust legislation, in whatever form it takes. Our priority is to implement the Online Safety Act, but we are also looking at what other tools might be necessary going forward. As the Secretary of State has said, this is an iterative process; the Online Safety Act is not the end of the game. We are looking at what further steps we need to take, and I hope the noble Lord will bear with us.
What is the Government’s assessment of the technical difficulties behind requiring pornography sites and others to implement age-verification services?
(1 year, 5 months ago)
Lords Chamber
I join other noble Lords in thanking the noble Lord, Lord Redesdale, for bringing this hugely important topic to us today. I thank all noble Lords who have spoken so well and so fascinatingly in this debate. I appreciated the horrifying example offered by the noble Lord, Lord Berkeley, of the fire on board a ship started by an electric vehicle. My noble friend Lord Holmes was on the money, as ever, in calling for an overall governmental battery strategy. The noble Baroness, Lady Finlay, and the noble Lord, Lord Foster, also stressed the importance of working with the online marketplaces, as they are a really dangerous source of some of these unsafe items.
The noble Lord, Lord Winston, in addition to giving an—in my case, rather overdue—erudite chemistry lesson about batteries, made a very important point about the role that education can play in driving the safety of batteries. The noble Baroness, Lady Brinton, powerfully supported this argument with accounts of her own. I was struck also by the call of the noble Earl, Lord Erroll, for mandating specialist fire extinguishers; that is a very interesting idea.
I am also grateful to the London Fire Brigade for its comprehensive briefing, and to the other groups which gave us briefings. Like, I think, almost every other speaker, I latched on to the statistic of 143 e-bike fires—a fire every two days—in London, resulting in three deaths and 60 injuries. It is a very powerful statistic, and we should really take note of it. I will not talk more about the importance of safety because other noble Lords have made this point so clearly and so well.
I turn to the market for lithium-ion batteries. The global market is expected to grow from $56.8 billion last year to $187 billion in 2032. I hope and think that the UK has a significant role to play in the safe development of this huge and hugely important industry. Without it, we will not realise our ambitions for electric vehicles; renewable energy and storage; tech innovation of all kinds; environmental and productivity improvements to manufacturing and supply chains; the circular economy and recycling; and a range of export opportunities. So we on these Benches absolutely support the Bill’s goals. It is essential for both safety and growth that lithium-ion batteries are safe.
That said, we need to be satisfied on a range of questions about the Bill’s workability, effectiveness and proportionality. First, how does this work alongside both existing and planned legislation? As others have raised, I ask the Minister to give the Government’s assessment of the existing product safety laws. I believe that lithium-ion batteries are already subject to the Electrical Equipment (Safety) Regulations 2016. Are these regulations inadequate, or is there an issue of enforcement? The Bill proposes a role for conformity assessment bodies, and I would welcome more clarity on the role of these CABs, as opposed to the OPSS and the local authority trading standards, both of which have relevant enforcement powers. Like other noble Lords, I look forward to hearing more about the Government’s Product Regulation and Metrology Bill, particularly with respect to lithium-ion battery safety. I would certainly welcome a sense from the Minister of the overlap between the two Bills.
Secondly, we are not, of course, the only nation wrestling with battery safety. I would welcome the chance to understand the Government’s view of the international context: which countries have implemented the most effective regulatory systems, and what can we learn from them? I was interested to hear briefly from the noble Baroness, Lady Finlay—I am sure she has more to say on the subject—about some of the regulations in New York in this respect.
Thirdly, how do we ensure that these regulations are proportionate? A battery energy storage system, or BESS, can be anything from an enormous industrial site to a domestic appliance. I would welcome the views of the noble Lord, Lord Redesdale, on the applicability of the same regulation and the same enforcement bodies to these very different participants in the marketplace. Equally, will it really be necessary or achievable for a proposed BESS to consult separately with three different public bodies in seeking approval? Is this an appropriate way to achieve our growth aims?
Finally, is it appropriate to focus solely on lithium-ion batteries? A number of noble Lords raised this, particularly the noble Lords, Lord Winston and Lord Holmes, who spoke compellingly on the different chemistries of other batteries and how they may also form part of this legislation.
Lithium-ion batteries are important, but their safety clearly needs to improve. As I have set out, we have some questions about the approach taken in the Bill, and I look forward to hearing from both the Minister and the noble Lord, Lord Redesdale.
(1 year, 6 months ago)
Lords Chamber
The noble Lord is right that there are issues around the risks in the way he has spelled out. There are still problems around the accuracy of some AI systems. We are determined to push forward to protect people from those risks, while recognising the enormous benefits that come from introducing AI. The noble Lord will know, I am sure, that it has a number of positive benefits in areas such as the health service, diagnosing patients more quickly—for example, AI can detect up to 13% more breast cancers than humans can. So there are huge advantages, but we must make sure that whatever systems are in place are properly regulated and that the risks are factored into that. Again, that will be an issue we will debate in more detail when the draft legislation comes before us.
My Lords, let me start by warmly welcoming the Minister to her new, richly deserved Front-Bench post. I know that she will find the job fascinating. I suspect she will find it rather demanding as well, but I look forward to working with her.
I have noted with great interest the Government’s argument that more AI-specific regulation will encourage more investment in AI in the country. That would be most welcome, but what do the Government make of the enormous difference between AI investment to date in the UK versus in the countries of the European Union subject to the AI Act? In the same vein, what do the Government make of Meta’s announcement last week that it is pausing some of its AI training activities because of the cumbersome and not always very clear regulation that is part of the AI Act?
Again, I thank the noble Viscount for his good wishes and welcome him to his new role. He is right to raise the comparison and, while the EU has introduced comprehensive legislation, we instead want to bring forward highly targeted legislation that focuses on the safety risks posed by the most powerful models. We are of course committed to working closely with the EU on AI and we believe that co-ordinating with international partners —the EU, the US and other global allies—is critical to making sure that these measures are effective.