Data (Use and Access) Bill [HL] Debate
Lord Russell of Liverpool (Crossbench - Excepted Hereditary), debates with the Department for Business and Trade
(1 month ago)
Lords Chamber

My Lords, I thank the noble Lord, Lord Lucas, for injecting a bit of reality into this discussion. I declare my interest as a governor of Coram. I thank the noble Lord very much for his comments about the Cross Benches. Perhaps if we let HOLAC choose the Cross Benches and Copilot the political appointees, we might be slightly more successful than we have been in recent years.
I am conscious that I am the only person between your Lordships and the joys of the Front-Bench spokespeople, so I shall not take too long. I welcome the Bill, but I have some concerns. In particular, in reading through the introduction on page 1 line by line, I counted 12 instances of the Bill being “to make provision” and not a single specific mention of protection, which I feel is perhaps a slight imbalance.
I have six areas of concern that I suspect I and many others will want to explore in Committee. I am not going to dive into the detail at this stage, because I do not think it is appropriate.
Like many noble Lords, including the noble Lords, Lord Knight, Lord Vaux and Lord Bassam, and the noble and learned Lord, Lord Thomas, I have some concerns about the extension of the UK’s data adequacy status beyond June next year. Given that one of the key objectives of this Bill is to help economic growth, it is incredibly important that that happens smoothly. It is my hope and expectation that His Majesty’s new Government will find it slightly less painful and more straightforward to talk to some of our colleagues across the water in the EU to try to understand what each side is thinking and to ease the way of making that happen.
Secondly, like many noble Lords, I have a lot of concern about what is rather inelegantly known as “web scraping”—a term that sounds to me rather like an unpleasant skin rash—of intellectual property and its sheer disregard for IP rights, so undermining the core elements of copyright and the value of unique creative endeavour. It is deeply distasteful and very harmful.
One area that I hope we will move towards in Committee is the different departments of His Majesty’s Government that have an interest in different parts of this Bill consciously working together. In the case of web scraping, I think the engagement of Sir Chris Bryant in his dual role as Minister of State for Data Protection and Minister for Creative Industries will be very important. I hope and expect that the Minister and her colleagues will be able to communicate as directly as possible and have a sort of interplay between us in Committee and that department to make sure that we are getting the answers that we need. It is frankly unfair on the Minister, given that she is covering the ground of I-do-not-know-how-many Ministers down the other end, for her also to take on board the interests, concerns and views of other departments, so I hope that we can find a way of managing that in an integrated way.
Thirdly, like my noble friend Lady Kidron, I am appalled by AI-produced child sexual abuse material. To that extent, I make a direct request to the Minister and the Bill team that they read an article published on 18 October in the American magazine The Atlantic by Caroline Mimbs Nyce, “The Age of AI Child Abuse Is Here”. She writes about what is happening in the USA, but it is unpleasantly prescient. She writes,
“child-safety advocates have warned repeatedly that generative AI is now being widely used to create sexually abusive imagery of real children”—
not AI avatars—
“a problem that has surfaced in schools across the country”.
It is certainly already happening here, and it will only accelerate. Some of your Lordships may have read about a scandal that has emerged in South Korea. My daughter-in-law is South Korean. AI-created adult sexual material has caused major trauma and a major decline in female mental health there. In addition, when it comes to schools, there are real concerns about the ability of companies to scoop up photographic information about children from photos that schools have on their own websites or Facebook pages. Those images can then potentially be used for a variety of very unpleasant reasons, so I think that is an area which we would want to look at very carefully.
Fourthly, there are real concerns about the pervasive spread of educational technology—edtech, as it is known informally—driven, understandably, by commercial rather than educational ambition in many cases. We need to ensure that the age-appropriate design code applies to edtech and that is something we should explore. We need to prioritise the creation of a code of practice for edtech. We know of many instances where children’s data has been collected in situations where the educational establishments themselves, although they are charged with safeguarding, are wholly inadequate in trying to do it, partly because they do not really understand it and partly because they do not necessarily have the expertise to do it. It is unacceptable that children in school, a place that should be a place of safety, are inadvertently exposed to potential harm because schools do not have the power, resources and knowledge to protect the children for whom they are responsible. We need to think carefully about what we need to do to enhance their ability to do that.
On my fifth concern, the noble Lords, Lord Stevenson and Lord Holmes, made a very good point, in part about the Bletchley declaration. It would be helpful for us as a country and certainly as Houses of Parliament to have some idea of where the Government think we are going. I understand that the new Government are relatively recently into their first term and are somewhat cautious about saying too much about areas that they might subsequently regret, but I think there is a real appetite for a declaratory vision with a bit of flesh on it. We all understand that it might need to change as AI, in particular, threatens to overtake it, but having a stab at saying where we are, what we are doing, why we are doing it, the direction of travel and how we will modify it as we go along, as we inevitably will because of AI, would be helpful and, frankly, reassuring.
Lastly, during the passage of the Online Safety Bill, many of us tried to make the case for a Joint Committee to oversee digital regulation and the regulators themselves. I think it would be fair to say that the experience of those of us who were particularly closely involved with what is now the Online Safety Act and the interactions that we have had formally or informally with the regulator since then, and the frustrations that have emerged from those interactions, have demonstrated the value of having a joint statutory committee with all the powers that it would have to oversee and, frankly, to call people to account. It would really concentrate minds and make the implementation of that Act, and potentially this Act, more streamlined, more rapid and more effective. It could be fine-tuned thereafter much more effectively, in particular if we are passing a Bill that I and my fellow members of the Secondary Legislation Scrutiny Committee will have the joy of looking at in the form of statutory instruments. Apart from anything else, having a Joint Committee keep a close watch on the flow of statutory instruments would be enormously helpful.
As we are dealing with areas which are in departments that are not immediately within the remit of the Minister, such as the Department for Education given what I was talking about with schools, anything we can do to make it clear that the left hand knows what the right hand is doing would be extraordinarily helpful. I think there have been precedents in particular cases in Committee when we are dealing with quite detailed amendments for colleagues from other departments to sit on the Bench alongside the Minister to provide real departmental input. That might be a course that we could fruitfully follow, and I would certainly support it.
Data (Use and Access) Bill [HL] Debate
(1 week, 6 days ago)
Grand Committee

My Lords, I put my name to the amendments from the noble Baroness, Lady Kidron, and will briefly support them. I state my interest as a governor of Coram, the children’s charity. One gets a strong sense of déjà vu with this Bill. It takes me back to the Online Safety Bill and the Victims and Prisoners Bill, where we spent an inordinate amount of time trying to persuade the Government that children are children and need to be treated as children, not as adults. That was hard work. They have an absolute right to be protected and to be treated differently.
I ask the Minister to spend some time, particularly when her cold is better, with some of her colleagues whom we worked alongside during the passage of those Bills in trying to persuade the then Government of the importance of children being specifically recognised and having specific safeguards. If she has time to talk to the noble Lords, Lord Ponsonby, Lord Stevenson and Lord Knight, and the noble Baroness, Lady Thornton —when she comes out of hospital, which I hope will be soon—she will have chapter, book and verse about the arguments we used, which I hope we will not have to rehearse yet again in the passage of this Bill. I ask her please to take the time to learn from that.
As the noble Baroness said, what is fundamental is not what is hinted at or implied at the Dispatch Box, but what is actually in the Bill. When it is in the Bill, you cannot wriggle out of it—it is clearly there, stating what it is there for, and it is not open to clever legal interpretation. In a sense, we are trying to future-proof the Bill by, importantly, as she said, focusing on outcomes. If you do so, you are much nearer to future-proofing than if you focus on processes, which by their very nature will be out of date by the time you have managed to understand what they are there to do.
Amendment 135 is important because the current so-called safeguard for the Information Commissioner to look after the interests of children is woefully inadequate. One proposed new section in Clause 90 talks of
“the fact that children may be less aware of the risks and consequences associated with processing of personal data and of their rights in relation to such processing”.
It is not just children; most adults do not have a clue about any of that, so to expect children to have even the remotest idea is just a non-starter. To add insult to injury, that new section begins
“the Commissioner must have regard to such of the following”—
of which the part about children is one—
“as appear to the Commissioner to be relevant in the circumstances”.
That is about as vague and weaselly as it is possible to imagine. It is not adequate in any way, shape or form.
In all conscience, I hope that will be looked at very carefully. The idea that the commissioner might in certain circumstances deem that the status and importance of children is not relevant is staggering. I cannot imagine a circumstance in which that would be the case. Again, what is in the Bill really matters.
On Amendment 94, not exempting the provision of information regarding the processing of children’s data is self-evidently extremely important. On Amendment 82, ring-fencing children’s data from being used by a controller for a different purpose again seems a no-brainer.
Amendment 196, as the noble Lord, Lord Clement-Jones, says, is a probing amendment. It seems eminently sensible when creating Acts of Parliament that in some senses overlap, particularly in the digital and online world, that the left hand should know what the right hand is doing and how two Acts may be having an effect on one another, perhaps not in ways that had been understood or foreseen when the legislation was put forward. We are looking for consistency, clarity, future-proofing and a concentration on outputs, not processes. First and foremost, we are looking for the recognition, which we fought for so hard and finally got, that children are children and need to be recognised and treated as children.
My Lords, I think we sometimes forget, because the results are often so spectacular, the hard work that has had to happen over the years to get us to where we are, particularly in relation to the Online Safety Act. It is well exemplified by the previous speaker. He put his finger on the right spot in saying that we all owe considerable respect for the work of the noble Baroness, Lady Kidron, and others. I helped a little along the way. It is extraordinary to feel that so much of this could be washed away if the Bill goes forward in its present form. I give notice that I intend to work with my colleagues on this issue because this Bill is in serious need of revision. These amendments are part of that and may need to be amplified in later stages.
I managed to sign only two of the amendments in this group. I am sorry that I did not sign the others, because they are also important. I apologise to the noble Lord, Lord Clement-Jones, for not spotting them early enough to be able to do so. I will speak to the ones I have signed, Amendments 88 and 135. I hope that the Minister will give us some hope that we will be able to see some movement on this.
The noble Lord, Lord Russell, mentioned the way in which the wording on page 113 seems not only to miss the point but to devalue the possibility of seeing protections for children well placed in the legislation. New Clause 120B(e), which talks of
“the fact that children may be less aware of the risks and consequences associated with processing of personal data and of their rights in relation to such processing”,
almost says it all for me. I do not understand how that could possibly have got through the process by which this came forward, but it seems to speak to a lack of communication between parts of government that I hoped this new Government, with their energy, would have been able to overcome. It speaks to the fact that we need to keep an eye on both sides of the equation: what is happening in the online safety world and how data that is under the control of others, not necessarily those same companies, will be processed in support or otherwise of those who might wish to behave in an improper or illegal way towards children.
At the very least, what is in these amendments needs to be brought into the Bill. In fact, other additions may need to be made. I shall certainly keep my eye on it.
Data (Use and Access) Bill [HL] Debate
(1 week ago)
Grand Committee

My Lords, I was not going to rise at all for the moment because there are other amendments coming later that are of interest. I declare my rather unusual interest: I was one of the architects of the GDPR in Brussels.
I rise to support Amendment 211A in the name of my noble friend Lord Holmes because here we are referring to AI. I know that other remarks have now been passed on this matter, which we will come to later, but it seems to me—this has come straight into my mind—that, when the preparation of the data legislation and the GDPR was being undertaken, we really did fail at that stage to accommodate the vast and important areas that AI brings to the party, as it were. We will fail again, I suspect, if we are not careful, in this piece of legislation. AI is with us now and moving at an enormous pace—faster than any legislator can ever manage to keep up with in order to control it and to make sure that there are sufficient protections in place against both the misuse of this technology and the ways in which it may develop. So I support this amendment, particularly in relation to the trading or use of likenesses and the algorithmic effects that come about.
We will deal with that matter later, but I hope that the Minister will touch on this, particularly having heard the remarks of my noble friend Lord Holmes—and, indeed, the remarks of my noble friend Lady Harding a moment ago—because AI is missing. It was missing in the GDPR to a large extent. It is in the European Union’s new approach and its regulations on AI, but the EU has already shown that it has enormous difficulties in trying to offer, at one stage, control as well as redress and the proper involvement of human beings and individual citizens.
My Lords, I rise briefly to support my noble friend Lady Kidron on Amendment 137. The final comments from the noble and learned Lord, Lord Thomas, in our debate on the previous group were very apposite. We are dealing with a rapidly evolving and complex landscape, which AI is driving at warp speed. It seems absolutely fundamental that, given the panoply of different responsibilities and the level of detail that the different regulators are being asked to cover, there is on the face of what they have to do with children absolute clarity in terms of a code of practice, a code of conduct, a description of the types of outcomes that will be acceptable and a description of the types of outcomes that will be not only unacceptable but illegal. The clearer that is in the Bill, the more it will do something to future-proof the direction in which regulators will have to travel. If we are clear about what the outcomes need to be in terms of the welfare, well-being and mental health of children, that will give us some guidelines to work within as the world evolves so quickly.
My Lords, I have co-signed Amendment 137. I do not need to repeat the arguments that have already been made by those who have spoken before me on it; they were well made, as usual. Again, it seems to expose a gap in where the Government are coming from in this area of activity, which should be at the forefront of all that they do but does not appear to be so.
As has just been said, this may be as simple as putting in an initial clause right up at the front of the Bill. Of course, that reminds me of the battle royal we had with the then Online Safety Bill in trying to get up front anything that made more sense of the Bill. It was another beast that was difficult to ingest, let alone understand, when we came to make amendments and bring forward discussions about it.
My frustration is that we are again talking about stuff that should have been well inside the thinking of those responsible for drafting the Bill. I do not understand why a lot of what has been said today has not already appeared in the planning for the Bill, and I do not think we will get very far by sending amendments back and forward that say the same thing again and again: we will only get the response that this is all dealt with and we should not be so trivial about it. Could we please have a meeting where we get around the table and try and hammer out exactly what it is that we see as deficient in the Bill, to set out very clearly for Ministers where we have red lines—that will make it very easy for them to understand whether they are going to meet them or not—and do it quickly?
Data (Use and Access) Bill [HL] Debate
(5 days, 14 hours ago)
Grand Committee

I shall speak very briefly, because the previous three speakers have covered the ground extremely well and made some extremely powerful arguments.
The noble Baroness, Lady Kidron, put her finger on it. The default position of departments such as the DfE, if they recognise there is a problem, is to issue guidance. Schools are drowning in guidance. If you talk to any headmaster or headmistress or the staff in charge of technology and trying to keep on top of it, they are drowning in guidance. They are basically flying blind when being asked to take some quite major decisions, whether it is about purchasing or the safeguards around usage or about measuring the effectiveness of some of the educational technology skills that are being acquired.
There is a significant difference between guidance and a clear and concrete code. We were talking the other day, on another group, about the need to have guardrails, boundaries and clarity. We need clarity for schools and for the educational technology companies themselves to know precisely what they can and cannot do. We come back again to the issue of the necessity of measuring outcomes, not just processes and inputs, because they are constantly changing. It is very important for the companies themselves to have clear guardrails.
The research to which the noble Baroness, Lady Kidron, referred, which is being done by a variety of organisations, found problems in the areas that we are talking about in this country, the United States, Iceland, Denmark, Sweden, the Netherlands, Germany and France—and that is just scratching the surface. Things are moving very quickly and AI is accelerating that even more. With a code you are drawing a line in the sand and declaring very clearly what you expect and do not expect, what is permissible and not permissible. Guidance is simply not sufficient.
My Lords, I make a brief intervention. I am not against these amendments —they are very useful in the context of the Bill. However, I am reflecting on the fact that, when we drafted GDPR, we took a six-year process and failed in the course of doing so to really accommodate AI, which keeps popping up every so often in this Bill. Every part of every amendment seems to have a new subsection referring to automated decisions or to AI generally.
Obviously, we are moving on to have legislation in due course on AI and I am sure that a number of pieces of legislation, including no doubt this one, will be able to be used as part of our overall package when we deal with the regulation of AI. However, although it is true that the UK GDPR gives, in theory, a higher standard of protection for children, it is important to consider that, in the context of AI, the protections that we need are going to have to be much greater—we know that. But if there is going to be a code of practice for children and educational areas, we also need to consider vulnerable and disabled people and other categories of people who are equally entitled to have some help, particularly with regard to the AI elements. That is going to be very difficult. Most adults whom I know know less about AI than children approaching the age of 18, who are much more knowledgeable. Those children are also more knowledgeable about the restrictions that will have to be put in place than are adults, who appear to be completely at sea, not even understanding what AI is about.
I make a precautionary point. We should be very careful, while we have AI dotted all the way through this, that when we specify a particular element—in this case, for children—we must be aware of the need to have protection in place for other groups, particularly in the context of this Bill and, indeed, future legislation.
My Lords, I will speak to my Amendments 198A and 198C to 198F. I also support Amendments 197, 198 and 198B, to which I have added my name, all of which address the issue of data for researchers.
As was put very thoughtfully by the noble Baroness, Lady Kidron, platforms are not making decisions about their services with due regard to product safety or with independent oversight. Ofcom’s work enforcing the Online Safety Act will go some way towards shifting platforms towards accountability, but it makes no provision at the moment for researchers’ data access, despite civil society and academic researchers being at the forefront of highlighting online harms for a decade. The anecdotes that the noble Baroness just gave were powerful testimony to the importance of that. We are, in fact, flying completely blind, making policy and, in this Room, legislation without data, facts and insight about the platforms and algorithms that we seek to address. Were it not for the whistleblowers, we would not have anything to go on, and we cannot rely on whistleblowers to guide our hands.
Rectifying this omission is in the Bill, and I am enormously grateful to the Minister and to my noble friend Lord Camrose for his role in putting it there. It is particularly important because the situation with data for researchers has deteriorated considerably, even in the last 18 months—with Meta shutting CrowdTangle and X restricting researchers’ access to its API. The noble Baroness, Lady Kidron, spoke about what the whistleblowers think, and they think that this is going to get a lot worse in the future.
I welcome the inclusion of these provisions in the Bill. They will be totally transformational to this sector, bringing a level of access to serious analysts and academics, so we can better understand the impact of the digital world, for both good and bad. A good example of the importance of robust research to inform policy-making was the Secretary of State’s recent announcement that the Government were launching a
“research project to explore the impact of social media on young people’s wellbeing and mental health”.—[Official Report, Commons, 20/11/24; col. 250.]
That project will not be very effective if the researchers cannot access the data, so I very much hope that these provisions will be enforced before they start spending money on that.
To be effective and to have the desired effect, we need to ensure that the data for researchers regime, as described in the Bill, is truly effective and cannot be easily brushed off. That is why the Government need to accept the amendments in this group: to bring some clarity and to close loopholes in the scheme as it is outlined in the Bill.
I will briefly summarise the provisions in the amendments in my name. First, we need to make researcher access regulations enforceable in the same way as other requirements in the Online Safety Act. The enforcement provisions in that Act were strengthened considerably as it passed through this House, and I believe that the measures for data for researchers need to be given the same rocket boosters. Amendment 198D will mean that regulated services will be required to adhere to the regime and give Ofcom the power to levy proper remedial action if regulated services are obfuscating or non-compliant.
Secondly, we need to ensure that any contractual provision of use, such as a platform’s terms of service, is unenforceable if it would prevent
“research into online safety matters”,
as defined in the regulations. This is an important loophole that needs to be closed. It will protect UK researchers carrying out public interest research from nefarious litigation over terms of service violations as platforms seek to obfuscate access to data. We have seen this practice in other areas.
Thirdly, we need to clarify that researchers carrying out applicable research into online safety matters in the UK will be able to access information under the regime, regardless of where they are located. This is a basic point. Amendment 198E would bring the regime in line with the Digital Services Act of the EU and allow the world’s best researchers to study potential harm to UK users.
Ensuring robust researcher access to data contributes to a great ecosystem of investigation and scrutiny that will help to enforce an effective application of the law, while also guarding against overreach in terms of moderating speech. It is time to back UK civil society and academic researchers to ensure that policy-making and regulatory enforcement is as informed as possible. That is why I ask the Minister to support these measures.
My Lords, I will speak briefly. I added my name in support of Amendments 197 and 198, tabled by the noble Baroness, Lady Kidron. We do not need to rehearse the arguments as to why children are a distinct group who need to be looked at in a distinctive way, so I will not repeat those arguments.
I turn to the excellent points made in the amendments in the name of the noble Lord, Lord Bethell. Data access for researchers is fundamental. The problem with statutory bodies, regulators and departments of state is that they are not designed and set up to be experts in researching some of the more arcane areas in which these algorithms are developed. This is leading-edge stuff. The employees in these platforms—the people who are designing and tweaking these very clever algorithms—are coming from precisely the academic and research institutions that are best placed to go into those companies and find out what they are doing. In many cases, it is their own graduates and PhDs who are doing it. They are the best qualified people to look at what is going on, because they will understand what is going on. If somebody tries to obfuscate, they will see through them immediately, because they can understand that highly sophisticated language.
If we do not allow this, we will be in the deeply uncomfortable position of relying on brave people such as Frances Haugen to run the huge reputational, employability and financial risks of becoming a whistleblower. A whistleblower who takes on one of those huge platforms that has been employing them is a very brave person indeed. I would feel distinctly uncomfortable if I thought that we were trying to guard our citizens, and particularly our children, against what some of these algorithms are trying to do by relying on the good wishes and chances of a whistleblower showing us what was going on. I support all these amendments very strongly.
My Lords, before we proceed, I draw to the attention of the Committee that we have a hard stop at 8.45 pm and we have committed to try to finish the Bill this evening. Could noble Lords please speak quickly and, if possible, concisely?
My Lords, I support my noble friend Lady Kidron’s Amendment 211, to which I have put my name. I speak not as a technophobe but as a card-carrying technophile. I declare an interest as, for the past 15 years, I have been involved in the development of algorithms to analyse NHS data, mostly from acute NHS trusts. This is possible under current regulations, because all the research projects have received medical research ethics approval, and I hold an honorary contract with the local NHS trust.
This amendment is, in effect, designed to scale up existing provisions and make sure that they are applied to public sector data sources such as NHS data. By classifying such data as sovereign data assets, it would be possible to make it available not only to individual researchers but to industry—UK-based SMEs and pharmaceutical and big tech companies—under controlled conditions. One of these conditions, as indicated by proposed new subsection (6), is to require a business model where income is generated for the relevant UK government department from access fees paid by authorised licence holders. Each government department should ensure that the public sector data it transfers to the national data library is classified as a sovereign data asset, which can then be accessed securely through APIs acting
“as bridges between each sovereign data asset and the client software of the authorized licence holders”.
In the time available, I will consider the Department of Health and Social Care. The report of the Sudlow review, Uniting the UK’s Health Data: A Huge Opportunity for Society, published last month, sets out what could be achieved through linking multiple NHS data sources. The Academy of Medical Sciences has fully endorsed the report:
“The Sudlow recommendations can make the UK’s health data a truly national asset, improving both patient care and driving economic development”.
There is little difference, if any, between health data being “a truly national asset” and “a sovereign asset”.
Generative AI has the potential to extract clinical value from linked datasets in the various secure data environments within the NHS and to deliver a step change in patient care. It also has the potential to deliver economic value, as the application of AI models to these rich, multimodal datasets will lead to innovative software products being developed for early diagnosis and personalised treatment.
However, it seems that the rush to generate economic value is preceding the establishment of a transparent licensing system, as in proposed new subsection (3), and the setting up of a coherent business model, as in proposed new subsection (6). As my noble friend Lady Kidron pointed out, the provisions in this amendment are urgently needed, especially as the chief data and analytics officer at NHS England is reported as having said, at a recent event organised by the Health Service Journal and IBM, that the national federated data platform will soon be used to train different types of AI model. The two models mentioned in the speech were OpenAI’s proprietary ChatGPT model and Google’s medical AI, which is based on its proprietary large language model, Gemini. So, the patient data in the national federated data platform being built by Palantir, which is a US company, is, in effect, being made available to fine-tune large language models pretrained by OpenAI and Google—two big US tech companies.
As a recent editorial in the British Medical Journal argued:
“This risks leaving the NHS vulnerable to exploitation by private technology companies whose offers to ‘assist’ with infrastructure development could result in loss of control over valuable public assets”.
It is vital for the health of the UK public sector that there is no loss of control resulting from premature agreements with big tech companies. These US companies seek privileged access to highly valuable assets which consist of personal data collected from UK citizens. The Government must, as a high priority, determine the rules for access to these sovereign data assets along the lines outlined in this amendment. I urge the Minister to take on board both the aims and the practicalities of this amendment before any damaging loss of control.
My Lords, before we move on to the next group, I again remind noble Lords that we have in fact only two groups to get through because Amendment 212 will not be moved. We have about 25 minutes to get through those two groups.
Amendment 211B
That concludes the Committee’s proceedings on the Bill. I thank all noble Lords who have participated for being so co-operative.