Data (Use and Access) Bill [HL] Debate
Lord Russell of Liverpool (Crossbench - Excepted Hereditary)
Grand Committee

I shall speak very briefly, because the previous three speakers have covered the ground extremely well and made some extremely powerful arguments.
The noble Baroness, Lady Kidron, put her finger on it. The default position of departments such as the DfE, when they recognise that there is a problem, is to issue guidance. Schools are drowning in it. Talk to any headmaster or headmistress, or to the staff charged with keeping on top of technology, and they will tell you as much. They are basically flying blind when asked to take some quite major decisions, whether about purchasing, the safeguards around usage, or measuring the effectiveness of the educational technology skills being acquired.
There is a significant difference between guidance and a clear, concrete code. We were talking the other day, on another group, about the need for guardrails, boundaries and clarity. We need clarity for schools, and for the educational technology companies themselves, so that they know precisely what they can and cannot do. We come back again to the necessity of measuring outcomes, not just processes and inputs, because those are constantly changing. It is very important for the companies themselves to have clear guardrails.
The research to which the noble Baroness, Lady Kidron, referred, carried out by a variety of organisations, found problems in the areas we are talking about in this country, the United States, Iceland, Denmark, Sweden, the Netherlands, Germany and France, and that is just scratching the surface. Things are moving very quickly, and AI is accelerating them further. With a code, you draw a line in the sand and declare very clearly what you expect and do not expect, what is permissible and what is not. Guidance is simply not sufficient.
My Lords, I make a brief intervention. I am not against these amendments; they are very useful in the context of the Bill. However, I am reflecting on the fact that, when we drafted the GDPR, the process took six years and we failed in the course of it to really accommodate AI, which keeps popping up throughout this Bill. Every part of every amendment seems to have a new subsection referring to automated decisions or to AI generally.
Obviously, we will in due course have legislation on AI, and I am sure that a number of pieces of legislation, no doubt including this one, will form part of our overall package when we come to regulate AI. It is true that the UK GDPR gives, in theory, a higher standard of protection to children, but we know that, in the context of AI, the protections we need will have to be much greater. If there is to be a code of practice for children and educational settings, we also need to consider vulnerable people, disabled people and other categories of people who are equally entitled to help, particularly with regard to the AI elements. That will be very difficult. Most adults I know understand less about AI than children approaching the age of 18, who are much more knowledgeable. Those children are also more aware of the restrictions that will have to be put in place than adults, who appear to be completely at sea, not even understanding what AI is about.
I make a precautionary point. With AI dotted all the way through this Bill, we should be very careful that, when we specify protections for a particular group, in this case children, we remain aware of the need to have protections in place for other groups, both in the context of this Bill and, indeed, of future legislation.
My Lords, I will speak to my Amendments 198A and 198C to 198F. I also support Amendments 197, 198 and 198B, to which I have added my name, all of which address the issue of data for researchers.
As the noble Baroness, Lady Kidron, put it very thoughtfully, platforms are not making decisions about their services with due regard to product safety or with independent oversight. Ofcom's work enforcing the Online Safety Act will, in some part, shift things significantly towards accountability, but it currently makes no provision for researchers' data access, despite civil society and academic researchers having been at the forefront of highlighting online harms for a decade. The anecdotes that the noble Baroness just gave were powerful testimony to the importance of that. We are, in fact, flying completely blind: making policy, and in this Room legislation, without data, facts or insight into the performance of the algorithms that we seek to address. Were it not for whistleblowers, we would have nothing to go on, and we cannot rely on whistleblowers to guide our hands.
Rectifying this omission is in the Bill, and I am enormously grateful to the Minister, and to my noble friend Lord Camrose, for putting it there. It is particularly important because the situation with data for researchers has deteriorated considerably, even in the past 18 months, with Meta shutting CrowdTangle and X restricting researchers' access to its API. The noble Baroness, Lady Kidron, spoke about what the whistleblowers think: they believe that this is going to get a lot worse.
I welcome the inclusion of these provisions in the Bill. They will be transformational for this sector, bringing a level of access to serious analysts and academics so that we can better understand the impact of the digital world, for both good and ill. A good example of the importance of robust research in informing policy-making was the Secretary of State's recent announcement that the Government were launching a
“research project to explore the impact of social media on young people’s wellbeing and mental health”.—[Official Report, Commons, 20/11/24; col. 250.]
That project will not be very effective if the researchers cannot access the data, so I very much hope that these provisions will be in force before the Government start spending money on it.
If the data for researchers regime described in the Bill is to have its desired effect, we need to ensure that it is truly enforceable and cannot easily be brushed off. That is why the Government need to accept the amendments in this group: to bring clarity and to close loopholes in the scheme as it is outlined in the Bill.
I will briefly summarise the provisions in the amendments in my name. First, we need to make researcher access regulations enforceable in the same way as other requirements in the Online Safety Act. The enforcement provisions in that Act were strengthened considerably as it passed through this House, and I believe that the measures on data for researchers need to be given the same rocket boosters. Amendment 198D would require regulated services to adhere to the regime and would give Ofcom the power to take proper remedial action where regulated services obfuscate or fail to comply.
Secondly, we need to ensure that any contractual provision of use, such as a platform’s terms of service, is unenforceable if it would prevent
“research into online safety matters”,
as defined in the regulations. This closes an important loophole. It will protect UK researchers carrying out public interest research from vexatious litigation over terms of service violations as platforms seek to obstruct access to data. We have seen this practice in other areas.
Thirdly, we need to clarify that researchers carrying out applicable research into online safety matters in the UK will be able to access information under the regime, regardless of where they are located. This is a basic point. Amendment 198E would bring the regime into line with the EU's Digital Services Act and allow the world's best researchers to study potential harm to UK users.
Ensuring robust researcher access to data contributes to a healthy ecosystem of investigation and scrutiny that will help to secure effective application of the law, while also guarding against overreach in the moderation of speech. It is time to back UK civil society and academic researchers, to ensure that policy-making and regulatory enforcement are as informed as possible. That is why I ask the Minister to support these measures.
My Lords, I will speak briefly. I added my name in support of Amendments 197 and 198, tabled by the noble Baroness, Lady Kidron. The arguments as to why children are a distinct group who need to be considered in a distinctive way do not need rehearsing, so I will not repeat them.
I turn to the excellent points made in the amendments in the name of the noble Lord, Lord Bethell. Data access for researchers is fundamental. The problem with statutory bodies, regulators and departments of state is that they are not designed or set up to be expert in researching some of the more arcane areas in which these algorithms are developed. This is leading-edge work. The people at these platforms who design and tweak these very clever algorithms come from precisely the academic and research institutions that are best placed to go into those companies and find out what they are doing; in many cases, it is their own graduates and PhDs doing it. They are the best-qualified people to examine what is going on, because they will understand it. If somebody tries to obfuscate, they will see through them immediately, because they speak that highly sophisticated language.
If we do not allow this, we will be in the deeply uncomfortable position of relying on brave people such as Frances Haugen to run the huge reputational, employability and financial risks of becoming a whistleblower. A whistleblower who takes on one of the huge platforms that has been employing them is a very brave person indeed. I would feel distinctly uncomfortable if our guarding of our citizens, and particularly our children, against what some of these algorithms are doing depended on the goodwill, and the chance, of a whistleblower showing us what was going on. I support all these amendments very strongly.
My Lords, before we proceed, I draw to the attention of the Committee that we have a hard stop at 8.45 pm and we have committed to try to finish the Bill this evening. Could noble Lords please speak quickly and, if possible, concisely?
My Lords, I support my noble friend Lady Kidron’s Amendment 211, to which I have put my name. I speak not as a technophobe but as a card-carrying technophile. I declare an interest as, for the past 15 years, I have been involved in the development of algorithms to analyse NHS data, mostly from acute NHS trusts. This is possible under current regulations, because all the research projects have received medical research ethics approval, and I hold an honorary contract with the local NHS trust.
This amendment is, in effect, designed to scale up existing provisions and make sure that they are applied to public sector data sources such as NHS data. By classifying such data as sovereign data assets, it would be possible to make it available not only to individual researchers but to industry—UK-based SMEs and pharmaceutical and big tech companies—under controlled conditions. One of these conditions, as indicated by proposed new subsection (6), is to require a business model where income is generated for the relevant UK government department from access fees paid by authorised licence holders. Each government department should ensure that the public sector data it transfers to the national data library is classified as a sovereign data asset, which can then be accessed securely through APIs acting
“as bridges between each sovereign data asset and the client software of the authorized licence holders”.
In the time available, I will consider the Department of Health and Social Care. The report of the Sudlow review, Uniting the UK's Health Data: A Huge Opportunity for Society, published last month, sets out what could be achieved through linking multiple NHS data sources. The Academy of Medical Sciences has fully endorsed the report:
“The Sudlow recommendations can make the UK’s health data a truly national asset, improving both patient care and driving economic development”.
There is little difference, if any, between health data being “a truly national asset” and “a sovereign asset”.
Generative AI has the potential to extract clinical value from linked datasets in the various secure data environments within the NHS and to deliver a step change in patient care. It also has the potential to deliver economic value, as the application of AI models to these rich, multimodal datasets will lead to innovative software products being developed for early diagnosis and personalised treatment.
However, it seems that the rush to generate economic value is preceding the establishment of a transparent licensing system, as in proposed new subsection (3), and the setting up of a coherent business model, as in proposed new subsection (6). As my noble friend Lady Kidron pointed out, the provisions in this amendment are urgently needed, especially as the chief data and analytics officer at NHS England is reported as having said, at a recent event organised by the Health Service Journal and IBM, that the national federated data platform will soon be used to train different types of AI model. The two models mentioned in the speech were OpenAI’s proprietary ChatGPT model and Google’s medical AI, which is based on its proprietary large language model, Gemini. So, the patient data in the national federated data platform being built by Palantir, which is a US company, is, in effect, being made available to fine-tune large language models pretrained by OpenAI and Google—two big US tech companies.
As a recent editorial in the British Medical Journal argued:
“This risks leaving the NHS vulnerable to exploitation by private technology companies whose offers to ‘assist’ with infrastructure development could result in loss of control over valuable public assets”.
It is vital for the health of the UK public sector that there is no loss of control resulting from premature agreements with big tech companies. These US companies seek privileged access to highly valuable assets consisting of personal data collected from UK citizens. The Government must, as a high priority, determine the rules for access to these sovereign data assets, along the lines set out in this amendment. I urge the Minister to take on board both the aims and the practicalities of this amendment before any damaging loss of control occurs.
My Lords, before we move on to the next group, I again remind noble Lords that we have in fact only two groups to get through because Amendment 212 will not be moved. We have about 25 minutes to get through those two groups.
Amendment 211B
That concludes the Committee’s proceedings on the Bill. I thank all noble Lords who have participated for being so co-operative.