All 3 Lord Knight of Weymouth contributions to the Data (Use and Access) Bill [HL] 2024-26


Tue 19th Nov 2024
Mon 16th Dec 2024
Wed 18th Dec 2024

Data (Use and Access) Bill [HL] Debate

Department: Department for Business and Trade

Data (Use and Access) Bill [HL]

Lord Knight of Weymouth Excerpts
2nd reading
Tuesday 19th November 2024


Lords Chamber
Lord Knight of Weymouth (Lab)

My Lords, I remind the House of my interests, particularly in chairing the board of Century-Tech, an AI edtech company— I will have to talk to the noble Baroness, Lady Kidron, about that. I am a director of Educate Ventures Research, which is also an AI-in-education business, and Goodnotes, an AI note-taking app. It is a pleasure to follow the noble Baroness, Lady Kidron. I agreed with most of what she said, and I look forward to working with her on the passage of the Bill.

I guess we are hoping it is third time lucky with a data Bill. I am sure we will hear from all speakers that there is a sense that this Bill is an improvement on the previous two attempts. It is a particular joy to see that terrible stuff around DWP data not appearing in this Bill. There is plenty that I welcome in terms of the improvements. Like most speakers, I imagine, I mostly want to talk about what might need further debate and what might be missing, rather than just congratulating my noble friend the Minister on the improvements she and her colleagues have been able to make.

I anticipate that this will not be the only Bill we have on data and AI. It would be really helpful for this Government to rediscover the joys of a White Paper. If we had a document that set out the whole story and the vision, so that we could more easily place this Bill in context, that would be really helpful. This could include where we are with the Matt Clifford action plan, and a very clear aim of data adequacy with the EU regime. I wonder whether, among all the people the Minister said she had been able to talk to about this Bill, she had also spoken to the EU to make sure we are moving in the right direction with adequacy, which has to be resolved by the summer.

Clearly, this is a Bill about data. The Minister said that data is the DNA of modern life. It has achieved a new prominence with the rollout of generative AI, which has captured everyone’s imagination—or entered their nightmares, depending on how you think about it. The Communications and Digital Committee, which I am privileged to serve on, has been thinking about that in respect of the future of news, which we will publish a report on shortly, and of scaling our businesses here in the UK. It is clear that the core ingredients you need are computing power, talent, finance and, of course, data, in order successfully to grow AI businesses here.

I agree with the noble Baroness, Lady Kidron, that we have a unique asset in our public sector datasets that the US does not have to anything like the same extent—in particular in health, but also in culture and education. It is really important that the Government have a regime, established by this legislation and any other legislation we may or may not know about, to protect and deploy that data to the public benefit and not just the private benefit, be it in large language models or other foundational models of whatever size.

It is then also important to ask, whose data is it? In my capacity as chair of a board of an AI company, I am struck by the fact that our current financial regulation does not allow us to list our data as an asset on our balance sheet. I wonder when we might be able to move in that direction, because it is clearly of some significance to these sorts of businesses. But it is also true that the data I share as a citizen, and have given consent to, should be my data. I should have the opportunity to get it back quite easily and to decide who to share it with, and it should empower me as a citizen. I should be able to hold my own data, and I definitely should not have to pay twice for it: I should not have to pay once through my taxes and then a second time by having to pay for a product that has been generated by the data that I paid for the first time. So I am also attracted to what the noble Baroness said about data as a sovereign asset.

In the same way that both Front-Bench speakers were excited about the national underground asset register, I am equally excited about the smart data provisions in the Bill, particularly in respect of the National Health Service. Unfortunately, my family have been intensive users of the National Health Service over the past year or so, and the extent to which the various elements of our NHS do not talk to each other in terms of data is a tragedy that costs lives and that we urgently need to resolve. If, as a result of this Bill, we can take the glorious way in which I can share my banking data with various platforms in order to benefit myself, and do the same with health data, that would be a really good win for us as a nation. Can the Minister reassure me that the same could be true for education? The opportunity to build digital credentials in education by using the same sort of technology that we use in open banking would also excite me.

I ask the Minister also to think about and deliver on a review of Tell Us Once, which, when I was a Minister in the DWP a long time ago, I was very happy to work on. By using Tell Us Once, on the bereavement of a relative, for example, you have to tell only one part of the public sector and that information then cascades across. That relieves you of an awful lot of difficult admin at a time of bereavement. We need a review to see how this is working and whether we can improve it, and to look at a universal service priority register for people going through bereavement in order to prioritise services that need to pass the message on.

I am concerned that we should have cross-sector open data standards and alignment with international interoperability standards. There is a danger in the Bill that the data-sharing provisions are protected within sectors, and I wonder whether we need some kind of authority to drive that.

It is important to clarify that the phrase used in the first part of the Bill, a

“person of a specified description”,

can include government departments and public bodies so that, for example, we can use those powers for smart data and net-zero initiatives. Incidentally, how will the Government ensure that the supply chains of transformers, processors, computing power and energy are in place to support AI development? How will we publish the environmental impact of that energy use for AI?

There is a lot more I could say, but time marches on. I could talk about digital verification services, direct marketing and a data consent regime, but those are all things to explore in Committee. However, there are two other things that I would briefly like to say before winding up. First, I have spoken before in this House about the number of people who are hired, managed and fired by AI automated decision-making. I fear that, under the Bill as drafted, those people may get a general explanation of how the automated decision-making algorithms are working, when in those circumstances they need a much more personalised explanation of why they have been impacted in this way. What is it about you, your socioeconomic status and the profile that has caused the decision to go the way it has?

Secondly, I am very interested in the role of the Digital Regulation Cooperation Forum in preventing abuse and finding regulatory gaps. I wonder whether, after the perennial calls in this Chamber when debating Bills such as this for a permanent Committee of both Houses to monitor digital regulation, the new Government have a view on that. I know that that is a matter for the usual channels and not Ministers, but it is a really important thing for this House to move on. I am fairly bored with making the case over the past two or three years.

In summary, this is a good Bill but it is a long Bill, and there is lots to do. I wish the Minister good luck with it.

Data (Use and Access) Bill [HL] Debate


Lord Knight of Weymouth Excerpts
Lord Holmes of Richmond (Con)

My Lords, it is a pleasure to take part in the debate on this group. I support the spirit of all the amendments debated thus far.

Speaking of spirits, and it being the season, I have more than a degree of sympathy for the Minister. With so many references to her previous work, this Christmas is turning into a bit of the Ghost of Amendments Past for her. That is good, because all the amendments she put down in the past were of an excellent quality, well thought through, equally considered and even-handed.

As has been mentioned many times, we have had three versions of a data Bill in just over three years. One wonders whether all the elements of the current draft have kept up with what has happened in the outside world over those three years, not least when it comes to artificial intelligence. That goes to the heart of the amendments in this group on automated decision-making.

When the first of these data Bills emerged, ADM was present—but relatively discreetly present—in our society and our economy. Now it would be fair to say that it proliferates across many areas of our economy and our society, often in situations where people find themselves at the sharpest end of the economy and the sharpest end of these automated decisions, often without even knowing that ADM was present. More than that, even on the discovery that ADM was in the mix, depending on which sector of the economy or society they find that decision being made in, they may find themselves with no or precious little redress—employment and recruitment, to name but one sector.

It being the season, it is high time when it comes to ADM that we start to talk turkey. In all the comments thus far, we are talking not just about ADM but about the principles that should underpin all elements of artificial intelligence—that is, they should be human led. These technologies should be in our human hands, with our human values feeding into human oversight: human in the loop and indeed, where appropriate, human over the loop.

That goes to elements in my two amendments in this group, Amendments 123A and 123B. Amendment 123A simply posits, through a number of paragraphs, the point that if someone is subject to an automated decision then they have the right to a personalised explanation of that decision. That explanation should be accessible: in plain language of their choice, with no cost attached to it and not in any sense technically or technologically convoluted or opaque. That would be relatively straightforward to achieve, but the positive impact for all those citizens would certainly be more than material.

Amendment 123B goes to the heart of those humans charged with the delivery of these personalised explanations. It is not enough to simply say that there are individuals within an organisation responsible for the provision of personalised explanations for automated decisions; it is critical that those individuals have the training, the capabilities and, perhaps most importantly, the authority within that organisation to make a meaningful impact regarding those personalised explanations. If not, this measure may have a small voice but would have absolutely no teeth when it comes to the citizen.

In short, ADM is proliferating so we need to ensure that we have a symmetrical situation for citizens, for consumers, and for anyone who finds themselves in any domain or sector of our economy and society. We must assert the principles: human-led, human in the loop, “Our decisions, our data”, and “We determine, we decide, we choose”. That is how I believe we can have an effective, positive, enabling and empowering AI future. I look forward to the Minister’s comments.

Lord Knight of Weymouth (Lab)

My Lords, I shall speak to the series of amendments on automated decision-making to which I have added my name but are mostly in the name of the noble Lord, Lord Clement-Jones. As he said, we had a rehearsal for this debate last Friday when we debated his Private Member’s Bill so I will not delay the Committee by saying much about the generalities of ADMs in the public sector.

Suffice it to say that human involvement in overseeing AIs must be meaningful—for example, without those humans themselves being managed by algorithms. We must ensure that ADMs comply by design with the Equality Act and safeguard data subjects’ other rights and freedoms. As discussed in earlier groups, we must pay particular attention to children’s rights with regard to ADMs, and we must reinforce the obligation on public bodies to use the algorithmic transparency recording standards. I also counsel my noble friend the Minister that, as we have heard, there are many voices from civil society advising me and others that the new Article 22 of the GDPR takes us backwards in terms of protection.

That said, I want to focus on Amendment 123C, relating to ADMs in the workplace, to which I was too late to add my name but would have done. This amendment follows a series of probing amendments tabled by me to the former DPDI Bill. In this, I am informed by my work as the co-chair of the All-Party Parliamentary Group on the Future of Work, assisted by the Institute for the Future of Work. These amendments were also mirrored during the passage of the Procurement Act and competition Act to signal the importance of the workplace, and in particular good work, as a cross-cutting objective and lens for policy orientation.

--- Later in debate ---
Lord Clement-Jones (LD)

That is slightly splitting hairs. The noble Viscount, Lord Camrose, might want to comment because he wanted to delete the wording that says:

“The Secretary of State may by regulations provide that … there is, or is not, to be taken to be meaningful human involvement”.


He certainly will determine—or is able to determine, at least—whether or not there is human involvement. Surely, as part of that, there will need to be consideration of what human involvement is.

Lord Knight of Weymouth (Lab)

Will the Minister reflect on the issues around a case-by-case basis? If I were running an organisation of any sort and decided I wanted to use ADM, how would I make a judgment about what is meaningful human involvement on a case-by-case basis? It implies that I would have to hope that my judgment was okay because I have not had clarity from anywhere else and in retrospect, someone might come after me if I got that judgment wrong. I am not sure that works, so will she reflect on that at some point?

Data (Use and Access) Bill [HL] Debate


Lord Knight of Weymouth Excerpts
A code would help everybody.
Lord Knight of Weymouth (Lab)

My Lords, I was unsure whether to support Amendment 141, let alone speak to it, simply because I have a number of interests in this area and I should be clear about those. I chair Century-Tech Ltd, which is an AI edtech company; I am on the board of Educate Ventures Research Ltd, which offers advice to educators and schools on the use of AI in education; and I am a trustee of the Good Future Foundation, which does something similar.

I start by reminding the Committee of some of the benefits of technology and AI for education, so that there is a balance both in my speech and in the debate. Exciting practice is already taking place in the area of flipped learning, for example, where—putting issues of the digital divide to one side—in those classes and communities where there is good access to technology at home, the instructional element of learning can take place at home and school becomes a much more profoundly human endeavour, with teachers being able to save the time spent on the instructional element of teaching to bring that learning to life. I have some issues with AI in the world of tutoring in certain circumstances, but some of that can be very helpful in respect of flipped learning.

Project-based learning also becomes much more possible. That is very hard to teach but much more possible to teach by using AI tools to help link what is being learned in projects through to the curriculum. Teacher time can be saved and, by taking care of a lot of administrative tasks through AI, we can in turn make a significant contribution to the teacher retention crisis that is currently bedevilling our schools. There are novel assessment methods that can now be developed using AI, in particular making the traditional assessment method of the viva much more affordable and reliable. It is hard to use AI to cheat if you are being assessed orally.

Finally, an important element is preparation for work: if we want these young people to be able to leave school and thrive in a labour market where they must be able to collaborate effectively with machines, we need them to be able to experience that in a responsible and taught fashion in school.

However, dystopian issues can arise from an over-dependence on technology and from some of the potential impacts of using AI in education, too. I mentioned the digital divide—the 7.5 million families in this country who are not connected to the internet or confident in using it—and we discovered during Covid the device and data poverty that exists in this country. There is a possibility that poorer kids end up being taught by machines and not by human teachers at all. There is a danger that we do not shift our schools away from the slightly Victorian system that we have at the moment, which the noble Baroness, Lady Kidron, referenced at Second Reading. If we do not, we will end up with our children being outcompeted by machines. That overreliance on AI could also end up as privatisation by stealth because, if all the AI, technology and data are held by the private sector, and we are dependent on it, we will be beholden to the private sector however much we believe in the importance of the public good in our schools.

There are also problems of system design; I mentioned the Victorian system. I am hopeful that the curriculum and assessment review and the Children’s Wellbeing and Schools Bill that was published this week will help us. Whichever direction that review and those reforms take, we can be confident that edtech will respond. That is what it does; it responds to whatever regulation we pass, including in this Bill, over time and to whatever changes take place in the education system.

But tech needs data and it needs diversity of data. There is a danger that, if we close off access to data in this country, we will all end up using lots of AI that has been developed by using Chinese data, where they do not have the same misgivings about privacy, sharing each other’s data and acquiring data. We have to find a regime that works.

I do a bunch of work in international schooling as chair of COBIS—the Council of British International Schools—and I know of one large international school group, which I do not advise, that has done a deal with Microsoft around sharing all its pupil data, so that it can be used for Copilot. Obviously, Microsoft has a considerable interest in OpenAI, and we do not know exactly where that data is going. That points to some of the concerns that the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, have talked about.

During Covid, schools were strongly encouraged by the then Government to use either Google Classroom or Microsoft 365. Essentially, everyone was given a binary choice, and lots of data was therefore captured by those two large American corporations, which assisted them to develop further products. Any British alternative was, in essence, cut out, so we have good reason to be concerned in this area. That is why in the end I added my name and support to Amendment 141 in the name of the noble Baroness, Lady Kidron.

Children need privacy and they need digital rights. At the moment, those are exercised through parental consent for the use of these platforms and the capture of data, but I think it would be helpful to put that in a codified form, so that all those concerns have some sense of security about the regimes around which this works.

Ever since the abolition of Becta back in 2010, school leaders have been missing advice. Becta advice was used around the globe, as it was the authority on what works in technology and education. Sadly, the coalition got rid of it, and school leaders are now operating kind of blindfolded. We have 25,000 different school leaders buying technology, and very few of them really know what they are doing when faced with slick salespeople. Giving them some protection with a code would help their procurement.

The proof of the pudding will of course be in the eating—in the detail of the code—but I urge my noble friend the Minister to reflect carefully on the need for this, to talk to the DfE about it and to try to get some agreement. The DfE itself does not have the greatest track record on data and data protection. It has got into trouble with the ICO on more than one occasion.

My final cautionary tale, thanks to Defend Digital Me, is on the national pupil database, which was agreed in 2002 on the basis that children’s data would be kept private, protected and used only for research purposes—all the things that we are hearing in the debates on this Bill. Ten years later, that was all changed and 2,500 data-sharing arrangements followed that use that data, including for universal credit fraud detection. When parents allow their children’s data to be shared, they do not expect it to be used, down the line, to check universal credit entitlement. I do not think that was in the terms and conditions. There is an important issue here, and I hope that the Government are listening so that we make some progress.

--- Later in debate ---
Lord Arbuthnot of Edrom (Con)

My Lords, I shall speak very briefly. I have a great deal of agreement with what the noble Baroness, Lady Kidron, the noble Lord, Lord Russell, and my noble friend Lord Bethell have said. I am rising to nitpick; I apologise for that, but I suppose that is what Committee is for.

The final line of proposed new subsection (da), to be inserted by Amendment 198, refers to

“different characteristics including gender, race, ethnicity, disability, sexuality, gender”.

On our first day in Committee, I raised the importance of the issue of sex, which is different from gender or sexuality. We need to make sure that we get the wording of this amendment, if it were to be accepted by the Government, absolutely right.

Lord Knight of Weymouth (Lab)

My Lords, I shall also speak extremely briefly, as one of the three veterans of the Joint Committee present in Committee today, to reinforce my support for these amendments. The Government should be congratulated on Clause 123. It is welcome to see this movement but we want to see this done quickly. We want to ensure that it is properly enforceable, that terms of service cannot be used to obstruct access to researchers, as the noble Lord, Lord Bethell, said, and that there is proper global access by researchers, because, of course, these are global tech companies and UK users need to be protected through transparency. It is notable that, in the government consultation on copyright and AI published yesterday, transparency is a core principle of what the Government are arguing for. It is this transparency that we need in this context, through independent researchers. I strongly commend these amendments to the Minister.

The Earl of Erroll (CB)

My Lords, I would like to just make one comment on this group. I entirely agree with everything that has been said and, in particular, with the amendments in the name of the noble Baroness, Lady Kidron, but the one that I want to single out—it is why I am bothering to stand up—is Amendment 197, which says that the Secretary of State “must” implement this measure.

I was heavily scarred back in 2017 by the Executive’s refusal to implement Part 3 of the Digital Economy Act in order to protect our children from pornography. Now, nearly eight years later, they are still not protected. It was never done properly, in my opinion, in the then Online Safety Bill either; it still has not been implemented. I think, therefore, that we need to have a “must” there. We have an Executive who are refusing to carry out the will of Parliament as expressed in the legislation it passed. We have a problem, but I think that we can amend it by putting “must” in the Bill. Then, we can hold the Executive to account.

--- Later in debate ---
The recent debate on deepfakes, led by my noble friend Lady Owen, gave a very clear sense of where the mood of the House is. Urgency is imperative—the technology is moving more quickly than our legislative response. I hope the Minister will realise that this is an opportunity to set a new milestone for legislative responses to a new technological threat and seize it. The explosion of computer-generated CSAM is a pressing threat to our society, so supporting the amendment is a vital step towards safeguarding thousands more from online abuse.
Lord Knight of Weymouth (Lab)

My Lords, I support Amendment 203 and, in particular, Amendments 211G and 211H from the noble Baroness, Lady Owen. I have little to add to what I said on Friday. I confess to my noble friend the Minister that, in my speech on Friday, I asked whether this issue would be in scope for this Bill, so maybe I gave the noble Baroness the idea. I pay tribute to her agility in being able to act quickly to get this amendment in and include something on audio, following the speech of the noble Baroness, Lady Gohir.

I hope that the Minister has similar agility in being able to readjust the Government’s position on this. It is right that this was an urgent manifesto commitment from my party at the last election. It fits entirely with my right honourable friend the Home Secretary’s efforts around violence against women and girls. We should accept and grab this opportunity to deliver quickly by working with the noble Baroness, Lady Owen, and others between now and Report to bring forward an amendment to the Bill that the whole House will support enthusiastically.

Lord Clement-Jones (LD)

My Lords, we have had some powerful speeches in this group, not least from the noble Baronesses, Lady Kidron and Lady Owen, who drafted important amendments that respond to the escalating harms caused by AI-generated sexual abuse material relating to children and adults. The amendment from the noble Baroness, Lady Kidron, would make it an offence to use personal data or digital information to create digital models or files that facilitate the creation of AI or computer-generated child sexual abuse material. As she outlined and the noble Lord, Lord Bethell, confirmed, it specifically would become an offence to create, train or distribute generative AI models that enable the creation of computer-generated CSAM or priority illegal content; to train AI models on CSAM or priority illegal content; or to possess AI models that produce CSAM or priority illegal content.

This amendment responds to a growing problem, as we have heard, around computer-generated sexual abuse material and a gap in the law. There is a total lack of safeguards preventing bad actors creating sexual abuse imagery, and it is causing real harm. Sites enabling this abuse are offering tools to harm, humiliate, harass, coerce and cause reputational damage. Without robust legal frameworks, victims are left vulnerable while perpetrators operate with impunity.

The noble Lord, Lord Bethell, mentioned the Internet Watch Foundation. In its July report, One Step Ahead, it documented the alarming rise of AI-generated CSAM. In October 2023, in How AI is Being Abused to Create Child Sexual Abuse Imagery, it made recommendations to the Government on legislation to strengthen legal frameworks to better address the evolving landscape of AI-generated CSAM and to enhance preventive measures against its creation and distribution. It specifically recommended:

“That the Government legislates to make it an offence to use personal data or digital information to create digital models or files that facilitate the creation of AI or computer-generated child sexual abuse material”.


The noble Baroness, Lady Kidron, tabled such an amendment to the previous Bill. As she said, she was successful in persuading the then Government to accept it; I very much hope that she will be as successful in persuading this Government to accept her amendment.

Amendments 211G and 211H in the name of the noble Baroness, Lady Owen, are a response to the extraordinary fact that one in 14 adults has experienced threats to share intimate images in England and Wales; that rises to one in seven among young women. Research from Internet Matters shows that 49% of young teenagers in the UK aged between 13 and 16—around 750,000 children—said that they were aware of a form of image-based abuse being perpetrated against another young person known to them.

We debated the first of the noble Baroness’s amendments, which is incorporated in her Bill, last Friday. I entirely agree with the noble Lord, Lord Knight; I did not find the Government’s response at all satisfactory. I hope that, in the short passage of time between then and now, they have had time to be at least a little agile, as he requested. UK law clearly does not effectively address non-consensual intimate images. It is currently illegal to share or threaten to share non-consensual intimate images, including deepfakes, but creating them is not yet illegal; this means that someone could create a deepfake image of another person without their consent and not face legal consequences as long as they do not share, or threaten to share, it.

This amendment is extremely welcome. It addresses the gap in the law by criminalising the creation of non-consensual intimate images, including deepfakes. It rightly targets deepfakes due to their rising prevalence and potential for harm, particularly towards women. Research shows that 98% of deepfake videos online are pornographic, with 99% featuring women and girls. This makes it an inherently sexist problem that is a new frontier of violence against women—words that I know the noble Baroness has used.

I also very much welcome the new amendment not contained in her Bill, responding to what the noble Baroness, Lady Gohir, said at its Second Reading last Friday about including audio deepfakes. The words “shut down every avenue”, which I think were used by the noble Baroness, Lady Gohir, are entirely apposite in these circumstances. Despite what the noble Lord, Lord Ponsonby, said on Friday, I hope that the Government will accept both these amendments and redeem their manifesto pledge to ban the creation of sexually explicit deepfakes, whether audio or video.