7 Lord Knight of Weymouth debates involving the Department for Business and Trade

Data (Use and Access) Bill [HL]

Lord Knight of Weymouth Excerpts
Lord Knight of Weymouth (Lab)

My Lords, I was unsure whether to support Amendment 141, let alone speak to it, simply because I have a number of interests in this area and I should be clear about those. I chair Century-Tech Ltd, which is an AI edtech company; I am on the board of Educate Ventures Research Ltd, which offers advice to educators and schools on the use of AI in education; and I am a trustee of the Good Future Foundation, which does something similar.

I start by reminding the Committee of some of the benefits of technology and AI for education, so that there is a balance both in my speech and in the debate. Exciting practice is already taking place in the area of flipped learning, for example, where—putting issues of the digital divide to one side—in those classes and communities where there is good access to technology at home, the instructional element of learning can take place at home and school becomes a much more profoundly human endeavour, with teachers able to use the time saved on instruction to bring that learning to life. I have some issues with AI in the world of tutoring in certain circumstances, but some of that can be very helpful in respect of flipped learning.

Project-based learning also becomes much more possible. It is very hard to teach, but AI tools that help link what is being learned in projects through to the curriculum make it far more achievable. Teacher time can be saved by using AI to take care of a lot of administrative tasks, which can in turn make a significant contribution to addressing the teacher retention crisis that is currently bedevilling our schools. There are novel assessment methods that can now be developed using AI, in particular making the traditional assessment method of the viva much more affordable and reliable. It is hard to use AI to cheat if you are being assessed orally.

Finally, an important element is preparation for work: if we want these young people to be able to leave school and thrive in a labour market where they must be able to collaborate effectively with machines, we need them to be able to experience that in a responsible and taught fashion in school.

However, dystopian issues can arise from an overdependence on technology and from some of the potential impacts of using AI in education, too. I mentioned the digital divide—the 7.5 million families in this country who are not connected to the internet or confident in using it—and we discovered during Covid the device and data poverty that exists in this country. There is a possibility that poorer kids end up being taught by machines and not by human teachers at all. There is a danger that we do not shift our schools away from the slightly Victorian system that we have at the moment, which the noble Baroness, Lady Kidron, referenced at Second Reading. If we do not, we will end up with our children being outcompeted by machines. That overreliance on AI could also end up as privatisation by stealth because, if all the AI, technology and data are held by the private sector, and we are dependent on it, we will be beholden to the private sector however much we believe in the importance of the public good in our schools.

There are also problems of system design; I mentioned the Victorian system. I am hopeful that the curriculum and assessment review and the Children’s Wellbeing and Schools Bill that was published this week will help us. Whichever direction that review and those reforms take, we can be confident that edtech will respond. That is what it does; it responds to whatever regulation we pass, including in this Bill, over time and to whatever changes take place in the education system.

But tech needs data and it needs diversity of data. There is a danger that, if we close off access to data in this country, we will all end up using lots of AI that has been developed by using Chinese data, where they do not have the same misgivings about privacy, sharing each other’s data and acquiring data. We have to find a regime that works.

I do a bunch of work in international schooling as chair of COBIS—the Council of British International Schools—and I know of one large international school group, which I do not advise, that has done a deal with Microsoft around sharing all its pupil data, so that it can be used for Copilot. Obviously, Microsoft has a considerable interest in OpenAI, and we do not know exactly where that data is going. That points to some of the concerns that the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, have talked about.

During Covid, schools were strongly encouraged by the then Government to use either Google Classroom or Microsoft 365. Essentially, everyone was given a binary choice, and lots of data was therefore captured by those two large American corporations, which assisted them to develop further products. Any British alternative was, in essence, cut out, so we have good reason to be concerned in this area. That is why in the end I added my name and support to Amendment 141 in the name of the noble Baroness, Lady Kidron.

Children need privacy and they need digital rights. At the moment, those are exercised through parental consent for the use of these platforms and the capture of data, but I think it would be helpful to put that in a codified form, so that there is some certainty about the regime within which all those concerns are handled.

Ever since the abolition of Becta back in 2010, school leaders have been missing advice. Becta advice was used around the globe, as it was the authority on what works in technology and education. Sadly, the coalition got rid of it, and school leaders are now operating kind of blindfolded. We have 25,000 different school leaders buying technology, and very few of them really know what they are doing when faced with slick salespeople. Giving them some protection with a code would help their procurement.

The proof of the pudding will of course be in the eating—in the detail of the code—but I urge my noble friend the Minister to reflect carefully on the need for this, to talk to the DfE about it and to try to get some agreement. The DfE itself does not have the greatest track record on data and data protection. It has got into trouble with the ICO on more than one occasion.

My final cautionary tale, thanks to Defend Digital Me, is on the national pupil database, which was agreed in 2002 on the basis that children’s data would be kept private, protected and used only for research purposes—all the things that we are hearing in the debates on this Bill. Ten years later, that was all changed, and 2,500 data-sharing arrangements using that data followed, including for universal credit fraud detection. When parents allow their children’s data to be shared, they do not expect it to be used, down the line, to check universal credit entitlement. I do not think that was in the terms and conditions. There is an important issue here, and I hope that the Government are listening so that we make some progress.

--- Later in debate ---
Lord Arbuthnot of Edrom (Con)

My Lords, I shall speak very briefly. I have a great deal of agreement with what the noble Baroness, Lady Kidron, the noble Lord, Lord Russell, and my noble friend Lord Bethell have said. I am rising to nitpick; I apologise for that, but I suppose that is what Committee is for.

The final line of proposed new subsection (da), to be inserted by Amendment 198, refers to

“different characteristics including gender, race, ethnicity, disability, sexuality, gender”.

On our first day in Committee, I raised the importance of the issue of sex, which is different from gender or sexuality. We need to make sure that we get the wording of this amendment, if it were to be accepted by the Government, absolutely right.

Lord Knight of Weymouth (Lab)

My Lords, I shall also speak extremely briefly, as one of the three veterans of the Joint Committee present in Committee today, to reinforce my support for these amendments. The Government should be congratulated on Clause 123. It is welcome to see this movement but we want to see this done quickly. We want to ensure that it is properly enforceable, that terms of service cannot be used to obstruct access to researchers, as the noble Lord, Lord Bethell, said, and that there is proper global access by researchers, because, of course, these are global tech companies and UK users need to be protected through transparency. It is notable that, in the government consultation on copyright and AI published yesterday, transparency is a core principle of what the Government are arguing for. It is this transparency that we need in this context, through independent researchers. I strongly commend these amendments to the Minister.

The Earl of Erroll (CB)

My Lords, I would like to just make one comment on this group. I entirely agree with everything that has been said and, in particular, with the amendments in the name of the noble Baroness, Lady Kidron, but the one that I want to single out—it is why I am bothering to stand up—is Amendment 197, which says that the Secretary of State “must” implement this measure.

I was heavily scarred back in 2017 by the Executive’s refusal to implement Part 3 of the Digital Economy Act in order to protect our children from pornography. Now, nearly eight years later, they are still not protected. It was never done properly, in my opinion, in the then Online Safety Bill either; it still has not been implemented. I think, therefore, that we need to have a “must” there. We have an Executive who are refusing to carry out the will of Parliament as expressed in passing the legislation. We have a problem, but I think that we can address it by putting “must” in the Bill. Then, we can hold the Executive to account.

--- Later in debate ---
The recent debate on deepfakes, led by my noble friend Lady Owen, gave a very clear sense of where the mood of the House is. Urgency is imperative—the technology is moving more quickly than our legislative response. I hope the Minister will realise that this is an opportunity to set a new milestone for legislative responses to a new technological threat and seize it. The explosion of computer-generated CSAM is a pressing threat to our society, so supporting the amendment is a vital step towards safeguarding thousands more from online abuse.
Lord Knight of Weymouth (Lab)

My Lords, I support Amendment 203 and, in particular, Amendments 211G and 211H from the noble Baroness, Lady Owen. I have little to add to what I said on Friday. I confess to my noble friend the Minister that, in my speech on Friday, I asked whether this issue would be in scope for this Bill, so maybe I gave the noble Baroness the idea. I pay tribute to her agility in being able to act quickly to get this amendment in and include something on audio, following the speech of the noble Baroness, Lady Gohir.

I hope that the Minister has similar agility in being able to readjust the Government’s position on this. It is right that this was an urgent manifesto commitment from my party at the last election. It fits entirely with my right honourable friend the Home Secretary’s efforts around violence against women and girls. We should accept and grab this opportunity to deliver quickly by working with the noble Baroness, Lady Owen, and others between now and Report to bring forward an amendment to the Bill that the whole House will support enthusiastically.

Lord Clement-Jones (LD)

My Lords, we have had some powerful speeches in this group, not least from the noble Baronesses, Lady Kidron and Lady Owen, who drafted important amendments that respond to the escalating harms caused by AI-generated sexual abuse material relating to children and adults. The amendment from the noble Baroness, Lady Kidron, would make it an offence to use personal data or digital information to create digital models or files that facilitate the creation of AI or computer-generated child sexual abuse material. As she outlined and the noble Lord, Lord Bethell, confirmed, it specifically would become an offence to create, train or distribute generative AI models that enable the creation of computer-generated CSAM or priority illegal content; to train AI models on CSAM or priority illegal content; or to possess AI models that produce CSAM or priority illegal content.

This amendment responds to a growing problem, as we have heard, around computer-generated sexual abuse material and a gap in the law. There is a total lack of safeguards preventing bad actors creating sexual abuse imagery, and it is causing real harm. Sites enabling this abuse are offering tools to harm, humiliate, harass, coerce and cause reputational damage. Without robust legal frameworks, victims are left vulnerable while perpetrators operate with impunity.

The noble Lord, Lord Bethell, mentioned the Internet Watch Foundation. In its report of July, One Step Ahead, it reported on the alarming rise of AI-generated CSAM. In October 2023, in How AI is Being Abused to Create Child Sexual Abuse Imagery, it made recommendations to the Government regarding legislation to strengthen legal frameworks to better address the evolving landscape of AI-generated CSAM and enhance preventive measures against its creation and distribution. It specifically recommended:

“That the Government legislates to make it an offence to use personal data or digital information to create digital models or files that facilitate the creation of AI or computer-generated child sexual abuse material”.


The noble Baroness, Lady Kidron, tabled such an amendment to the previous Bill. As she said, she was successful in persuading the then Government to accept it; I very much hope that she will be as successful in persuading this Government to accept her amendment.

Amendments 211G and 211H in the name of the noble Baroness, Lady Owen, are a response to the extraordinary fact that one in 14 adults has experienced threats to share intimate images in England and Wales; that rises to one in seven among young women. Research from Internet Matters shows that 49% of young teenagers in the UK aged between 13 and 16—around 750,000 children—said that they were aware of a form of image-based abuse being perpetrated against another young person known to them.

We debated the first of the noble Baroness’s amendments, which is incorporated in her Bill, last Friday. I entirely agree with the noble Lord, Lord Knight; I did not find the Government’s response at all satisfactory. I hope that, in the short passage of time between then and now, they have had time to be at least a little agile, as he requested. UK law clearly does not effectively address non-consensual intimate images. It is currently illegal to share or threaten to share non-consensual intimate images, including deepfakes, but creating them is not yet illegal; this means that someone could create a deepfake image of another person without their consent and not face legal consequences as long as they do not share, or threaten to share, it.

This amendment is extremely welcome. It addresses the gap in the law by criminalising the creation of non-consensual intimate images, including deepfakes. It rightly targets deepfakes due to their rising prevalence and potential for harm, particularly towards women. Research shows that 98% of deepfake videos online are pornographic, with 99% featuring women and girls. This makes it an inherently sexist problem that is a new frontier of violence against women—words that I know the noble Baroness has used.

I also very much welcome the new amendment not contained in her Bill, responding to what the noble Baroness, Lady Gohir, said at its Second Reading last Friday about including audio deepfakes. The words “shut down every avenue”, which I think were used by the noble Baroness, Lady Gohir, are entirely apposite in these circumstances. Despite what the noble Lord, Lord Ponsonby, said on Friday, I hope that the Government will accept both these amendments and redeem their manifesto pledge to ban the creation of sexually explicit deepfakes, whether audio or video.

Lord Holmes of Richmond (Con)

My Lords, it is a pleasure to take part in the debate on this group. I support the spirit of all the amendments debated thus far.

Speaking of spirits, and it being the season, I have more than a degree of sympathy for the Minister. With so many references to her previous work, this Christmas is turning into a bit of the Ghost of Amendments Past for her. That is good, because all the amendments she put down in the past were of an excellent quality, well thought through, equally considered and even-handed.

As has been mentioned many times, we have had three versions of a data Bill in just over three years. One wonders whether all the elements of this current draft have kept up with what has happened in the outside world over those three years, not least when it comes to artificial intelligence. This goes to the heart of the amendments in this group on automated decision-making.

When the first of these data Bills emerged, ADM was present—but relatively discreetly present—in our society and our economy. Now it would be fair to say that it proliferates across many areas of our economy and our society, often in situations where people find themselves at the sharpest end of the economy and the sharpest end of these automated decisions, often without even knowing that ADM was present. More than that, even on the discovery that ADM was in the mix, depending on which sector of the economy or society they find that decision being made in, they may find themselves with no or precious little redress—employment and recruitment, to name but one sector.

It being the season, it is high time when it comes to ADM that we start to talk turkey. In all the comments thus far, we are talking not just about ADM but about the principles that should underpin all elements of artificial intelligence—that is, they should be human led. These technologies should be in our human hands, with our human values feeding into human oversight: human in the loop and indeed, where appropriate, human over the loop.

That goes to elements in my two amendments in this group, Amendments 123A and 123B. Amendment 123A simply posits, through a number of paragraphs, the point that if someone is subject to an automated decision then they have the right to a personalised explanation of that decision. That explanation should be accessible: in plain language of their choice, at no cost to them, and not in any sense technically or technologically convoluted or opaque. That would be relatively straightforward to achieve, but the positive impact for all those citizens would certainly be more than material.

Amendment 123B goes to the heart of those humans charged with the delivery of these personalised explanations. It is not enough to simply say that there are individuals within an organisation responsible for the provision of personalised explanations for automated decisions; it is critical that those individuals have the training, the capabilities and, perhaps most importantly, the authority within that organisation to make a meaningful impact regarding those personalised explanations. If not, this measure may have a small voice but would have absolutely no teeth when it comes to the citizen.

In short, ADM is proliferating so we need to ensure that we have a symmetrical situation for citizens, for consumers, and for anyone who finds themselves in any domain or sector of our economy and society. We must assert the principles: human-led, human in the loop, “Our decisions, our data”, and “We determine, we decide, we choose”. That is how I believe we can have an effective, positive, enabling and empowering AI future. I look forward to the Minister’s comments.

Lord Knight of Weymouth (Lab)

My Lords, I shall speak to the series of amendments on automated decision-making to which I have added my name but are mostly in the name of the noble Lord, Lord Clement-Jones. As he said, we had a rehearsal for this debate last Friday when we debated his Private Member’s Bill so I will not delay the Committee by saying much about the generalities of ADMs in the public sector.

Suffice it to say that human involvement in overseeing AIs must be meaningful—for example, without those humans themselves being managed by algorithms. We must ensure that ADMs comply by design with the Equality Act and safeguard data subjects’ other rights and freedoms. As discussed in earlier groups, we must pay particular attention to children’s rights with regard to ADMs, and we must reinforce the obligation on public bodies to use the algorithmic transparency recording standards. I also counsel my noble friend the Minister that, as we have heard, there are many voices from civil society advising me and others that the new Article 22 of the GDPR takes us backwards in terms of protection.

That said, I want to focus on Amendment 123C, relating to ADMs in the workplace, to which I was too late to add my name but would have done. This amendment follows a series of probing amendments tabled by me to the former DPDI Bill. In this, I am informed by my work as the co-chair of the All-Party Parliamentary Group on the Future of Work, assisted by the Institute for the Future of Work. These amendments were also mirrored during the passage of the Procurement Act and competition Act to signal the importance of the workplace, and in particular good work, as a cross-cutting objective and lens for policy orientation.

--- Later in debate ---
Lord Clement-Jones (LD)

That is slightly splitting hairs. The noble Viscount, Lord Camrose, might want to comment because he wanted to delete the wording that says:

“The Secretary of State may by regulations provide that … there is, or is not, to be taken to be meaningful human involvement”.


He certainly will determine—or is able to determine, at least—whether or not there is human involvement. Surely, as part of that, there will need to be consideration of what human involvement is.

Lord Knight of Weymouth (Lab)

Will the Minister reflect on the issues around a case-by-case basis? If I were running an organisation of any sort and decided I wanted to use ADM, how would I make a judgment about what is meaningful human involvement on a case-by-case basis? It implies that I would have to hope that my judgment was okay, because I would have had no clarity from anywhere else, and that, in retrospect, someone might come after me if I got that judgment wrong. I am not sure that works, so will she reflect on that at some point?

Public Authority Algorithmic and Automated Decision-Making Systems Bill [HL]

Lord Knight of Weymouth Excerpts
Lord Knight of Weymouth (Lab)

My Lords, I very much welcome this Bill. It is a bit like the previous Bill, in that it addresses an important set of issues, and I encourage my Front-Bench friends to find a way, if not through this Bill, to address them.

In many ways, this is a bit of a warm-up for the debate we will have on Monday on the Data (Use and Access) Bill, to which the noble Lord, Lord Clement-Jones, has tabled a number of amendments on the same sort of issues. Indeed, this Bill could even be using some of the same text as his amendments. So, it is a pleasure to be able to rehearse what I might want to say on Monday.

Automated decision-making by AI is an area where we are balancing efficiency and equity. There are significant savings in public money and gains in efficiency to be made with the use of automated decision-making tools. However, we have to be conscious of the risks associated with algorithmic bias and the extensive use of ADMs, which DWP officials have noted in evidence sessions. The noble Lord, Lord Clement-Jones, reminded us of the A-level marking scandal in 2020—it was clearly unreasonable for individuals’ A-level results to be changed on the basis of the results of previous similar candidates rather than the performance of the candidates actually taking the tests.

Two weeks ago, I read in my newspaper—online, obviously—that departments are not registering their use of AI systems, as they are required to do. Only three Cabinet Office ADMs have been registered since 2022. So, not only do we need to legislate in this area; we also need public authorities to stick to it.

The equity risk is higher in some areas than others. We have to pay particular attention where, for example, ADMs are applied to benefits—to the money people receive—to sentencing, which happens in some parts of the world, to immigration decisions and to employment. In addition, as the noble Lord said, they are likely to disproportionately affect the poorest.

Why has the noble Lord, Lord Clement-Jones, confined his Bill to public authorities? I am sympathetic to extending this to work settings generally, including in the private sector. We see people being hired, managed and fired by ADM. Not every Christmas present is delivered by Santa Claus, and logistics workers are working flat out at the moment, under zero-hours contracts, being managed by ADMs. We should give them some protection.

I look forward to the Minister’s response and to discussing this more on Monday, and I hope we can see some progress on this.

Data (Use and Access) Bill [HL]

Lord Knight of Weymouth Excerpts
Lord Knight of Weymouth (Lab)

My Lords, I remind the House of my interests, particularly in chairing the board of Century-Tech, an AI edtech company— I will have to talk to the noble Baroness, Lady Kidron, about that. I am a director of Educate Ventures Research, which is also an AI-in-education business, and Goodnotes, an AI note-taking app. It is a pleasure to follow the noble Baroness, Lady Kidron. I agreed with most of what she said, and I look forward to working with her on the passage of the Bill.

I guess we are hoping it is third time lucky with a data Bill. I am sure we will hear from all speakers a sense that this Bill is an improvement on the previous two attempts. It is a particular joy to see that terrible stuff around DWP data not appearing in this Bill. There is plenty that I welcome in terms of the improvements. Like most speakers, I imagine, I mostly want to talk about what might need further debate and what might be missing, rather than just congratulating my noble friend the Minister on the improvements she and her colleagues have been able to make.

I anticipate that this will not be the only Bill we have on data and AI. It would be really helpful for this Government to rediscover the joys of a White Paper. If we had a document that set out the whole story and the vision, so that we could more easily place this Bill in context, that would be really helpful. This could include where we are with the Matt Clifford action plan, and a very clear aim of data adequacy with the EU regime. I wonder whether, among all the people the Minister said she had been able to talk to about this Bill, she had also spoken to the EU to make sure we are moving in the right direction with adequacy, which has to be resolved by the summer.

Clearly, this is a Bill about data. The Minister said that data is the DNA of modern life. It has achieved a new prominence with the rollout of generative AI, which has captured everyone’s imagination—or entered their nightmares, depending on how you think about it. The Communications and Digital Committee, which I am privileged to serve on, has been thinking about that in respect of the future of news, which we will publish a report on shortly, and of scaling our businesses here in the UK. It is clear that the core ingredients you need are computing power, talent, finance and, of course, data, in order successfully to grow AI businesses here.

I agree with the noble Baroness, Lady Kidron, that we have a unique asset in our public sector datasets that the US does not have to anything like the same extent—in particular in health, but also in culture and education. It is really important that the Government have a regime, established by this legislation and any other legislation we may or may not know about, to protect and deploy that data to the public benefit and not just the private benefit, be it in large language models or other foundational models of whatever size.

It is then also important to ask, whose data is it? In my capacity as chair of a board of an AI company, I am struck by the fact that our current financial regulation does not allow us to list our data as an asset on our balance sheet. I wonder when we might be able to move in that direction, because it is clearly of some significance to these sorts of businesses. But it is also true that the data I share as a citizen, and have given consent to, should be my data. I should have the opportunity to get it back quite easily and to decide who to share it with, and it should empower me as a citizen. I should be able to hold my own data, and I definitely should not have to pay twice for it: I should not have to pay once through my taxes and then a second time by having to pay for a product that has been generated by the data that I paid for the first time. So I am also attracted to what the noble Baroness said about data as a sovereign asset.

In the same way that both Front-Bench speakers were excited about the national underground asset register, I am equally excited about the smart data provisions in the Bill, particularly in respect of the National Health Service. Unfortunately, my family have been intensive users of the National Health Service over the past year or so, and the extent to which the various elements of our NHS do not talk to each other in terms of data is a tragedy that costs lives and that we urgently need to resolve. If, as a result of this Bill, we can take the glorious way in which I can share my banking data with various platforms in order to benefit myself, and do the same with health data, that would be a really good win for us as a nation. Can the Minister reassure me that the same could be true for education? The opportunity to build digital credentials in education by using the same sort of technology that we use in open banking would also excite me.

I ask the Minister also to think about and deliver on a review of Tell Us Once, which, when I was a Minister in the DWP a long time ago, I was very happy to work on. By using Tell Us Once, on the bereavement of a relative, for example, you have to tell only one part of the public sector and that information then cascades across. That relieves you of an awful lot of difficult admin at a time of bereavement. We need a review to see how this is working and whether we can improve it, and to look at a universal service priority register for people going through bereavement in order to prioritise services that need to pass the message on.

I am concerned that we should have cross-sector open data standards and alignment with international interoperability standards. There is a danger in the Bill that the data-sharing provisions are protected within sectors, and I wonder whether we need some kind of authority to drive that.

It is important to clarify that the phrase used in the first part of the Bill, a

“person of a specified description”,

can include government departments and public bodies so that, for example, we can use those powers for smart data and net-zero initiatives. Incidentally, how will the Government ensure that the supply chains of transformers, processors, computing power and energy are in place to support AI development? How will we publish the environmental impact of that energy use for AI?

There is a lot more I could say, but time marches on. I could talk about digital verification services, direct marketing and a data consent regime, but those are all things to explore in Committee. However, there are two other things that I would briefly like to say before winding up. First, I have spoken before in this House about the number of people who are hired, managed and fired by AI automated decision-making. I fear that, under the Bill as drafted, those people may get a general explanation of how the automated decision-making algorithms are working, when in those circumstances they need a much more personalised explanation of why they have been impacted in this way. What is it about you, your socioeconomic status and the profile that has caused the decision to go the way it has?

Secondly, I am very interested in the role of the Digital Regulation Cooperation Forum in preventing abuse and finding regulatory gaps. I wonder whether the new Government have a view on the perennial calls in this Chamber, when debating Bills such as this, for a permanent Committee of both Houses to monitor digital regulation. I know that that is a matter for the usual channels and not Ministers, but it is a really important thing for this House to move on. I am fairly bored with making the case over the past two or three years.

In summary, this is a good Bill but it is a long Bill, and there is lots to do. I wish the Minister good luck with it.

AI Technology Regulations

Lord Knight of Weymouth Excerpts
Tuesday 30th July 2024

Lords Chamber

Asked by
Lord Knight of Weymouth

To ask His Majesty’s Government what plans they have to regulate artificial intelligence technologies.

The Parliamentary Under-Secretary of State, Department for Science, Innovation and Technology (Baroness Jones of Whitchurch) (Lab)

My Lords, as set out in the King’s Speech, we will establish legislation to ensure the safe development of AI models by introducing targeted requirements on a handful of companies developing the most powerful AI systems. The legislation will also place the AI Safety Institute on a statutory footing, providing it with a permanent remit to enhance the safety of AI. We will consult publicly on the details of the proposals before bringing forward legislation.

Lord Knight of Weymouth (Lab)

I thank my noble friend the Minister for her reply and congratulate her on her appointment. There is no doubt that AI will be an important part of the economic growth that is this Government’s priority, but there are also growing concerns about the potential harms being caused by this technology, in particular around the creation of deepfake content to pervert the outcome of elections. What is the Government’s view on that potential harm to democracy, and are there any plans to extend the regulation to political advertising, as recommended in the 2020 report to this House from the Democracy and Digital Technologies Select Committee?

Baroness Jones of Whitchurch (Lab)

I thank my noble friend for those good wishes. Of course, he is raising a really important issue of great concern to all of us. During the last election, we felt that the Government were well prepared to ensure the democratic integrity of our UK elections. We did have robust systems in place to protect against interference, through the Defending Democracy Taskforce and the Joint Election and Security Preparedness unit. We continue to work with the Home Office and the security services to assess the impact of that work. Going forward, the Online Safety Act goes further by putting new requirements on social media platforms to swiftly remove illegal misinformation and disinformation, including where it is AI-generated, as soon as it becomes available. We are still assessing the need for further legislation in the light of the latest intelligence, but I assure my noble friend that we take this issue extremely seriously. It affects the future of our democratic process, which I know is vital to all of us.

Digital Markets, Competition and Consumers Bill

Lord Knight of Weymouth Excerpts
Moved by
73A: After Clause 109, insert the following new Clause—
“CMA cooperation with work and labour market institutions
(1) The CMA must take reasonable steps to consult with—
(a) the Office of Labour Market Enforcement,
(b) the Health and Safety Executive,
(c) the Employment Agency Standards Directorate, and
(d) HMRC
where the CMA considers that the institution holds or has a right to request information, knowledge or documentation that may be relevant to the exercise of its regulatory functions.
(2) The CMA must, following consultation under subsection (1), undertake a regulatory function analysis and make recommendations regarding the following additional questions—
(a) whether action should be taken by the institution or others to materially affect competition in line with the CMA’s objectives, and
(b) if so, what action should be taken.
(3) The institutions named in subsection (1) may make a recommendation or other requests to the CMA where they consider that the CMA may exercise a regulatory function.
(4) A recommendation or other request under subsection (3) must be accompanied by a statement of reasons which sets out the rationale and any substantial legal or evidential questions identified by the institution for consideration by the CMA.
(5) In this section, a material effect on competition is deemed to include a significant impact on the creation, displacement or alteration in the conditions or quality of work and environment for work in the United Kingdom.”
Member’s explanatory statement
This amendment enables cooperation between the CMA and work and labour market regulators.
Lord Knight of Weymouth (Lab)

My Lords, this group contains a range of amendments on competition reforms. They are fairly wide-ranging and I will leave it to the proposers of the other amendments to summarise them.

Amendment 73A, in my name and the names of the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, returns us to an issue that we debated last Monday and on which the noble Lord, Lord Clement-Jones, moved an amendment. It is the issue of good work and the CMA. I apologise for returning to the issue, but that was stimulated by the Minister, the noble Viscount, Lord Camrose, saying that

“the CMA may identify actions that other regulators or public bodies would be better placed to act upon. This may include the DMU referring issues such as workplace conditions to a relevant regulator”.—[Official Report, 22/1/24; col. GC 132.]

I reflected on it and thought that there may be some merit in seeing whether or not we can empower it in the Bill. Subsequent reading and events have reinforced that view. The purpose of these amendments is to promote cross-regulator co-operation and information sharing.

Our current approach to regulation rests on domain or sector-specific action, which demands a high level of co-operation and co-ordination. This means sharing information and knowledge, as well as technical and non-technical skills and resources, exactly as was publicly requested by the director of labour market enforcement, Margaret Beels, in her letter of April 2023 to the BEIS Committee in the other place on AI and the labour market. I remind the Committee that the office of the director of labour market enforcement is effectively an arm’s-length body of the Minister’s department. Her letter said:

“There is a need for cross-cutting collaboration with regulation in this space … There is no vehicle or champion for doing this”


at present.

--- Later in debate ---
Lord Offord of Garvel (Con)

I need to write to the noble Lord on that.

I now speak briefly to the government amendments in this group, all of which are minor and technical in nature. First, Amendments 90, 91 and 92 ensure that extensions to the statutory deadlines for phase 2 merger investigations under the new fast track procedure for mergers operate correctly within the existing legal framework for deadline extensions under the Enterprise Act 2002.

Secondly, government Amendments 94, 95, 97, 98, 99, 100 and 102, will clarify that, in the civil penalty provisions introduced and amended by Schedules 9 and 10 to the Bill, references to maximum amounts of daily penalties are maximums per day and not in total.

Thirdly, Amendments 96 and 101 update cross-references in Section 120 of the Enterprise Act 2002, so that decisions made under the civil penalty provisions in Part 3 of that Act, as amended by the Bill, are carved out from that provision. Section 120 allows persons to seek a review of a CMA decision in the CAT on judicial review principles. Such a review is not required because penalty decisions are appealable on a merits basis.

Fourthly, Amendment 103 makes the equivalent amendment to Section 179 in relation to civil penalty decisions made under Part 4 of the Enterprise Act.

Finally, Amendments 104 and 105 have been introduced to take account of an amendment made by the Energy Act 2023 to Section 124(5) of the Enterprise Act 2002, which is also amended by the Bill.

I hope noble Lords will support these government amendments.

Lord Knight of Weymouth (Lab)

My Lords, we have had a useful debate. I was very much persuaded by the noble Lord, Lord Tyrie—far more so than the Minister was—and I thought that the noble Lord, Lord Clement-Jones, made some useful points around asymmetry in respect of search and media.

I am very grateful to all noble Lords who responded to my amendments. I kind of feel that my friend, the noble Baroness, Lady Harding, and the noble Lord, Lord Ranger, were in many ways responding to last week’s debate—I think as the noble Baroness admitted. It is perfectly possible to argue that it is an encumbrance to extend the remit as we were arguing last week; that is a perfectly reasonable position. Indeed, just yesterday in the Observer, I read Torsten Bell from the Resolution Foundation responding to the CMA chief executive’s speech around the labour market and competition, saying that this is neither a case for minimum labour standards nor a case for extending regulatory reach. They have friends in all sorts of places.

The EU announced a fine of £27 million against Amazon for oversurveillance of workers. These are real problems, and there is a regulatory gap that would be best addressed, I am sure, by having a single powerful labour market regulator. At the moment, we have a multiplicity of relatively weak regulators. That might solve some of the regulatory gap problem.

The debate this week was much more about collaboration between regulators. I feel that the Minister failed to really address and respond to the point. He might want to follow up by having a meeting just to sort out whether, in essence, Margaret Beels, the director of labour market enforcement, is wrong. In her letter to the BEIS Select Committee on 6 April 2023, under the bullet point on regulation, she said that:

“There is a need for cross-cutting collaboration with regulation in this space to bring different aspects together both within the UK and across the international playing field. There is also a need to learn from each other. There is no vehicle or champion for doing this”.


If the Minister was listening, he will have heard that I said that earlier. He performs his notes brilliantly, but one of these regulators is saying that there is “no vehicle or champion” for regulatory co-operation in respect of AI. We need to fill that regulatory gap, and this Bill is an opportunity for us to do so. It is urgent because of the exploitation of some workers. We need to get on with it and I hope that, as this Bill proceeds, we find an opportunity to act; I would be delighted to do so in collaboration and co-operation with the Government Front Bench.

On that basis, I beg leave to withdraw my amendment.

Amendment 73A withdrawn.
Lord Knight of Weymouth (Lab)

My Lords, it is a great pleasure, as ever and once more, to follow the noble Baroness, Lady Stowell. I particularly endorse the comment she made about having a Joint Committee, which I also made repeatedly during the course of the Online Safety Act. I am pleased to note the precedent she noticed, which I did not, and I support what she had to say. I remind your Lordships of my interests in the register, particularly as the chair of tech company CENTURY Tech and a co-owner of Suklaa Limited, which has a number of tech clients.

Like all other speakers so far, I very much welcome the Bill but, like everybody else, I think except for the Minister, I question whether it goes far enough in creating a sufficiently robust regime to hold the large tech companies to account. I do not necessarily want to bash them, but it is notable that they are particularly wealthy and particularly litigious. If we want to have a meaningful regime, we need a robust set of regulators to take them on. In September, the European Commission listed six of them—Alphabet, Amazon, Apple, Meta, Microsoft and ByteDance—as the gatekeepers under its new Digital Markets Act. That feels like roughly a good list of companies for us to keep in mind.

I was amused to look back just over 20 years to the anti-trust case taken against Microsoft. At that time, Microsoft was the gatekeeper as everyone was using personal computers to access the internet, and the likes of Apple were pushing for the competition authority in the US, the Federal Trade Commission, to take action, so that it could free up browsers and operating systems to allow consumers to access the internet through other sources. Happily, that pressure won out, and Microsoft had to yield and lost the anti-trust case. It is now time for us to take action, in particular on the issue of app stores. I am delighted that the noble Baroness, Lady Harding, is in her place, because she and I collaborated a little, and she led, on trying to get app stores included within the competence of the Online Safety Act. There is no doubt that we are now all accessing the internet predominantly on phones and iPads. The latest data that I have seen from Statista says that, in the UK, 60% of us use smartphones as the most important device to access the internet, and another 12% use tablets such as iPads. That is 72% of us going through either the Apple App Store or the Android store to access the applications that we need to access content.

How do those app stores work? If you want to collect money through them, they take a percentage of that money—roughly 30%. That is on top of VAT at 20%, assuming you are liable to pay VAT, so you have lost 50% of your revenue before you have even started. That is a massive constraint on small businesses being able to get set up. We see that Spotify—one of the companies which have tried to come to talk to me—has, as I read in the newspaper, cut 1,500 jobs today. Perhaps if it was able to keep some more of its revenue and not have it taken by one or the other of those two platforms, some of those jobs would not be lost.

But it is about more than the money: it is also about the data that those two companies can collect through their app stores and analyse to see what applications, and what features within those, are doing well. Then, if they choose to, they can create competitor applications or block applications that they are concerned about. They will not block them overtly: they will just delay the process of approval through their systems—lo and behold, another release of iOS or other operating system is published, and the apps go to the back of the queue in the test pilot system before they can get approval to get on to the app store. All that is a massive constraint on small businesses being able to access and enter the market. I was struck by the speech by the noble Baroness, Lady Hayman, on planned obsolescence—that use of the release of the operating systems to make our devices obsolete is something that a powerful regulator could really help with, in ensuring that our devices remain current.

We need to act urgently in this country, and we need to be able to act internationally as well. Does the Minister honestly believe we have enough powers in the Bill for us to take on the really tricksy issue of these app stores? Will we be able to force them to offer alternative payment systems, so that they do not cream off all the money, and alternative routes to download, so that, if I wanted to download an application on my iPhone, I would not have to go through the app store if I did not want to? That would then open us up to more competition.

I move on to the issue of data a little more. In this House, I have previously raised my concern that an individual such as Elon Musk has all that data on transport movements through Tesla, on communications through his satellite company and on sentiment through his ownership of the company that used to be known as Twitter. That is just one example of a consolidation through horizontal integration, if you like, of data ownership. He, or others in similarly powerful positions, can point the same artificial intelligence machine at each of those individual data lakes, even if they are kept discrete, and get the benefit of being able to train the AI on the different sources of data and create power that nobody else has access to. That would give him a massive competitive advantage.

But it is bigger than just Musk: if you look at the amount of data that Google is collecting about us all at any given time, with all the integration that it has—or any one of the six tech giants that I listed earlier—it is a massive issue. Again, the CMA needs to have some ability to go after this data ownership issue, which is not about verticals but horizontals. I am not sure that it is within the regime or the thinking at the moment, and I would love to hear the Minister’s reassurance on it.

Like the noble Viscount, Lord Colville, I have concerns about the competitive landscape for digital advertising. In the second quarter of 2022, Meta and Google made up 87.3% of total digital ad spend in the UK. It has fallen slightly since, with a greater share being invested in mobile-first platforms such as Snapchat and TikTok. This is in the context of online advertising spending making up 25% of total ad spend in this country. The DCMS has reviewed it and said that there is a lack of transparency and a need for action. However, at the end of its report, the DCMS says:

“In order to be ready to bring forward legislation to implement these reforms when Parliamentary time allows, we will be issuing a further consultation seeking views on these proposals”.


We have a vehicle here in the Bill. Why are we not taking action now to open up competition in digital advertising? Why are we waiting for parliamentary time when we have time now? Where is the sense of urgency from the Government around this important issue that the noble Viscount referred to?

Like others, I have looked at the correspondence on gift aid and would support action to be taken on it.

I know that the noble Baroness, Lady Kidron, who will be speaking later, has also raised the important issue of researcher access, which we came to in the Online Safety Act. Again, if we could use this vehicle to open up researcher access via the regulator to these large companies, then we could have some oversight over what is going on, so that we could inform better parliamentary scrutiny and regulation of these large, powerful and litigious organisations.

In the end, this is about the power of the internet for good and for ill. As we have heard, we have a suite of legislation before us, of which this is just one Bill, in order to create, hopefully, powerful, agile regulators who can collaborate and give confidence and safety for consumers to realise the transformational potential of technology and not the harms that we are all concerned about.