Artificial Intelligence: Regulation

Monday 10th February 2025

Lords Chamber
Lord Vallance of Balham (Lab)

As the noble Lord points out, getting regulation right here is good for investment and good for business. We are taking the approach of regulation by the existing regulators for the use of AI. We intend to bring forward legislation which allows us to safely realise the enormous benefits of AI in the frontier space. Of course, in the Data (Use and Access) Bill, some of the issues the noble Lord raised are already addressed.

Viscount Colville of Culross (CB)

My Lords, last week, the Startup Coalition of AI companies told a House of Commons Joint Committee that the Government should support a full commercial text and data mining model for AI training which would get rid of all copyright licensing for commercial AI training in the UK. Does the Minister support this suggestion?

Lord Vallance of Balham (Lab)

As I think I have made clear on several occasions at this Dispatch Box, we do not support that position. We believe that there needs to be control for creators; we need much better transparency in the system, and there needs to be access to use those images for AI. Those three things go hand in hand.

Data (Use and Access) Bill [HL]

Lord Stevenson of Balmacara (Lab)

I am grateful to the noble Lord, Lord Black, for daring to respond to the wonderful speech that opened the debate; I thought I might come in immediately afterwards, but I was terrified by it, so I decided that I would shelter on these Benches and gather my strength before I could begin to respond.

I feel that I have to speak because I am a member of the governing party, which is against these amendments. However, I have signed up to them because I have interests in the media—which I declare; I suppose I should also declare that I have a minor copyright, but that is very small compared with the ones we have already heard about—and because I feel very strongly that we will get ourselves into even more trouble unless action is taken quickly. I have a very clear view of the Government’s proposals, thanks to a meeting with my noble friend the Minister yesterday, where he went through, in detail, some of the issues and revealed some of the thinking behind them; I hope that he will come back to the points he made to me when he comes to respond.

There is no doubt that the use of a copyright work without the consent of the copyright owner in the United Kingdom is an infringement, unless it is “fair dealing” under UK copyright law. However, because of the developments in technology—the crawlers, scrapers and GAI that we have been hearing about—there is a new usage of a huge number of copyright works for the training of algorithms. That has raised questions about whether, and if so how, such usage has to be legislated for as “fair dealing”—if it is to be so—or in some other way, if there is indeed one.

It is right, therefore, for the Government to have required the IPO to carry out a consultation on copyright and AI, which we have been talking about. However, given the alarm and concern evident in the creative sector, we certainly regret the delay in bringing forward this consultation and we are very concerned about its limited scope. Looking at it from a long way away, it seems that this is as much a competition issue as it is a copyright issue. It seems to me and to many others, as we have heard, that the IPO, by including in the consultation document a proposed approach described as an “exception with rights reservation”, has made a very substantial mistake.

This may just be a straw-person device designed to generate more responses, but, if so, it was a bad misjudgement. Does it not make the whole consultation exercise completely wasteful and completely pointless to respond to? When my noble friend the Minister comes to respond, I hope that he, notwithstanding that proposed approach, will confirm that, as far as the Government are concerned, this is a genuine consultation and that all the possible options outlined by the IPO—and any other solutions brought forward during the consultation—will be properly considered on their merits and in the light of the responses to the consultation.

What the creative industries are telling us—they have been united and vehement about this issue, as has already been described, in a way that I have never seen before—is that they must have transparency about what material is being scraped, the right to opt in to the TDMs taking place and a proper licensing system with fair remuneration for the copyright material used. The question of whether the GAI developers should be allowed to use copyright content, with or without the permission of the copyright owner, is a nuanced one, as a decision either way will have very wide-ranging ramifications. However, as we have heard, this issue is already affecting the livelihood of our creative sector—the one that, also as we have heard, we desperately need if we are to support a sustainable creative economy and provide the unbiased information, quality education and British-based entertainment that we all value and want to see flourish.

We understand the need to ensure that the companies that want access to high-quality data and copyright material to train their AI models respect, and will be happy to abide by, any new copyright or competition regulations that may be required. However, the proposals we have heard about today—the ones that would come from the consultation, if we have to delay—will probably be very similar to the amendments before the House, which are modest and fair. We should surely not want to work with companies that will not abide by such simple requirements.

Viscount Colville of Culross (CB)

My Lords, I support Amendment 44A and the consequential amendments in this group in the name of my noble friend Lady Kidron, whose speech has, I think, moved the whole Committee across all Benches.

Data (Use and Access) Bill [HL]

Moved by
14: Clause 67, page 75, line 10, after “scientific” insert “and that is conducted in the public interest”
Member’s explanatory statement
This amendment ensures that to qualify for the scientific research exception for data reuse, that research must be in the public interest. This requirement already exists for medical research, but this amendment would apply it to all scientific research wishing to take advantage of the exception.
Viscount Colville of Culross (CB)

My Lords, I thank my noble friend Lady Kidron and the noble Viscount, Lord Camrose, for adding their signatures to my Amendment 14. I withdrew this amendment in Committee, but I am now asking the Minister to consider once again the definition of “scientific research” in the Bill. If he cannot satisfy me in his speech this evening, I will seek the opinion of the House.

I have been worried about the safeguards for defining scientific research since the Bill was published. This amendment will require that the research should be in “the public interest”, which I am sure most noble Lords will agree is a laudable aim and an important safeguard. This amendment has been looked at in the context of the Government’s recent announcements on turning this country into an AI superpower. I am very much a supporter of this endeavour, but across the country there are many people who are worried about the need to set up safeguards for their data. They fear data safety is threatened by this explosion of AI and its inexorable development by the big tech companies. This amendment will go some way to building public trust in the AI revolution.

The vision of Donald Trump surrounded at his inauguration yesterday by tech billionaires, most of whom have until recently been Democrats, puts the fear of God into me. I fear their companies are coming for our data. We have some of the best data in the world, and it needs to be safeguarded. The AI companies are spending billions of dollars developing their foundation models, and they are beholden to their shareholders to minimise the cost of developing these models.

Clause 67 gives a huge fillip to the scientific research community. It exempts research which falls within the definition of scientific research as laid out in the Bill from having to gain new consent from data subjects to reuse millions of points of data.

It costs the tech companies time and money to get renewed consent from data holders before reusing their data. This is an issue we will discuss further when we debate amendments on scraping data from creatives without copyright licensing. It is clear from our debates in Committee that many noble Lords fear that AI companies will do what they can to avoid either getting consent or licensing the data they scrape. Defining their research as scientific will allow them to escape these constraints. I could not be a greater supporter of the wonderful scientific research that is carried out in this country, but I want the Bill to ensure that it really is scientific research and not AI development camouflaged as scientific research.

The line between product development and scientific research is often blurred. Many developers posit efforts to increase model capabilities, efficiency, or indeed the study of their risks, as scientific research. The balance has to be struck between allowing this country to become an AI superpower and exploiting its data subjects. I contend that this amendment will go far to allay public fears of the abuse and use of their data to further the profits and goals of huge AI companies, most of which are based in the United States.

Noble Lords have only to look at the outrage last year at Meta’s use of Instagram users’ data without their consent to train the datasets for its new Llama AI model to understand the levels of concern. There were complaints to regulators, and the ICO posted that Meta

“responded to our request to pause and review plans to use Facebook and Instagram user data to train generative AI”.

However, so far, there has been no official change to Meta’s privacy policy that would legally bind it to stop processing data without consent for the development of its AI technologies, and the ICO has not issued a binding order to stop Meta’s plans to scrape users’ data to train its AI systems. Meanwhile, Meta has resumed reusing subjects’ data without their consent.

I thank the Minister for meeting me and talking through Amendment 14. I understand his concern that a public interest threshold in the definition of scientific research will create a heavy burden for researchers, but I think it is worth the risk in the name of safety. Some noble Lords are concerned about the difficulty of defining “public interest”. However, the ICO has very clear guidelines about what the public interest consists of. It states that

“you should broadly interpret public interest in the research context to include any clear and positive public benefit likely to arise from that research”.

It continues:

“The public interest covers a wide range of values and principles about the public good, or what is in society’s best interests. In making the case that your research is in the public interest, it is not enough to point to your own private interests”.


The guidance even includes further examples of research in the public interest, such as

“the advancement of academic knowledge in a given field … the preservation of art, culture and knowledge for the enrichment of society … or … the provision of more efficient or more effective products and services for the public”.

This guidance is already being applied in the Bill to sensitive data and public health data. I contend that if these carefully thought-through guidelines are good enough for health data, they should be good enough for all scientific data.

This view is supported in the EU, where

“the special data protection regime for scientific research is understood to apply where … the research is carried out with the aim of growing society’s collective knowledge and wellbeing, as opposed to serving primarily one or several private interests.”

The Minister will tell the House that the data exempted to be used for scientific research is well protected—that it has both the lawfulness test, as set out in the UK GDPR, and a reasonableness test. I am concerned that the reasonableness test in this Bill references

“processing for the purposes of any research that can reasonably be described as scientific, whether publicly or privately funded and whether carried out as a commercial or non-commercial activity”.

Normally, a reasonableness test requires an expert in the context of that research to decide whether it is reasonable to consider it scientific. In this Bill, however, “reasonable” means only that an ordinary person in the street could consider the research scientific. That must broaden the threshold of the definition.

It seems “reasonable” in the current climate to ask the Government to include a public interest test before giving the AI companies extensive scope to reuse our data, without getting renewed consent, on the pretext that the work is for scientific research. In the light of possible deregulation of the sector by the new regime in America, it is incumbent on this country to ensure that our scientific research is dynamic, but safe. If the Government can bring this reassurance, they will increase trust in Britain’s AI revolution for millions of people in this country. I beg to move.

Baroness Kidron (CB)

My Lords, I support my noble friend Lord Colville. He has made an excellent argument, and I ask noble Lords on the Government Benches to think about it very carefully. If it is good enough for health data, it is good enough for the rest of science. In the interest of time, I will give an example of one of the issues, rather than repeat the excellent argument made by my noble friend.

In Committee, I asked the Government three times whether the cover of scientific research could be used, for example, to market-test ways to hack human responses to dopamine in order to keep children online. In the Minister’s letter, written during Committee, she could not say that the A/B testing of millions of children to make services more sticky—that is, more addictive—would not be considered scientific, but rather that the regulator, the ICO, could decide on a case-by-case basis. That is not good enough.

There is no greater argument for my noble friend Lord Colville’s amendment than the fact that the Government are unable to say if hacking children’s attention for commercial gain is scientific or not. We will come to children and child protection in the Bill in the next group, but it is alarming that the Government feel able to put in writing that this is an open question. That is not what Labour believed in opposition, and it is beyond disappointing that, now in government, Labour has forgotten what it then believed. I will be following my noble friend through the Lobby.

--- Later in debate ---
I hope the noble Viscount is content to withdraw this amendment, given these reassurances and the concerns about a significant unintended consequence from going down this route.
Viscount Colville of Culross (CB)

My Lords, I am grateful and impressed that the Minister has stepped into this controversial sphere of data management at such short notice. I wish his colleague, the noble Baroness, Lady Jones, a swift recovery.

I hope that noble Lords listened to the persuasive speeches that were given across the Benches, particularly from my noble friend Lady Kidron, with her warning about blurring the definition of scientific research. I am also grateful to the Opposition Benches for their support. I am glad that the noble Lord, Lord Markham, thinks that I am threading the needle between research and public trust.

I listened very carefully to the Minister’s response and understand that he is concerned by the heavy burden that this amendment would put on scientific research. I have listened to his explanation of the OECD Frascati principles, which define scientific research. I understand his concern that the rigorous task of demanding that new researchers have to pass a public interest test will stop many from going ahead with research. However, I repeat what I said in my opening speech: there has to be a balance between generating an AI revolution in this country and bringing the trust of the British people along with it. The public interest test is already available for restricted research in this field; I am simply asking for it to be extended to all scientific research.

I am glad that the reasonableness and lawfulness tests are built into Clause 67, but I ask for a test that I am sure most people would support—that the research should have a positive public benefit. On that note, I would like to seek the opinion of the House.

Data (Use and Access) Bill [HL]

Baroness Freeman of Steventon (CB)

My Lords, I support Amendment 34 from the noble Lord, Lord Clement-Jones, and will speak to my own Amendment 35, which amends it. When an algorithm is being used to make important decisions about our lives, it is vital that everyone is aware of what it is doing and what data it is based on. On Amendment 34, I know from having had responsibility for algorithmic decision support tools that users are very interested in how recent the data it is based on is, and how relevant it is to them. Was the algorithm derived from a population that included people who share their characteristics? Subsection (1)(c)(ii) of the new clause proposed in Amendment 34 refers to regular assessment of the data used by the system. I would hope that this would be part of the meaningful explanation to individuals to be prescribed by the Secretary of State in subsection (1)(b).

Amendment 35 would add to this that it is vital that all users and procurers of such a system understand its real-world efficacy. I use the word “efficacy” rather than “accuracy” because it might be difficult to define accuracy with regard to some of these systems. The procurer of any ADM system should want to know how accurate it is using realistic testing, and users should also be aware of those findings. Does the system give the same outcome as a human assessor 95% or 60% of the time? Is that the same for all kinds of queries, or is it more accurate for some groups of people than others? The efficacy is really one of the most important aspects and should be public. I have added an extra line that ensures that this declaration of efficacy would be kept updated. One would hope that the performance of any such system would be monitored anyway, but this ensures that the outcomes of such monitoring are in the public domain.

In Committee, the Minister advised us to wait for publication of the algorithmic transparency records that were released in December. Looking at them, I think they make clear the much greater need for guidance and stringency in what should be mandated. I will give two short examples from those records. For the DBT: Find Exporters algorithm, under “Model performance” it merely says that it uses Brier scoring and other methods, without giving any actual results of that testing to indicate how well it performs. It suggests looking at the GitHub pages. I followed that link, and it did not allow me in. The public have no access to those pages. This is why these performance declarations need to be mandated and forced to be in the public domain.

In the second example, the Cambridgeshire trial of an externally supplied object detection system just cites the company’s test data, claiming average precision in a “testing environment” of 43.5%. This does not give the user a lot of information. Again, it links to GitHub pages produced by the supplier. Admittedly, this is a trial, so perhaps the Cambridgeshire Partnership will update it with its real-world trial data. But that is why we need to ensure annual updates of performance data and ensure that that data is not just a report of the supplier’s claims in a test environment.

The current model of algorithmic transparency records is demonstrably not fit for purpose, and these provisions would help put them on a much firmer footing. These systems, after all, are making life-changing decisions for all of us and we all need to be sure how well they are doing and put appropriate levels of trust in them accordingly.

Viscount Colville of Culross (CB)

My Lords, I have added my name to Amendment 36 tabled by the noble Lord, Lord Clement-Jones. I also support Amendments 26, 27, 28, 31, 32 and 35. The Government, in their AI Statement last week, said that ADM will be rolled out across the public sector in the coming months and years. It will increase productivity and provide better public services to the people of this country.

However, there are many people who are fearful of their details being taken by an advanced computer, and a decision which could affect their lives being made by that computer. Surely the days of “computer says no” must be over. People need to know that there is a possibility of a human being involved in the process, particularly when dealing with the public sector. I am afraid that my own interactions with public sector software in various government departments have not always been happy ones, and I have been grateful to be able to appeal to a human.

Online Safety

Thursday 16th January 2025

Lords Chamber
Lord Vallance of Balham (Lab)

I thank the noble Lord for his question. Category 1, in the way that the Bill was ultimately approved, was for large sites with many users. The possibility remains that this threshold can be amended. It is worth remembering that category 1 imposes two additional duties: a duty that the company must apply its service agreements properly and a duty to give users the means to avoid seeing certain things. For many of the small and harmful sites, those things would not apply anyway, because users have gone there deliberately to see what is there, but the full force of the Act applies to those small companies, which is why there is a special task force to make sure that it is applied properly.

Viscount Colville of Culross (CB)

My Lords, Ofcom’s illegal harms code states that it has removed some of the code’s measures from smaller sites, due to evidence that they were not proportionate, but it is not clear which measures have been removed and why. Can the Minister provide further detail on which small sites are impacted and what measures they will not be required to follow?

Lord Vallance of Balham (Lab)

My understanding of this is that the Online Safety Act applies to all small companies and nobody is exempt. The things that would not apply would be the specific things in category 1, or indeed in categories 2A and 2B, which are to do with the ability to apply and monitor a service contract, and the ability to ensure that users can exempt themselves from seeing certain activities. Those would not apply, but everything else does apply, including all the force of the Act in terms of the application to illegal content and the priority harms that have been identified.

Artificial Intelligence Opportunities Action Plan

Thursday 16th January 2025

Lords Chamber
Lord Vallance of Balham (Lab)

I thank the noble Baroness for her input to date and on the important copyright issue. The question of market dominance is important. It is worth reflecting that Matt Clifford is an entrepreneur who deals with start-ups; the report is very strong on start-ups and on what needs to be done to make sure that they are part of this, including what regulatory change needs to take place to encourage them. At the moment, it is quite difficult for them to navigate the system, including procurement. Government procurement is notoriously difficult for start-ups, and many of the specific aims of the plan pull that together to allow start-ups to access government procurement.

So there are very clear ambitions here to make this about growing an ecosystem of companies in this country, while recognising that many of the existing major companies, with which we will also have to work, are not here. Driving this forward will be a key task for DSIT right the way across government. It will need all-of-government activity, as outlined in the report.

Viscount Colville of Culross (CB)

My Lords, the Minister talked about the national data library, which is very welcome, but data in the library needs to be safe and its use carefully thought through. What role does the Minister think public interest thresholds should play in deciding what data is collected and how it should be used?

Lord Vallance of Balham (Lab)

Noble Lords will hear much more about the national data library over the coming months, but it is important to recognise that data is valuable only if it is collected well, curated properly, and made interoperable and accessible. We need to ensure that it is properly protected, both for individual privacy, which is the point the noble Lord raises, and to make sure that we get the appropriate valuation of the data and that that value flows back into the UK and into public services. These will all be key features of the national data library.

Artificial Intelligence: Regulation

Thursday 17th October 2024

Lords Chamber
Viscount Colville of Culross (CB)

My Lords, this Government have pledged to recalibrate trade relations with the EU. However, the new EU AI legislation is much more prescriptive than the regulation proposed by the Government. How will the Government ensure that UK-based AI organisations with operations in the EU, or which deploy AI into the EU, will be aligned with EU regulation?

Lord Vallance of Balham (Lab)

As the noble Viscount points out, the EU regulation has gone in a somewhat different direction in taking a very broad approach and not a sector-specific approach. In contrast, the US looks as though it is going down a similar sort of route to the one that we are taking. Of course, there will be a need for interoperability globally, because this is a global technology. I think that there will be consultation and interactions with both those domains as we consider the introduction of the AI Act, and we are starting an extensive consultation process over the next few months.

King’s Speech (4th Day)

Monday 22nd July 2024

Lords Chamber
Viscount Colville of Culross (CB)

My Lords, as many other noble Lords have said, artificial intelligence will revolutionise our economy and our society during the next decade. It will radically improve our productivity, research capability and delivery of public services, to name but a few areas, so I am pleased that the Digital Information and Smart Data Bill will enable innovative uses of data to be safely developed and deployed.

I hope that this Bill will begin to address the wider risks AI poses to us all unless it is developed and released safely. This Government need to ensure that AI develops to support our economy and society, and that it does not take society in dangerous and unintended directions. At all stages of the training and deployment of AI, there are economic and social risks. There are dangers the whole way through the supply chain, from the initial data ingestion of the massive datasets needed to set up these foundation models to their training and deployment, which I hope will begin to be addressed by the Bill.

My concern is that there can be differences in the inputting and modification of AI models that humans do not consider significant, but which could have major and possibly adverse effects on the behaviour of AI systems. It is essential that formal verification techniques are applied throughout to prove the safety of these systems at every stage of the process.

However, the massive costs of training and developing these models, which can run into billions of pounds, have put huge pressure on the tech companies to monetise them and to do so quickly. This has led to rapid developments of systems, but underinvestment in safety measures. Many of us were impressed when at the Bletchley summit last year the previous Government obtained voluntary guarantees from the big AI developers to open up their training data and allow the latest generative AI models to be reviewed by the AI Safety Institute, allowing third-party experts to assess the safety of models. However, since then the commitment has not been adhered to. I am told that three out of four of the major foundation model developers have failed to provide pre-release access for their latest frontier models to the AI Safety Institute.

The tech companies are now questioning whether they need to delay the release of their new models to await the outcome of the institute’s safety tests. In a hugely competitive commercial environment, it is not surprising that the companies want to deploy them as soon as possible. I welcome the Government’s commitment during the election campaign to ensure that there will be binding regulation on big developers to ensure safe development of their models. I look forward to the Secretary of State standing by his promise to put on a statutory footing the release of the safety data from new frontier models.

However, these safety measures will take great expertise to enforce. The Government must give regulators the resources they need to ensure that they are effective. If the Government are to follow through with AI safety oversight by sectoral regulators, I look forward to the setting up of the new regulatory innovation office, which will both oversee where powers overlap and pinpoint lacunae in the regulation. However, I would like to hear from the Minister the extent of the powers of this new office.

I hope that at next year’s French summit on AI the Government will be at the centre of the development of standards of safety and will push for the closest international collaboration. There need to be joint evaluations of safety and international co-operation of the widest kind—otherwise, developers will just go jurisdiction shopping—so the Government need not just to work closely with the US Artificial Intelligence Safety Institute and the new EU AI regulator but to ensure transparency. The best way to do this is to involve multi-stakeholder international organisations, such as the ISO and the UN-run ITU, in the process. It might be slower, but it will give vital coherence to the international agreement for the development of AI safety.

I am glad to hear the Minister say that the Government will lead a drive to make this country the centre of the AI revolution. It is also good that DSIT will be expanded to bring in an incubator for AI, along with the strategy for digital infrastructure development. I hope that this will be combined with support for the creative industries, which generated £126 billion of revenue last year and grew by 6%, an amazing performance when set against the sluggishness of much of the rest of the economy. I hope that members of the creative industries will be on the Government’s new industrial strategy council and that the government-backed infrastructure bank will look not just at tangible assets but at the less tangible assets that need supporting and encouraging across the creative industries.

To bring together AI and the creative industries, the Government need to develop a comprehensive IP regime for datasets used to train AI models, as the noble Lord, Lord Holmes, just told us. There has been much in the press about the use of data without the creator’s consent, let alone remuneration. I hope that DSIT and DCMS will come together to generate an IP regime that will have data transparency, a consent regime and the remuneration of creators at its heart.

I hope that the gracious Speech will lead us into an exciting new digital era in which Britain is a leader in the safe, transparent development and rollout of a digital revolution across the world.

Deepfakes: General Election

Wednesday 8th May 2024

Lords Chamber
Viscount Camrose (Con)

First, let me absolutely endorse the noble Lord’s sentiment: this is a deplorable way to behave that should not be tolerated. From hearing the noble Lord speak of the actions, my assumption is that they would fall foul of the false communications offence under Section 179 of the Online Safety Act. As I say, these actions are absolutely unacceptable.

Viscount Colville of Culross (CB)

My Lords, noble Lords will be aware of the threat of AI-generated deepfake election messages flooding the internet during an election campaign. At the moment, only registered users have to put a digital imprint giving the provenance of the content on unpaid election material. Does the Minister think that a requirement to put a digital imprint on all unpaid election material should be introduced to counter fake election messages?

Viscount Camrose (Con)

The noble Viscount is right to point to the digital imprint regime as one of the tools at our disposal for limiting the use of deepfakes. I think we would hesitate to have a blanket law that all materials of any kind would be required to have a digital imprint on them—but, needless to say, we will take away the idea and consider it further.

I look forward to noble Lords’ contributions to this debate. There is not one answer to the questions we are posing, but I beg to move the amendment.
Viscount Colville of Culross (CB)

My Lords, I have put my name to Amendment 48 in the name of the noble Baroness, Lady Jones of Whitchurch, and I support dropping Clause 29 from the Bill.

These amendments are also about speeding up the process of stopping anti-competitive behaviour by the tech companies. It is essential that no hostages to fortune are given for tech company lawyers to drag out the process, as many noble Lords said, particularly in the first group.

I want noble Lords to bear in mind that, for every big tech company, each week it succeeds in delaying a decision against its anti-competitive practices is a week in which it earns millions of pounds while its competitors are left struggling in so many areas. Speed is of the essence.

As a former newspaper journalist, my most immediate field of concern is local and regional media, which are suffering from the anti-competitive behaviour of the tech companies. There has been a collapse in local newspapers over the past decade, and in the next three years this will turn into a major exodus, with huge areas of the country becoming local news deserts with nobody reporting on local councils, courts and other important civic activities.

The Digital Markets Taskforce study on digital advertising found that the tech companies had used network effects and economies of scale to dominate the market. It concluded that “more vibrant competition” in the market would improve

“the bargaining power of online news publishers”,

which would

“improve the health and sustainability of journalism in the UK”.

In turn, this would

“contribute positively to the effectiveness and integrity of our democracy”.

On top of this, much of the news content generated by these media companies is used by tech platforms either for free or for little remuneration.

I have long campaigned for the final offer mechanism to be available to the CMA as a powerful deterrent against anti-competitive behaviour by the tech companies, but surely all deterrents are more effective if there is a realistic chance that they will be deployed, and in a short time. Once conduct requirements on an SMS firm have been imposed, breached and reported, the CMA should be in a good position to know whether the designated SMS company will take the long or short road to a solution. Amendment 48 would allow the CMA to issue an enforcement order, decide whether that has been breached and investigate the breach, if it feels that it will lead to a satisfactory resolution of the company’s behaviour. However, if, earlier in the process, it becomes clear that a solution is not going to be possible, the regulator needs the power to bring forward its ultimate deterrent. No SMS firm will want to have the final offer mechanism imposed on it, and I understand that the CMA is equally reluctant to deploy it, but the more pressing the threat, the more likely it is that the DMU investigation will be brought to a quick and effective resolution.

I know that these companies will fight tooth and nail to preserve their massive profits resulting from the anti-competitive behaviours. It might be useful for the Committee if I give just one really shocking example of how effective these delaying actions can be. The salutary lesson is the story of a nascent shopping comparison site, Foundem, based in London and founded in 2005, which was doing very well until 2008, when it was massively deprioritised on Google Search, at about the same time that Google Shopping, the search engine’s own shopping comparison site, was set up. Foundem issued a complaint to the EU Commission in 2009 about anti-competitive behaviour by Google. The Commission set up an investigation and, three years later, after many legal arguments, Google was given a preliminary assessment—similar, I imagine, to an SMS designation. Rules were then laid down for the company to follow, but within six months market tests revealed that it was not tackling the anti-competitive behaviour. The response was dragged out by Google until 2016, when it was given a supplementary statement of objections, which was also heavily fought by the search engine.

Finally, on 27 June 2017, the EU imposed a record €2.4 billion fine on Google for violating EU competition law. However, the company appealed, first to the EU General Court and then to the Court of Justice of the European Union. Final judgment on the case has yet to be issued. Meanwhile, Foundem exists in order to fight the case, but it suspended all its services eight years ago. This is a 15-year David-versus-Goliath battle with a company, some of whose activities the CMA might have to designate. This legislation must be drafted to ensure that the process brings results, and fast, if small digital competitors are to have a chance of surviving.

Already the CMA estimates that the designation process will not become operational until June 2025. I know that the hope is to set up a designation process at the same time as negotiating the conduct requirements, but that could still take up to nine months to implement on the SMSs. Meanwhile, many of the smaller media outlets I talked about earlier will have gone under.

The same arguments for legal delay by tech companies must apply to Clause 29, which introduces the concept of countervailing benefits. I do not understand the need for Clause 29. Clearly, the balance between consumer benefit and anti-competitive behaviour will have been looked at as part of the SMS designation process, which is set out in the Bill. Does the Minister think that our world-class regulator will ignore these considerations in the initial process? If they will be considered then, why introduce this clause for consideration all over again? I have already explained the need for speed in the CMA’s process. This exemption can only play into the hands of the tech companies to draw out the processes and hold up the prospect of many more companies like the start-up shopping search website Foundem being left by the digital wayside. I ask the Government to seriously consider taking Clause 29 out of the Bill.

However, I support the fallback in Amendment 40, to have the word “indispensable” inserted into the clause. Your Lordships’ Committee has heard that “indispensable” was taken out on Report in the other place. The Minister has said that the simple threshold of “benefit” is already established in Section 9 of the Competition Act 1998 and Section 134(7) of the Enterprise Act. However, the former talks of an “indispensable benefit” and the latter just of a “benefit”. The Minister says that the two thresholds are the same; clearly, they are not.

The new definition of the grounds on which anti-competitive conduct can be permitted states that

“those benefits could not be realised without the conduct”.

It requires only that the anti-competitive conduct be necessary, rather than indispensable, where indispensable would mean that the anti-competitive behaviour is the only way to achieve the benefit. Surely, if that is the case, it would be better for the consumer, in whose name the Bill is being enacted, to have the highest possible threshold of benefit.

The Explanatory Notes open up avenues for further legal wrangling by lawyers, as they say the definition of benefit will be similar to that in the Competition Act and the Enterprise Act. As the two Acts use “benefit” in different ways, that will surely lead to confusion. Is the use of the word “similar” because it is not possible to say “same”, in the light of the divergent terms that appear in these two Acts? Without that clarity, there seems to be room for legal ambiguity. At the very least, there should be an explanation in the Bill that establishes “benefits” as having the same definition as in the Competition Act.

I know that all noble Lords want the Bill to be implemented and effective with all possible speed, to make this country a world leader in digital start-ups. However, it needs to be amended to avoid legal confusion and unnecessary delay by world players that have everything to gain from protecting their dominant position in markets.

Lord Fox (LD)

My Lords, on the pretext that he would not be here, my noble friend passed responsibility for this group on to me. As noble Lords can see, he is “not” here. This is a long group and my noble friend managed to attach his name to every amendment in it, with the exception of the two proposed by the Minister, so I apologise if I give a slightly long speech on his behalf.

I spoke at Second Reading, but I was not here for the first day in Committee, as I was in the Chamber speaking to the main business there. My noble friend has tabled Amendments 38 and 41, on countervailing benefits; Amendment 43, on goods and services; Amendments 49, 50 and 51, on final offers; and Amendment 107, on injunctions. He also supports Amendments 36, 39 and 40 from the noble Baroness, Lady Jones, which seek to restore the status quo of Clause 29.

In Clause 29, as we know, there is an overarching provision that enables SMS designated firms to push back on regulatory decisions through a countervailing benefits exemption. This is, in our opinion, a potential legal loophole for big tech to challenge conduct requirements through lengthy, tactical legal challenges. We just heard an example of how similar measures can be employed. This is a significant loophole, not a small one, and it would require the CMA to close a conduct investigation into a breach of conduct requirement when an SMS firm is able to prove that the anti-competitive conduct in question produces benefits which supposedly outweigh the harms, and that the conduct is “proportionate”—that word again—to the realisation of those benefits. It has the potential to tie up CMA resources and frustrate the intent of the legislation. It is critical that these provisions do not inadvertently give designated firms immunity from CMA decisions. We heard from other speakers that the scale of resources at the command of these companies far outweighs the resources that the CMA would be capable of summoning. That inevitably leads to the ability to clog things up.

As the noble Baroness, Lady Jones, explained, the Government added amendments to the Bill on Report in the Commons that could further weaken the ability of the DMU to push back against spurious claims of consumer benefit. The removal of the term “indispensable” may weaken the regulator’s ability to rebuff these claims as, by analogy with competition law, the use of the term “indispensable” is likely to require a high standard for firms to meet; therefore, the standard is now lower.

--- Later in debate ---
Baroness Stowell of Beeston (Con)

My Lords, one of the arguments that has been advanced—I did not make it in my remarks because I forgot—is that part of the problem with changing the word from “indispensable” to what is now in the Bill is that the current phrase has not been tested in the courts, whereas “indispensable” has. The argument that changing from “indispensable” to what we have now provides clarity is one that is really hard for people to accept, because the clarity it is providing is not, seemingly, in everyone’s interests. That is part of the problem here.

Viscount Colville of Culross (CB)

If “indispensable” and purely “benefit” are the same, why was the change made on Report in the Commons?

Baroness Kidron (CB)

I was really interested in the introduction of the word “unknown”. The noble Lord, Lord Lansley, set out all the different stages and interactions. Does it not incentivise the companies to hold back information until this very last stage, and the whole need-for-speed issue then comes into play?

Viscount Camrose (Con)

I will revert first to the questions about the word “indispensable”. As I have said, the Government consulted very widely, and one of the findings of the consultation was that, for a variety of stakeholders, the word “indispensable” reduced the clarity of the legislation.

Viscount Colville of Culross (CB)

So it is not the same?

Baroness Harding of Winscombe (Con)

Before my noble friend answers that, can he shed some light on which stakeholders feel that this is unclear?