Artificial Intelligence: Regulation

Thursday 17th October 2024

Lords Chamber
Viscount Colville of Culross (CB)

My Lords, this Government have pledged to recalibrate trade relations with the EU. However, the new EU AI legislation is much more prescriptive than the regulation proposed by the Government. How will the Government ensure that UK-based AI organisations with operations in the EU, or which deploy AI into the EU, will be aligned with EU regulation?

Lord Vallance of Balham (Lab)

As the noble Viscount points out, the EU regulation has gone in a somewhat different direction in taking a very broad approach and not a sector-specific approach. In contrast, the US looks as though it is going down a similar sort of route to the one that we are taking. Of course, there will be a need for interoperability globally, because this is a global technology. I think that there will be consultation and interactions with both those domains as we consider the introduction of the AI Act, and we are starting an extensive consultation process over the next few months.

King’s Speech (4th Day)

Monday 22nd July 2024

Lords Chamber
Viscount Colville of Culross (CB)

My Lords, as many other noble Lords have said, artificial intelligence will revolutionise our economy and our society during the next decade. It will radically improve our productivity, research capability and delivery of public services, to name but a few, so I am pleased that the digital information and smart data Bill will enable innovative uses of data to be safely developed and deployed.

I hope that this Bill will begin to address the wider risks AI poses to us all unless it is developed and released safely. This Government need to ensure that AI develops to support our economy and society, and that it does not take society in dangerous and unintended directions. At all stages of the training and deployment of AI, there are economic and social risks. There are dangers the whole way through the supply chain, from the initial data ingestion of the massive datasets needed to set up these foundation models to their training and deployment, which I hope will begin to be addressed by the Bill.

My concern is that there can be differences in the inputting and modification of AI models that humans do not consider significant, but which could have major and possibly adverse effects on the behaviour of AI systems. It is essential that formal verification techniques are applied throughout, so that safety can be proved at every stage of the process.

However, the massive costs of training and developing these models, which can run into billions of pounds, have put huge pressure on the tech companies to monetise them and to do so quickly. This has led to rapid developments of systems, but underinvestment in safety measures. Many of us were impressed when at the Bletchley summit last year the previous Government obtained voluntary guarantees from the big AI developers to open up their training data and allow the latest generative AI models to be reviewed by the AI Safety Institute, allowing third-party experts to assess the safety of models. However, since then the commitment has not been adhered to. I am told that three out of four of the major foundation model developers have failed to provide pre-release access for their latest frontier models to the AI Safety Institute.

The tech companies are now questioning whether they need to delay the release of their new models to await the outcome of the institute’s safety tests. In a hugely competitive commercial environment, it is not surprising that the companies want to deploy them as soon as possible. I welcome the Government’s commitment during the election campaign to ensure that there will be binding regulation on big developers to ensure safe development of their models. I look forward to the Secretary of State standing by his promise to put on a statutory footing the release of the safety data from new frontier models.

However, these safety measures will take great expertise to enforce. The Government must give regulators the resources they need to ensure that they are effective. If the Government are to follow through with AI safety oversight by sectoral regulators, I look forward to the setting up of the new regulatory innovation office, which will both oversee where powers overlap and pinpoint lacunae in the regulation. However, I would like to hear from the Minister the extent of the powers of this new office.

I hope that at next year’s French summit on AI the Government will be at the centre of the development of standards of safety and will push for the closest international collaboration. There need to be joint evaluations of safety and international co-operation of the widest kind—otherwise, developers will just go jurisdiction shopping—so the Government need not just to work closely with the US Artificial Intelligence Safety Institute and the new EU AI regulator but to ensure transparency. The best way to do this is to involve multi-stakeholder international organisations, such as the ISO and the UN-run ITU, in the process. It might be slower, but it will give vital coherence to the international agreement for the development of AI safety.

I am glad to hear the Minister say that the Government will lead a drive to make this country the centre of the AI revolution. It is also good that DSIT will be expanded to bring in an incubator for AI, along with the strategy for digital infrastructure development. I hope that this will be combined with support for the creative industries, which generated £126 billion of revenue last year and grew by 6%, an amazing performance when we look at the sluggishness of much of the rest of the economy. I hope that members of the creative industries will be on the Government’s new industrial strategy council and that the government-backed infrastructure bank will look not just at tangible assets but at the less tangible assets that need supporting and encouraging across the creative industries.

To bring together AI and the creative industries, the Government need to develop a comprehensive IP regime for datasets used to train AI models, as the noble Lord, Lord Holmes, just told us. There has been much in the press about the use of data without the creator’s consent, let alone remuneration. I hope that DSIT and DCMS will come together to generate an IP regime that will have data transparency, a consent regime and the remuneration of creators at its heart.

I hope that the gracious Speech will lead us into an exciting new digital era where Britain is a leader in the safe, transparent development and rollout of a digital revolution across the world.

Deepfakes: General Election

Wednesday 8th May 2024

Lords Chamber
Viscount Camrose (Con)

First, let me absolutely endorse the noble Lord’s sentiment: this is a deplorable way to behave that should not be tolerated. From hearing the noble Lord speak of the actions, my assumption is that they would fall foul of the false communications offence under Section 179 of the Online Safety Act. As I say, these actions are absolutely unacceptable.

Viscount Colville of Culross (CB)

My Lords, noble Lords will be aware of the threat of AI-generated deepfake election messages flooding the internet during an election campaign. At the moment, only registered users have to put a digital imprint giving the provenance of the content on unpaid election material. Does the Minister think that a requirement to put a digital imprint on all unpaid election material should be introduced to counter fake election messages?

Viscount Camrose (Con)

The noble Viscount is right to point to the digital imprint regime as one of the tools at our disposal for limiting the use of deepfakes. I think we would hesitate to have a blanket law that all materials of any kind would be required to have a digital imprint on them—but, needless to say, we will take away the idea and consider it further.

I look forward to noble Lords’ contributions to this debate. There is not one answer to the questions we are posing, but I beg to move the amendment.
Viscount Colville of Culross (CB)

My Lords, I have put my name to Amendment 48 in the name of the noble Baroness, Lady Jones of Whitchurch, and I support dropping Clause 29 from the Bill.

These amendments are also about speeding up the process of stopping anti-competitive behaviour by the tech companies. It is essential that no hostages to fortune are given for tech company lawyers to drag out the process, as many noble Lords said, particularly in the first group.

I want noble Lords to bear in mind that, for every big tech company, every week they succeed in delaying a decision against their anti-competitive practices is one in which they earn millions of pounds, while their competitors are left struggling in so many areas. Speed is of the essence.

As a former newspaper journalist, my most immediate field of concern is local and regional media, which are suffering from the anti-competitive behaviour of the tech companies. There has been a collapse in local newspapers over the past decade and in the next three years this will turn into a major exodus, with huge areas of the country becoming local news deserts with nobody reporting on local councils, courts and other important civic activities.

The Digital Markets Taskforce study on digital advertising found that the tech companies had used network effects and economies of scale to dominate the market. It concluded that the “more vibrant competition” in the market would improve

“the bargaining power of online news publishers”,

which would

“improve the health and sustainability of journalism in the UK”.

In turn, this would

“contribute positively to the effectiveness and integrity of our democracy”.

On top of this, much of the news content generated by these media companies is used by tech platforms either for free or for little remuneration.

I have long campaigned for the final offer mechanism to be available to the CMA as a powerful deterrent against anti-competitive behaviour by the tech companies, but surely all deterrents are more effective if there is a realistic chance that they will be deployed, and in a short time. Once the CR requirements on an SMS have been imposed, breached and reported, the CMA should be in a good position to know whether the designated SMS company will take the long or short road to a solution. Amendment 48 would allow the CMA to issue an enforcement order, decide whether that has been breached and investigate the breach, if it feels that it will lead to a satisfactory resolution to the company’s behaviour. However, if, earlier in the process, the solution is not going to be possible, the regulator needs the power to bring forward its ultimate deterrent. No SMS will want to have the final offer mechanism imposed on it, and I understand that the CMA is equally reluctant to deploy it, but the more pressing the threat the more likely it is that the DMU investigation will be brought to a quick and effective resolution.

I know that these companies will fight tooth and nail to preserve their massive profits resulting from the anti-competitive behaviours. It might be useful for the Committee if I give just one really shocking example of how effective these delaying actions can be. The salutary lesson is the story of a nascent shopping comparison site, Foundem, based in London and founded in 2005, which was doing very well until 2008, when it was massively deprioritised on Google Search, at about the same time that Google Shopping, the search engine’s own shopping comparison site, was set up. Foundem issued a complaint to the EU Commission in 2009 about anti-competitive behaviour by Google. The Commission set up an investigation and, three years later, after many legal arguments, Google was given a preliminary assessment—similar, I imagine, to an SMS designation. Rules were then laid down for the company to follow, but within six months market tests revealed that it was not tackling the anti-competitive behaviour. The response was dragged out by Google until 2016, when it was given a supplementary statement of objections, which was also heavily fought by the search engine.

Finally, on 27 June 2017, the EU imposed a record €2.4 billion fine on Google for violating EU competition law. However, the company appealed, first to the EU General Court and then to the Court of Justice of the European Union. Final judgment on the case has yet to be issued. Meanwhile, Foundem exists in order to fight the case, but it suspended all its services eight years ago. This is a 15-year David-versus-Goliath battle with a company, some of whose activities the CMA might have to designate. This legislation must be drafted to ensure that the process brings results, and fast, if small digital competitors are to have a chance of surviving.

Already the CMA estimates that the designation process will not become operational until June 2025. I know that the hope is to set up a designation process at the same time as negotiating the conduct requirements, but that could still take up to nine months to implement on the SMSs. Meanwhile, many of the smaller media outlets I talked about earlier will have gone under.

The same arguments for legal delay by tech companies must apply to Clause 29, which introduces the concept of countervailing benefits. I do not understand the need for Clause 29. Clearly, the balance between consumer benefit and anti-competitive behaviour will have been looked at as part of the SMS designation process, which is clearly set out in the Bill. Does the Minister think that our world-class regulator will ignore these considerations in the initial process? If they will be considered then, why introduce this clause for consideration all over again? I have already explained the need for speed in the CMA’s process. This exemption can only play into the hands of the tech companies to draw out the processes and hold up the prospect of many more companies like the start-up shopping search website Foundem being left by the digital wayside. I ask the Government to seriously consider taking Clause 29 out of the Bill.

However, I support the fallback in Amendment 40, to have the word “indispensable” inserted into the clause. Your Lordships’ Committee has heard that “indispensable” was taken out on Report in the other place. The Minister has said that the simple threshold of “benefit” is already established in Section 9 of the Competition Act 1998 and Section 134(7) of the Enterprise Act. However, the former talks of an “indispensable benefit” and the latter just of a “benefit”. The Minister says that the two thresholds are the same; clearly, they are not.

The new definition of the grounds on which anti-competitive conduct can be permitted states that

“those benefits could not be realised without the conduct”.

It requires only that anti-competitive conduct be necessary, rather than indispensable, which means that anti-competitive behaviour is the only way to achieve the benefit. Surely, if that is the case, it would be better for the consumer, in whose name the Bill is being enacted, to have the highest possible threshold of benefit.

The Explanatory Notes open up avenues for further legal wrangling by lawyers, as they say the definition of benefit will be similar to that in the Competition Act and the Enterprise Act. As the two Acts use “benefit” in different ways, that will surely lead to confusion. Is the use of the word “similar” because it is not possible to say “same”, in the light of the divergent terms that appear in these two Acts? Without that clarity, there seems to be room for legal ambiguity. At the very least, there should be an explanation in the Bill that establishes “benefits” as having the same definition as in the Competition Act.

I know that all noble Lords want the Bill to be implemented and effective with all possible speed, to make this country a world leader in digital start-ups. However, it needs to be amended to avoid legal confusion and unnecessary delay by world players that have everything to gain from protecting their dominant position in markets.

Lord Fox (LD)

My Lords, on the pretext that he would not be here, my noble friend passed responsibility for this group on to me. As noble Lords can see, he is “not” here. This is a long group and my noble friend managed to attach his name to every amendment in it, with the exception of the two proposed by the Minister, so I apologise if I give a slightly long speech on his behalf.

I spoke at Second Reading, but I was not here for the first day in Committee, as I was in the Chamber speaking to the main business there. My noble friend has tabled Amendments 38 and 41, on countervailing benefits; Amendment 43, on goods and services; Amendments 49, 50 and 51, on final offers; and Amendment 107, on injunctions. He also supports Amendments 36, 39 and 40 from the noble Baroness, Lady Jones, which seek to restore the status quo of Clause 29.

In Clause 29, as we know, there is an overarching provision that enables SMS designated firms to push back on regulatory decisions through a countervailing benefits exemption. This is, in our opinion, a potential legal loophole for big tech to challenge conduct requirements through lengthy, tactical legal challenges. We just heard an example of how similar measures can be employed. This is a significant loophole, not a small one, and it would require the CMA to close a conduct investigation into a breach of conduct requirement when an SMS firm is able to prove that the anti-competitive conduct in question produces benefits which supposedly outweigh the harms, and that the conduct is “proportionate”—that word again—to the realisation of those benefits. It has the potential to tie up CMA resources and frustrate the intent of the legislation. It is critical that these provisions do not inadvertently give designated firms immunity from CMA decisions. We heard from other speakers that the scale of resources at the command of these companies far outweighs the resources that the CMA would be capable of summoning. That inevitably leads to the ability to clog things up.

As the noble Baroness, Lady Jones, explained, the Government added amendments to the Bill on Report in the Commons that could further weaken the ability of the DMU to push back against spurious claims of consumer benefit. The removal of the term “indispensable” may weaken the regulator’s ability to rebuff these claims as, by analogy with competition law, the use of the term “indispensable” is likely to require a high standard for firms to meet; therefore, the standard is now lower.

--- Later in debate ---
Baroness Stowell of Beeston (Con)

My Lords, one of the arguments that has been advanced—I did not make it in my remarks because I forgot—is that part of the problem with changing the word from “indispensable” to what is now in the Bill is that the current phrase has not been tested in the courts, whereas “indispensable” has. The argument that changing from “indispensable” to what we have now provides clarity is one that is really hard for people to accept, because the clarity it is providing is not, seemingly, in everyone’s interests. That is part of the problem here.

Viscount Colville of Culross (CB)

If “indispensable” and purely “benefit” are the same, why was the change made on Report in the Commons?

Baroness Kidron (CB)

I was really interested in the introduction of the word “unknown”. The noble Lord, Lord Lansley, set out all the different stages and interactions. Does it not incentivise the companies to hold back information to this very last stage, and the whole need-for-speed issue then comes into play?

Viscount Camrose (Con)

I will revert first to the questions about the word “indispensable”. As I have said, the Government consulted very widely, and one of the findings of the consultation was that, for a variety of stakeholders, the word “indispensable” reduced the clarity of the legislation.

Viscount Colville of Culross (CB)

So it is not the same?

Baroness Harding of Winscombe (Con)

Before my noble friend answers that, can he shed some light on which stakeholders feel that this is unclear?

Digital Markets, Competition and Consumers Bill

Lord Clement-Jones (LD)

My Lords, at the opening of this Committee stage, I want to repeat, rather in the same way as the noble Baroness, Lady Jones, what I said on Second Reading: we broadly welcome this Bill. In fact, since the Furman review was set up five years ago, we have been rather impatient for competition law in the digital space to be reformed and for the DMU to be created.

At the outset, I also want to thank a number of organisations—largely because I cannot reference them every time I quote them—for their help in preparing for the digital markets aspects of the Bill: the Coalition for App Fairness, the Public Interest News Foundation, Which?, Preiskel & Co, Foxglove, the Open Markets Institute and the News Media Association. They have all inputted helpfully into the consideration of the Bill.

The ability to impose conduct requirements and pro-competition interventions on undertakings designated as having strategic market status is just about the most powerful feature of the Bill. One of the Bill’s main strengths is its flexible approach, whereby once a platform is designated as having SMS, the CMA is able to tailor regulatory measures to its individual business model in the form of conduct requirements and pro-competition interventions, including through remedies not exhaustively defined in the Bill.

However, a forward-looking assessment of strategic market status makes the process vulnerable to being gamed by dominant platforms. The current five-year period does not account for dynamic digital markets that will not have evidence of the position in the market in five years’ time. It enables dominant firms to point to emerging challengers in order to rebut the enforcer’s claim that they enjoy substantial and entrenched market power, even where their dominance has yet to be meaningfully threatened. Clause 5 of the Bill needs to be amended so that substantial and entrenched market power is based on past data rather than a forward-looking assessment. There should also be greater rights to consultation for businesses that do not have SMS status under the Bill. As the noble Baroness, Lady Jones, said, this will be discussed later, under another group of amendments.

The provisions of Clause 5, as it is currently worded, risk causing problems for the CMA in practice. Part of the problem is the need for evidence to support a decision by the CMA of a market position over the entire five-year period. The five-year period requires current evidence of the position in the market in five years’ time. In dynamic digital markets such as these, no such evidence is likely to exist today. The CMA needs evidence to underpin its administrative findings. Where no such evidence exists, it cannot designate an SMS firm.

The CMA will have evidence that exists up to the date of the decision—evidence of the current entrenched position, market shares, barriers to entry, intellectual property rights and so on. In that respect, we support the noble Baroness, Lady Jones, with her Amendment 1, because it should of course include earlier investigations by the CMA. All that evidence exists today in 2024, but what the position will be in 2028 will need to be found and it has to be credible evidence to support a CMA decision under Clause 5. Particularly in fast-moving technology markets, the prediction of future trends is not a simple matter, so lack of sufficient evidence of the entrenched nature of a player at year 5 or over the entire period would prevent a rational decision-maker from being able to make a decision that the player will have SMS over the five-year period, as demanded by the Bill. Every designation and subsequent requirement or investigation imposed on the designated undertaking risks being subject to challenge on the basis of insufficient evidence.

As the Open Markets Institute says,

“the inevitably speculative nature of a forward-looking assessment makes the process vulnerable to being gamed by dominant platforms. For example, such firms may use the emergence—and even hypothetical emergence—of potential challengers to rebut the enforcer’s claim that they enjoy substantial and entrenched market power, even where their dominance has yet to be meaningfully threatened by those challengers”.

It gives the example of the rise of TikTok, which Meta has used in arguments to push back against anti-trust scrutiny:

“Yet while experiencing rapid growth in terms of user numbers, TikTok has so far failed to seriously challenge the economic dominance of Meta in online advertising (the basis of Meta’s market power), generating less”

than

“a tenth of the latter’s global revenues. Dominant platforms will also use emerging technologies—such as generative AI—to claim that their dominance is transitory, claims that will be difficult for the CMA to rebut given future uncertainty”.

Our Amendments 3, 4, 5 and 6—here I thank the noble Lord, Lord Vaux, for his support for them, and sympathise with him because I gather that his presence here today has been delayed by Storm Isha—suggest that the number of years should be removed and the provision clarified so that the assessment is made based on current evidence and facts. If the market position changes, the CMA has the power to revoke such designation in any event, on application from the SMS business, as provided for by Clause 16.

That is the argument for Amendments 3, 4, 5 and 6 in Clause 5. I look forward to hearing what the noble Viscount, Lord Colville, has to say on Amendment 7, which we very much support as well.

Viscount Colville of Culross (CB)

My Lords, I have put down Amendment 7 to Clause 6 and, in later groups, amendments relating to Clauses 20 and 114. I will come to them later in Committee, but all of them have the aim of limiting the wide powers given to the Secretary of State in the Bill to intervene in the setting up of the processes for dealing with anti-competitive behaviour by the big tech companies. Amendment 7 would prevent the Secretary of State from having broad powers to revise the criteria for designation under the SMS investigative process. My particular concern is about the power that the Minister might have to alter the criteria for the process in order to de-designate a company following heavy lobbying.

As this is my first intervention at this stage of the Bill, I join other noble Lords in saying that I too very much welcome it and the Government’s approach to dealing with anti-competitive behaviour by the big tech companies. In fact, I welcome it so much that I want to ensure that it is implemented as quickly and effectively as possible, to safeguard our digital start-ups and smaller digital companies.

The independence of the CMA is central to the effectiveness of the processes set out in Part 1. However, the huge powers given to the Minister in these chapters should worry noble Lords. They are proposing great powers of oversight and direction for the Secretary of State. I fear that these will undermine the independence of the CMA and dilute its ability to take on the monopolistic behaviour of the big tech companies. I hope that these amendments will go some way to safeguard the independence of the regulator.

I support the collaborative approach set out in the SMS and conduct requirement processes; it seems to be preferable to the EU’s Digital Markets Act, which is so much more broad-brush, with a much wider investigation into designated companies’ business activities. The Bill sets out a greater focus on a company’s particular activity and ensures that the CMA and the DMU work closely with stakeholders, including the tech companies which are going to be under investigation. However, despite this collaboration, it can only be expected that the companies involved in the process will want to give themselves the best possible chance of maintaining their monopolistic position. Clause 6 is central to the start of the process—after all, it sets out when a company can be considered to be under DMU oversight.

Designation as an SMS player means only that the company is subject to the jurisdiction or potential oversight of the DMU; it does not mean that it has done anything wrong. The deliberate aim of the Bill is to ensure that only large players are to be included in the SMS status. These criteria will not dictate how the investigation will go, so the criteria for designation as an SMS player do not need to be changed if the market changes. However, Clause 6(2) and (3) will give Ministers power to take criteria away from this section. This will mean that powerful tech players could fall outside the jurisdiction of the DMU and will not be open to SMS designation as a result. If the clause allowed only new criteria to be added, so that a wider scope of companies could be included, that would not be so bad. However, the ability to reduce the scope of the DMU’s potential designation should alarm noble Lords. These subsections give the tech companies huge powers to lobby the Secretary of State to ensure that they cannot be designated. Effectively, this would be a de-designation of these companies, which would defeat the purpose of the CR process before it has even got off the ground.

I am also concerned that the Secretary of State’s powers in this clause go against the law’s need to be normative: as a basic principle, it must apply to all the companies, without discrimination. The DMCC Bill is a law that applies only to those who qualify, but it is, in principle, generally applicable. Chapter 2 of Part 1 sets out a set of criteria that apply to all companies, but only a few will satisfy the criteria. The criteria for SMS designation require enduring market power and a collection of other conditions. It is likely, as a result, that these will cover Microsoft, Amazon, Apple, Google and Facebook; each has enduring market power and qualifies for designation under the criteria in Clause 6. However, if that law can be varied by a Secretary of State to take away criteria, as it currently can, then the law can be made to apply to only a few companies. At the extreme, it could be altered to apply to only one or two. I am advised by lawyers that this is likely to be discriminatory.

Imagine if the law were varied so it applied only to a business that provides both a digital platform and home deliveries. This would mean it would apply only to Amazon, and the company would go to town lobbying against the change in criteria as discriminatory. Noble Lords must continually remind themselves that the Bill is taking aim at the biggest, most powerful companies in the world. I ask them to consider just how far these companies would go to put pressure on politicians and Ministers to safeguard their position, and how effective that pressure can be in changing their minds.

--- Later in debate ---
Viscount Colville of Culross (CB)

My Lords, I tabled Amendment 32 in my name, and I thank the noble Baroness, Lady Jones, and the noble Lord, Lord Clement-Jones, for adding their names. I also thank the organisations that helped me work on these amendments. Amendment 32 to Clause 20 would stop the Secretary of State from revising the criteria for the conduct requirement process. These criteria are already very broad, but subsections (4) and (5) give the Minister huge scope to alter the types of behaviour expected from the SMS as part of the CR process.

Amendment 22, in my name and that of the noble Lord, Lord Clement-Jones, aims to respond to the Government’s concern that removing Clause 20(4) and (5) would prevent the Minister from future-proofing the CR criteria; it does so by allowing the CMA leeway to alter the criteria in Clause 19, which open the way for the imposition of conduct requirements.

I also support attempts to encourage interoperability between user and digital activity in any way possible, so I support Amendment 20, in the name of the noble Lord, Lord Lansley, and Amendment 21, in the name of the noble Lord, Lord Clement-Jones.

On my Amendment 32 in Clause 20, the conduct requirements for the process will be hard-fought by the tech companies. The collaborative nature of the Bill will mean that the SMS will be very involved in setting up the regime, but it will also be following every possible avenue to ensure that the requirements are not burdensome to its businesses. However, subsection (4) gives the Secretary of State broad scope, and unlimited time, to be subject to lobbying and to change the nature of the conduct requirements.

I have already given an example in my speech on Amendment 7 to show the lengths to which tech companies will go to affect the decisions of politicians in establishing an SMS designation. This amendment will have a similar effect in thwarting their attempts to interfere in the CR process. Over the last decade, a number of cases have been brought against the big tech companies under the EU’s competition regime. As part of the process of rectifying the anti-competitive behaviour, the regulators have laid out rules of behaviour for the companies under investigation. These are sets of rules aimed at forcing the companies to change their conduct and reduce their dominance in the market.

The process is very complicated, and small tweaks can make the difference between success and failure of the rules and their ability to control anti-competitive behaviour. Implementation takes time. Consultation on the rules between the DMU, the SMS and other stakeholders can mean it takes up to six months to put into action, then it takes another several months before the market study on how the new conduct regime criteria are working can be assessed. In the meantime, the SMS continues to make huge profits, while the smaller competitors continue to suffer the loss of market activity.

My concern about the clause is that, even if the CMA comes across a new type of harm and can see clearly what remedy would apply, it cannot create its own remedy under the clause. This is most unusual for a regulatory body. Usually, the breach of law is investigated, and the remedy tailored by that body to proportionately fit the harm identified. The regulator is usually granted the power to craft the remedy itself.

The Government are keen to build a system which is speedy and effective, and so there is the list of tools that can be used as remedies in Clause 20, which is useful, but, instead of a speedy, sensible mechanism which would be in the hands of the expert regulator of digital markets, an additional step has been put in place. That additional step—going back to the Secretary of State to create regulations—is a slower and more complicated way to craft this remedy. The DMU must be left to use its professional expertise to set these rules.

At a later stage, we will be talking about the suggestion of the noble Baroness, Lady Stowell, to have some parliamentary committee involvement. I wonder why on earth we cannot have parliamentary committee involvement when looking at these particular Secretary of State powers and the way that the DMU would use them.

To deal with the concerns that the Minister might have about the lack of future-proofing, I also tabled Amendment 22. Its aim is to respond to claims by the Government that the removal of Secretary of State powers in Clause 20 will stop the future-proofing. Noble Lords know that, in the fast-changing digital world, even the most comprehensive list of criteria might not include all possible eventualities; my amendment deals with those concerns. It stems from the powers of the CMA to look at the objectives of the conduct requirements in Clause 19(5), which are comprehensive: they cover “fair dealing”, “open choices” and “trust and transparency”. Only conduct requirements of the permitted type in Clause 19(5) can be imposed under Clause 20 on the CR regime.

Clause 20 is currently a permitted list for the regime; in future, the CMA may want to change the criteria needed to achieve the objectives of Clause 19(5) as markets inevitably change. I suggest to noble Lords that Amendment 22 will achieve that. I have argued that the fear of the Secretary of State succumbing to the lobbying powers of the big tech companies is something to worry about. This small amendment will solve that problem and give flexibility to the CR process, without the danger of political interference.

Baroness Stowell of Beeston (Con)

My Lords, as this is the first time I have spoken in Committee, I declare that I chair the Communications and Digital Select Committee—but I am speaking in a personal capacity. This is quite an eclectic group of topics; it makes me wonder what will be in the group labelled “miscellaneous”.

I will talk about the leveraging principle, but before doing so, I acknowledge what has already been said about parliamentary accountability and the fact that I have an amendment in a later group. To pick up a point that the noble Viscount, Lord Colville, just made about his amendment to Clause 20, if we were to have a new Select Committee, there is no reason why, in the course of its business, it would not look at regulations being brought forward. I would expect there to be that sort of role for a Select Committee, but it would not replace the role of the Secretary of State in this context. We will come back to that when we get to the specific amendment.

The amendment on copyright is very interesting to me, not least because the Communications and Digital Committee is currently carrying out an inquiry on large language models. We are in the final stages of that inquiry and will publish our report very soon. We will have, I hope, some interesting things to say about copyright at that time.

I turn to my point on the leveraging principle; in particular, I will pick up on Amendments 26 and 27 in the name of the noble Baroness, Lady Jones. When the Communications and Digital Committee carried out our scrutiny of the Bill and held hearings in the summer, we looked at the leveraging principle and concluded that what was in the Bill was adequate; we did not propose any further changes being necessary. Noble Lords may remember that, at Second Reading, I raised concerns about how the Government had diluted various bits of the Bill that we, as a committee, had said, “Do not do that”. As I understand it, they have not diluted the leveraging principle. However, I am a great believer in judging people by their actions rather than by what they say. Over the last few weeks, I have been very interested in the various representations that have been made to me and others from the different challenger firms and industry bodies in this area. I see and am sympathetic to their concerns on this topic.

Only today, I was interested to read the Bloomberg daily newsletter on tech matters, which refers to the recent case in the US in which Apple has been forced to make some changes to its 30% fee policy. It has already started introducing things that make that almost meaningless to those who might benefit from it. The newsletter explains what people have to do to use a different payment system from Apple’s and avoid the 30% fee. It says:

“In order for developers to include a website link in their apps to an outside payment system, they’ll first need to submit a request form to Apple. If approved, the link can only be displayed once within the app. It must look like a text URL—meaning it can’t be a candy-colored button that says ‘Use PayPal’—and the text itself must match one of seven templates”.

It continues:

“When clicked, the link will surface a warning from Apple about the risks of transacting with third-party websites, with ‘continue’ or ‘cancel’ buttons. The website has to open in the device browser, rather than from a pop-up within the app, where, depending on the type of service, a user can sign in or register for a new account”;

in other words, you will not bother by the time you have got through all that.

That was a long-winded way to say that I am minded to support what the noble Baroness, Lady Jones, is seeking to do with the leveraging principle here. A safeguard is necessary, but, as I said at the beginning, I am speaking in my own personal capacity.

Artificial Intelligence: Regulation

Monday 4th December 2023

Lords Chamber
Viscount Camrose (Con)

I do not think “perverse” is justified. GDPR Article 22 addresses automated individual decision-making, but, as I am sure the noble Lord knows, the DPDI Bill recasts Article 22 as the right to specific safeguards rather than a general prohibition on automated decision-making, so that subjects have to be informed about it and can seek a human review of decisions. It also defines meaningful human involvement.

Viscount Colville of Culross (CB)

When I asked the Minister in October why deepfakes could not be banned, he replied that he could not see a pathway to do so, as they could be developed anywhere in the world. Under the Online Safety Act, tech companies all over the world are now required not to disseminate content that is harmful to children. Why can the harms of deepfakes not be similarly proscribed?

Viscount Camrose (Con)

I remember the question. It is indeed very important. There are two pieces to preventing deepfakes being presented to British users: one is where they are created and the second is how they are presented to those users. They are created to a great extent overseas, and we can do very little about that. As the noble Viscount said, the Online Safety Act creates a great many barriers to the dissemination and presentation of deepfakes to a British audience.

King’s Speech

Tuesday 14th November 2023

Lords Chamber
Viscount Colville of Culross (CB)

My Lords, I was encouraged by the Government’s White Paper on AI published earlier this year, with its stated intention of extending AI regulation throughout the economy.

The gracious Speech mentioned the UK leading international discussion on developing AI safely, and of course much has been made of the Bletchley AI summit. While that was happening, I joined a lower-profile but equally interesting fringe AI summit, attended by hundreds of young AI developers, ethicists and policy thinkers from all over the world. They pointed out that AI is already deployed across multiple government systems and the private sector to increase productivity and effectiveness in the areas of medicine, science and employment, to mention a few.

Although the Government have had success in bringing together disparate players to look ahead at AI, the real threat is already here, as my noble friend Lady Kidron said. The National Cyber Security Centre’s annual review highlights these threats only too well. It says that the

“large language models … will almost certainly … make the spread of disinformation easier; and that deepfake campaigns are likely to become more advanced”

by the time of the next general election.

Generative AI can already make videos of people saying anything in a convincing imitation of their voice and features. At the moment, AI models to simulate somebody’s voice on audio are freely available; their video equivalent is available commercially but will be freely available within months. It will soon be possible to make videos of anybody saying anything and spread them across the internet. The possibility of this technology being used to make deepfake political statements will cause havoc in the coming elections, leaving voters not knowing what to believe.

When I asked the Minister, on 24 October, what could be done to ban them, he told me that it was not possible because they are developed abroad. However, in the Online Safety Act, the law now requires foreign players to abide by our requirements to prevent online harm to children and illegal harms to all. I echo other noble Lords in pointing out that there is no legislation in the gracious Speech to show that the Government are taking this very present threat at all seriously.

Any new AI law needs to ensure that this country is pro-innovation, in the hope of making us a global superpower. The White Paper laid out important principles for understanding how models should be developed; however, the successful development of AI in Britain will depend on effective regulation across every sector of the economy where the models are being trained—and before they are deployed.

The White Paper says that existing regulators could be given extra powers to deal with AI. That is fine for areas of the economy which already have strong regulation, such as medicine, science and air travel, but AI is already being deployed in sectors such as education and employment, where the regulators do not have the powers or resources to look at the ways in which it is being used in their fields. For instance, who is the employment regulator who can look at the possible bias in robo-firing from companies and the management of employees by algorithm? Can the Minister say why no AI legislation is being brought forward in this new Parliament? What plans are there to enhance the AI-regulating powers in sectors where such regulation is very weak?

I am concerned not just by the way that AI models are being trained but by the state of the data being used to train them. Public trust is crucial to the way this data is used, and the new legislation must help build that trust, so it will be very important to get the digital protection Bill right. Generative AI needs to be trained on millions, if not billions, of pieces of data, which, unless scrutinised rigorously, can be biased or—worse—toxic, racist and sexist. The developers using data to train AI models must be alert to what can happen if bad data is used; it will have a terrible impact on racial, gender, socioeconomic and disabled minorities.

I am concerned that crucial public trust in technology will be damaged by some of the changes to data protection set out in the Bill. I fear that the new powers available to the Secretary of State to issue instructions and to set out strategic priorities to the regulator will weaken its independence. Likewise, the reduction in data protection officers and in impact statements for all data that is not high risk also threatens to damage public trust in the data on which AI models are trained.

We see this technology evolving so fast that the Government must increase the focus on the ethics and transparency of the use of data. The Government should encourage senior members of organisations to look at the whole area of data, from data assurance to its exploitation by AI. I know that the Government want to reduce the regulatory burden on data management for businesses, but I suggest that it will also make them more competitive if they have a reputation for good control and use of the data they hold.

So much of our concern in the digital space is with the extraordinary powers that the tech companies have accumulated. I am very pleased that the digital markets Bill is giving the Digital Markets Unit powers to set up market inquiries into anti-competitive practices that harm UK markets. Building on what was said by the noble Baroness, Lady Stowell, for me one of the most egregious, and one of the most urgent, issues is the market in journalistic content. Spending on advertising for regional newspapers in this country has declined from £2.6 billion in 1990 to £240 million at the end of last year. As a result, the country’s newspapers are closing and journalism is being restricted by the move of advertising to the big tech companies. That is the real problem—it is not, as the noble Lord, Lord Black, said, caused by competition from the BBC.

This is compounded by those companies aggregating news content generated by news creators and not paying them a fair price. So I am pleased to see the introduction of a means for tech companies to make proportionate payment for journalistic content. I hope that the conduct requirement to trade on fair terms will be sufficient. However, if the final-offer mechanism has to be used to force an offer from the tech companies, the lengthy enforcement period means that it could take many years before the CMA is able to deploy it. There need to be strict time limits on every step if we are to make the FOM a credible incentive to negotiation. Like the noble Baroness, Lady Stowell, I ask for the CMA decisions to be appealable through the shorter judicial review process, rather than the longer merits standards asked for by the tech companies.

Finally, I am very pleased by the clauses to help digital subscribers, but I will make one plea. It is often very difficult to terminate a contract online, with the “unsubscribe” link being hidden away in some digital corner. I suggest that we take the example of Germany, which requires all digital contracts to have an easily accessible cancellation button.

Artificial Intelligence: Regulation

Tuesday 24th October 2023

Lords Chamber
Viscount Camrose (Con)

I do not believe that anyone anywhere is advocating unregulated AI. The voluntary agreement is, of course, a United States agreement secured with the White House. We welcome it, although it needs to be codified to make it non-voluntary, but that will be discussed as part of the summit next week.

Viscount Colville of Culross (CB)

My Lords, I would like to pick up on the point made by the noble Lord, Lord Clement-Jones, because Professor Russell also said that he would like to ban certain types of AI deepfakes. With elections looming in this country, can the Minister tell the House whether he thinks AI developers should be banned from creating software that allows the impersonation of people, particularly high-profile politicians?

Advanced Artificial Intelligence

Monday 24th July 2023

Lords Chamber
Viscount Colville of Culross (CB)

My Lords, I declare an interest as a freelance television producer. I too congratulate my noble friend Lord Ravensdale on having secured this debate.

Last night, I went on to the ChatGPT website and asked it to write me a speech on the subject in this debate that worries me—the threat that AI poses to journalism—and this is a paragraph that it came up with:

“AI, in its tireless efficiency, threatens to overshadow human journalism. News articles can be automated, and editorials composed without a single thought, a single beating heart behind the words. My fear is that we will descend into a landscape where news is stripped of the very human elements that make it relatable, understood, and ultimately, impactful”.

Many noble Lords might agree that that is a frighteningly good start to my speech. I assure them that, from now on, whatever I say is generated by me rather than ChatGPT.

Generative artificial intelligence has many benefits to offer broadcasting and journalism. For instance, the harnessing of its data processing power will allow a big change in coverage of next month’s World Athletics Championships in Budapest by ITN. Not only will it allow the voiceover to be instantly translated, but it will also be able to manipulate the British sports presenter’s voice to broadcast in six to seven languages simultaneously. This will bring cost savings in translation and presenter fees.

Already there is an Indian television service, Odisha TV, which has an AI-generated presenter that can broadcast throughout the night in several Indian languages. Synthetic voice-generated AI has already arrived and is available for free. The technology to manipulate an image so that a speaker’s lips are synchronised with the voice is commercially available, improving and becoming cheaper by the month. All these advances threaten on-screen and journalistic jobs.

However, noble Lords should be concerned by the threat to the integrity of high-quality journalism, an issue raised by my AI-generated introduction. We are now seeing AI accelerating trends in digital journalism, taking them further and faster than we would have thought possible a few years ago. Noble Lords have only to look at what is happening with AI search. At the moment, many of us will search via a search engine such as Google, which will present us with a variety of links, but with AI search the information required is given at a quite different level of sophistication.

For instance, when asked, “What is the situation in Ukraine?”, the new Microsoft AI search tool will give an apparently authoritative three-paragraph response. It will have searched numerous news websites, scraped the information from those sites and sorted it into a three-paragraph answer. When the AI search engine was asked for the provenance of the information, it replied that

“the information is gathered from a variety of sources, including news organisations, government agencies and think tanks”.

Requests for more exact details of the news websites used failed to deliver a more specific answer. As a result, it is not possible for the user to give political weight to the information given, nor to discover its editorial trustworthiness. As many other noble Lords have mentioned, the ability to create deepfakes allows AI to synthesise videos of our public figures saying anything, whether true or not. There is nothing in the terms and conditions for the tech companies to ensure that the answers are truthful.

The very existence of quality journalism is at risk. Already we are seeing the newspaper industry brought to its knees by the big tech platforms’ near-monopoly of digital advertising spend. This has greatly reduced the advertising income of newspapers, on which much of their revenue depends. Social media is aggregating content from established news sites without paying fees proportionate to the expense of gathering news. The effect on the industry is disastrous, with the closure of hundreds of papers resulting in local news deserts, where the proceedings of local authorities and magistrates are no longer reported to the public. The new AI technology is further exacerbating the financial threat to the whole industry. Generative AI companies can scrape information for free from news websites, which are already facing the increasing costs of creating original journalistic content. Meanwhile, many AI sites, such as Microsoft’s new AI service, are charging up to $30 a month.

I have been involved in the Online Safety Bill, which has done a wonderful job, with co-operation from the Government and all Benches, to create a law to make the internet so much safer, especially for children. However, it does not do anything to make the internet more truthful. There needs to be pressure on the generative AI creators to ensure that information that they are giving is truthful and transparent. The leaders of our public service media organisations have written to Lucy Frazer, asking her to set up a journalist working group on AI bringing together the various stakeholders in this new AI world to work out a government response. The letter is addressed to DCMS but I would be grateful if the Minister, whose portfolio covers AI policies, could ensure that his department takes an active role in setting up this crucial group.

An election is looming on the horizon. The threat of a misinformation war will make it difficult for voters properly to assess policies and politicians. This war will be massively exacerbated by search AI. It is time for the Government to take action before the new generation of information technology develops out of control.