Overseas Operations (Service Personnel and Veterans) Bill Debate

Department: Ministry of Defence


No legislation designed to deliver on an overall policy intention to reassure our service personnel in the event that they are deployed overseas can deliver on that intention in this part of the 21st century without engaging the issues which this amendment addresses. Without this or a similar amendment, I fear that this legislation will be out of date as soon as it receives Royal Assent. I beg to move.
Lord Clement-Jones (LD) [V]

My Lords, it is a pleasure to follow the noble Lord, Lord Browne of Ladyton, in supporting his Amendment 32, which he introduced so persuasively and expertly. A few years ago, I chaired the House of Lords Select Committee on AI, which considered the economic, ethical and social implications of advances in artificial intelligence. In our report published in April 2018, entitled AI in the UK: Ready, Willing and Able?, we addressed the issue of military use of AI and stated:

“Perhaps the most emotive and high-stakes area of AI development today is its use for military purposes”,


recommending that this area merited a “full inquiry” on its own. As the noble Lord, Lord Browne of Ladyton, made plain, regrettably it seems not yet to have attracted such an inquiry, or even any serious examination. I am therefore extremely grateful to the noble Lord for creating the opportunity to follow up on some of the issues we raised in connection with the deployment of AI and some of the challenges we outlined. It is also a privilege to be a co-signatory with the noble and gallant Lord, Lord Houghton, who has also thought so carefully about issues involving the human interface with technology.

The broad context, as the noble Lord, Lord Browne, has said, is the unknowns and uncertainties, in policy, legal and regulatory terms, that new technology in military use can generate. His concerns about the complications such technology creates, and the personal liabilities to which it exposes deployed forces, are widely shared by those who understand the capabilities of new technology. That is all the more so in a multilateral context, where other countries may be using technologies that we would not deploy ourselves, or whose use could create potential vulnerabilities for our troops.

Looking back to our report, one of the things that concerned us more than anything else was the grey area surrounding the definition of lethal autonomous weapon systems—LAWS. As the noble Lord, Lord Browne, set out, when the committee explored the issue, we discovered that the UK’s then definition, which included the phrase

“An autonomous system is capable of understanding higher-level intent and direction”,


was clearly out of step with the definitions used by most other Governments and imposed a much higher threshold on what might be considered autonomous. This allowed the Government to say:

“the UK does not possess fully autonomous weapon systems and has no intention of developing them. Such systems are not yet in existence and are not likely to be for many years, if at all.”

Our committee concluded that, in practice,

“this lack of semantic clarity could lead the UK towards an ill-considered drift into increasingly autonomous weaponry.”

This was particularly in light of the fact that, at the UN Convention on Certain Conventional Weapons group of governmental experts in 2017, the UK opposed the proposed international ban on the development and use of autonomous weapons. We therefore recommended that the UK’s definition of autonomous weapons should be realigned to be the same as, or similar to, that used by the rest of the world. The Government, in their response to the committee’s report in June 2018, replied:

“The Ministry of Defence has no plans to change the definition of an autonomous system.”


They did say, however,

“The UK will continue to actively participate in future GGE meetings, trying to reach agreement at the earliest possible stage.”


Later, thanks to the Liaison Committee, we were able on two occasions last year to follow up on progress in this area. On the first occasion, in reply to the Liaison Committee’s letter of last January, which asked,

“What discussions have the Government had with international partners about the definition of an autonomous weapons system, and what representations have they received about the issues presented with their current definition?”


the Government replied:

“There is no international agreement on the definition or characteristics of autonomous weapons systems. Her Majesty’s Government has received some representations on this subject from Parliamentarians”.


They went on to say:

“The GGE is yet to achieve consensus on an internationally accepted definition and there is therefore no common standard against which to align. As such, the UK does not intend to change its definition.”


So, no change there until December 2020, when the Prime Minister announced the creation of the Autonomy Development Centre to

“accelerate the research, development, testing, integration and deployment of world-leading AI,”

and the development of autonomous systems.

In our follow-up report, AI in the UK: No Room for Complacency, which was published in the same month, we concluded:

“We believe that the work of the Autonomy Development Centre will be inhibited by the failure to align the UK’s definition of autonomous weapons with international partners: doing so must be a first priority for the Centre once established.”


The response to this last month was a complete about-turn by the Government, who said:

“We agree that the UK must be able to participate in international debates on autonomous weapons, taking an active role as moral and ethical leader on the global stage, and we further agree the importance of ensuring that official definitions do not undermine our arguments or diverge from our allies.”


They went on to say:

“the MOD has subscribed to a number of definitions of autonomous systems, principally to distinguish them from unmanned or automated systems, and not specifically as the foundation for an ethical framework. On this aspect, we are aligned with our key allies. Most recently, the UK accepted NATO’s latest definitions of ‘autonomous’ and ‘autonomy’, which are now in working use within the Alliance. The Committee should note that these definitions refer to broad categories of autonomous systems, and not specifically to LAWS. To assist the Committee we have provided a table setting out UK and some international definitions of key terms.”