Written Question
Autonomous Weapons
Tuesday 20th September 2022

Asked by: Lord Clement-Jones (Liberal Democrat - Life peer)

Question to the Ministry of Defence:

To ask Her Majesty's Government, further to the remarks by the Minister of State at the Ministry of Defence on 18 July (HC Deb, col 688) that "autonomy is increasingly the key to the successful generation of overwhelming force in the battle space" and that "a more lethal force—even a bigger force—does not necessarily require more workforce in the future", what assessment they have made of (1) the compatibility of these remarks with the answer by Baroness Goldie on 1 November 2021 (HL Deb, col 995) that "UK Armed Forces do not use systems that employ lethal force without context-appropriate human involvement", and (2) the implications of these remarks for risks of strategic instability.

Answered by Baroness Goldie

AI may not inherently reduce workforce requirements, but it is likely to change the activities we need people to undertake. Across the workforce, AI and autonomous systems offer opportunities to remove people from 'dull, dirty and dangerous' tasks. This will enable us to focus our people on those areas where they can add particular value, in the context of Human Machine Teams. Machines are good at doing things right; people are good at doing the right things. Context-appropriate human involvement will be essential for the ethical and legal use of AI-enabled weapon systems. We strongly believe that such systems can and must be used lawfully and ethically, and we will promote security and stability by working closely with allies and partners to build consensus, promote a common vision for the safe, responsible and ethical use of these technologies globally, and push for compliance with International Humanitarian Law.


Written Question
Defence: Artificial Intelligence
Tuesday 20th September 2022

Asked by: Lord Clement-Jones (Liberal Democrat - Life peer)

Question to the Ministry of Defence:

To ask Her Majesty's Government what mechanisms they are considering for compliance or oversight of the use of artificial intelligence in defence; and, in particular, whether the mechanisms they are considering include (1) an internal artificial intelligence regulator, (2) the conversion of the principles included in the Defence Artificial Intelligence Strategy, published on 15 June, into specific standards and procedures, and (3) a means to ensure that weapons systems developed, acquired or deployed by the Ministry of Defence comply with any such standards and procedures.

Answered by Baroness Goldie

The internal regulation and governance of standards for AI use in Defence are currently subject to extensive work across the Department. Officials and military colleagues are developing frameworks to assess risk and ensure compliance, across the full spectrum of AI functionality.

Key aspects of this will include requiring Front-Line Commands and equivalent organisations within Defence to appoint 'Accountable Officers' to ensure oversight of AI activity, and developing the capacity of the Defence Artificial Intelligence Centre to provide technical oversight and coordinated advice to all units across Defence. We are also exploring how the Defence Safety Authority, as the existing independent safety regulator, will consider AI within the wider issues across its current remit.


Written Question
Autonomous Weapons
Tuesday 20th September 2022

Asked by: Lord Clement-Jones (Liberal Democrat - Life peer)

Question to the Ministry of Defence:

To ask Her Majesty's Government, further to the statement in the policy document Ambitious, Safe, Responsible: Our approach to the delivery of AI-enabled capability in Defence, published on 15 June, that global governance for autonomous weapons systems is "a difficult task", how they plan to respond strategically to the identified challenges of global governance in this area; and, in any such strategic response, how they intend to fulfil the aims set out in the Integrated Review of Security, Defence, Development and Foreign Policy, published on 16 March 2021, in relation to international legal, ethical and regulatory standards on the responsible development and use of artificial intelligence.

Answered by Baroness Goldie

The Department is developing its plans to implement the Defence AI Strategy and to address the broader strategic issues arising from the Integrated Review. We will work with partners to mitigate the potential impacts of AI, including its proliferation, misuse and potential for misunderstanding and miscalculation.

Particularly relevant fora include the UN-brokered discussions under the Convention on Certain Conventional Weapons (CCW), the AI Partnership for Defence and NATO, as well as broader discussions on the development of AI within the Global Partnership on AI, UNESCO and the Council of Europe. Compliance with International Humanitarian Law will remain at the core of our current and future standards, which will be rigorously applied to all AI use in Defence.


Written Question
Autonomous Weapons
Tuesday 20th September 2022

Asked by: Lord Clement-Jones (Liberal Democrat - Life peer)

Question to the Ministry of Defence:

To ask Her Majesty's Government, further to the statement in the policy document Ambitious, Safe, Responsible: Our approach to the delivery of AI-enabled capability in Defence, published on 15 June, that there must be "context appropriate human involvement in weapons which identify, select and attack targets", what plans they have to elaborate on the concept of "context appropriate human involvement" to ensure that relevant officers in (1) the Ministry of Defence, and (2) HM Armed Forces, have operational guidance on the acceptability of particular weapons, practices and uses.

Answered by Baroness Goldie

MOD officials and military colleagues are currently exploring processes for the delivery of the approaches set out in the Ambitious, Safe, Responsible policy. This will include consideration of AI across the system lifecycle, including further elaboration of the concept of 'context-appropriate human involvement'.

With respect to the acceptability of particular weapons, Article 36 of Additional Protocol I (1977) to the Geneva Conventions requires States to determine whether new weapons, means or methods of warfare may be employed lawfully under international law. The United Kingdom takes this obligation very seriously, and UK weapon reviews are undertaken by serving military lawyers on the staff of the Development, Concepts and Doctrine Centre (DCDC). This assessment is then fed into usage instructions and authorities for particular systems to ensure that the parameters of lawful and responsible use are fully understood in any particular case.


Written Question
Defence: Artificial Intelligence
Thursday 4th August 2022

Asked by: Lord Clement-Jones (Liberal Democrat - Life peer)

Question to the Ministry of Defence:

To ask Her Majesty's Government, whether they will publish an annual list of all AI defence programmes they are working on, including the stage of implementation and the budget for each programme.

Answered by Baroness Goldie

While Defence is committed to being as transparent as possible about our use of AI technologies, there are no plans at present to publish an annual list of all AI defence programmes, for practical and security reasons.

Defence understands AI as a family of general-purpose technologies with ubiquitous potential applications, from the back office to the battlespace. We will publish an AI 'concept playbook' later this year to help partners understand the areas of Research and Development that we intend to prioritise. However, it is important to understand that in most cases AI will be an enabler for a broader system or capability (e.g. supporting more informed logistics planning), not a capability programme in itself.

Given the range of potential applications, it would not be practical to label and track all sub-elements of projects underway across Defence that include Autonomy or AI. Moreover, in some cases it would not be appropriate to disclose details of Defence capability programmes for security reasons.


Written Question
Defence: Artificial Intelligence
Thursday 4th August 2022

Asked by: Lord Clement-Jones (Liberal Democrat - Life peer)

Question to the Ministry of Defence:

To ask Her Majesty's Government, further to their policy paper Ambitious, Safe, Responsible: Our approach to the delivery of AI-enabled capability in Defence, published on 15 June, what steps they are taking to ensure that (1) scientists, (2) developers, and (3) industry, can operate in an environment where there are adequate controls to prevent their research and technology from being used in ways which may be problematic.

Answered by Baroness Goldie

The Defence AI Strategy, published on 15 June 2022, set out our clear commitment to use AI safely, lawfully and ethically, in line with the standards, values and norms of the society we serve. This is critical to promoting confidence and trust among our people, our partners and the general public.

We will deliver this commitment through a range of robust people, process and technology measures, including: embedding our AI Ethics Principles throughout the entire capability lifecycle; independent scrutiny and challenge from our AI Ethics Advisory Panel; training to ensure our people understand and can appropriately mitigate AI-related risks; publishing as much information as possible about key safeguards (such as our approach to Test and Evaluation); specifying (including through Early Market Engagement) how and why we will utilise algorithms and applications; and ensuring there are effective pathways for individuals to raise ethical or safety concerns.

As we implement these commitments from the AI Strategy - and the associated 'Ambitious, Safe, Responsible' policy - Defence will continue to be outward facing, working with colleagues across the AI and technology industry to understand concerns and identify and embed best practice safeguards.


Written Question
Autonomous Weapons: Treaties
Thursday 27th January 2022

Asked by: Lord Clement-Jones (Liberal Democrat - Life peer)

Question to the Ministry of Defence:

To ask Her Majesty's Government what assessment they have made of the International Committee of the Red Cross's analysis that a new legally binding instrument, including prohibitions and positive obligations, is required to regulate autonomous weapons systems.

Answered by Baroness Goldie

We regularly engage with a wide range of stakeholders on lethal autonomous weapon systems (LAWS), including those - such as the International Committee of the Red Cross - that believe a new legally binding instrument on LAWS is necessary. The UK does not support calls for a legally binding instrument on LAWS. Our view remains that International Humanitarian Law (IHL) provides a robust, principle-based framework for the regulation of weapons development and use, and we will continue to engage at the UN Convention on Certain Conventional Weapons seeking to clarify the prohibitions and positive obligations around the use of autonomous weapon systems under IHL.


Written Question
Autonomous Weapons
Thursday 27th January 2022

Asked by: Lord Clement-Jones (Liberal Democrat - Life peer)

Question to the Ministry of Defence:

To ask Her Majesty's Government what assessment they have made of the potential impact of increasing autonomy in weapons systems on (1) civilian protection, and (2) compliance with international humanitarian law.

Answered by Baroness Goldie

The deployment in armed conflict of any weapon system - including one with autonomous functions - which does not distinguish between combatants and civilians would be contrary to International Humanitarian Law (IHL) and therefore unlawful. We strongly believe that AI and autonomy within weapon systems can and must be used lawfully and ethically. Autonomous systems have the potential to support the better application of IHL by improving the evidence, analysis and timeliness of decision making.


Written Question
Autonomous Weapons
Thursday 27th January 2022

Asked by: Lord Clement-Jones (Liberal Democrat - Life peer)

Question to the Ministry of Defence:

To ask Her Majesty's Government whether they maintain the position that the UK does not possess fully autonomous weapon systems and has no intention of developing them.

Answered by Baroness Goldie

Our position on fully autonomous weapon systems is clear and unchanged. The UK does not possess fully autonomous weapon systems and has no intention of developing them.

When deploying autonomous weapon systems we will always ensure meaningful and context-appropriate human involvement across the system lifecycle from development to deployment, ensuring human responsibility for outcomes.


Written Question
Defence: Innovation and Technology
Thursday 27th January 2022

Asked by: Lord Clement-Jones (Liberal Democrat - Life peer)

Question to the Ministry of Defence:

To ask Her Majesty's Government how they seek to reconcile a focus on tackling the proliferation of advanced military technologies with prioritising the development and integration of new technologies “required for near-peer, high-tech warfighting”, such as “AI-enabled autonomous capabilities” as identified in the Ministry of Defence’s Defence in a Competitive Age paper, published in March 2021.

Answered by Baroness Goldie

The Ministry of Defence is committed to developing and deploying AI-enabled systems responsibly and promoting responsible use worldwide.

The UK will work with allies and partners to address the issue of proliferation of advanced military technologies such as AI-enabled autonomous capabilities. This will include reinforcement of the disarmament and export control regimes, treaties and organisations; development of the means of preventing AI proliferation or misuse; and monitoring the risks of AI exacerbating existing counter-proliferation and arms control challenges. This will ensure that the opportunities gained from the development and integration of new technologies are balanced with appropriate controls.