Autonomous Weapons

(asked on 5th September 2022)

Question to the Ministry of Defence:

To ask Her Majesty's Government, further to the statement in the policy document Ambitious, Safe and Responsible - Our approach to the delivery of AI-enabled capability in Defence, published on 15 June, that there must be "context appropriate human involvement in weapons which identify, select and attack targets", what plans they have to elaborate on the concept of "context appropriate human involvement" to ensure that relevant officers in (1) the Ministry of Defence, and (2) HM Armed Forces, have operational guidance on the acceptability of particular weapons, practices and uses.


Answered by Baroness Goldie
This question was answered on 20th September 2022

MOD officials and military colleagues are currently exploring processes for delivering the approaches set out in the Ambitious, Safe, Responsible policy. This work will include consideration of AI across the system lifecycle, including further elaboration of the concept of 'context appropriate human involvement'.

With respect to the acceptability of particular weapons, Article 36 of Additional Protocol 1 (AP 1) of 1977 to the Geneva Conventions requires States to determine whether new weapons, means or methods of warfare may be employed lawfully under international law. The United Kingdom takes this obligation very seriously, and UK weapon reviews are undertaken by serving military lawyers on the staff of the Development, Concepts and Doctrine Centre (DCDC). These assessments then feed into usage instructions and authorities for particular systems, to ensure that the parameters of lawful and responsible use are fully understood in any particular case.
