Lethal Autonomous Robotics Debate
Paul Flynn (Labour - Newport West), debate with the Foreign, Commonwealth & Development Office
Commons Chamber

I should like to thank you, Mr Speaker, for allowing me this debate to bring to the House’s attention the issue of lethal autonomous robotics, or LARs, which are sometimes referred to as “killer robots”. I have the privilege of being the vice-chair of the all-party parliamentary group on weapons and the protection of civilians, and I wish to raise the issue of what plans the Government have to engage in international talks to try to limit, through means such as international regulation, the development and proliferation of such weapons. I believe that the UK has a key role to play in international talks and that without concerted international agreement and pressure it is unrealistic to anticipate that many individual states will pause in their drive for ever-increasing technological advantage.
This debate is timely because just over a fortnight ago, on 29 May, Christof Heyns, the UN special rapporteur on extrajudicial, summary or arbitrary executions, presented a report on lethal autonomous robotics to the UN Human Rights Council in Geneva. The following day, 24 states took part in discussions on the report, and with the exception of the UK, they all agreed on the need for further debate. Germany and the United States were among those who expressed a particular willingness for further international discussions. Brazil and France urged the need for an arms control forum, and suggested using the convention on certain conventional weapons. That is a framework convention with protocols on specific issues. It is the mechanism that was used to make an international legal agreement to ban the use of blinding lasers before they were ever deployed.
Before returning to the need for international dialogue, I want briefly to explain what we mean by lethal autonomous robotics and why we need to take action now. I also want to highlight some of the concerns raised in the UN special rapporteur’s report. The term LARs refers to
“robotic weapons systems that, once activated, can select and engage targets without further intervention by a human operator”.
The key element is that the robot has the power to “choose” a target independently and to “decide” to use lethal force against that target. That element of full autonomy means that LARs represent more than just a game-changing development in weapons technology. They represent a revolution.
LARs have sometimes been grouped with modern unstaffed weapons systems, such as remotely piloted aircraft systems, sometimes called unmanned aerial vehicles but more commonly known as drones. However, they go a considerable step further than drones. LARs are fully autonomous weapons systems which, once activated, can select and use lethal force against targets without further human intervention. The key departure from existing military technology—the factor that differentiates LARs from unmanned weapons systems such as drones—is the absence of human intervention once a fully autonomous weapons system has been activated. A robot would be able to make the decision to kill a human being, which has never been the case before. For that reason, LARs would constitute not an upgrade of the weapons that are currently in our arsenals, but a fundamental change in the nature of war. LARs explode our legal and moral codes, which assume that the decision-making power of life and death will be the responsibility of a human being, never a machine.
It is the natural horror of a scenario in which a robot could decide to kill a human that has led to the description of LARs as “killer robots”. They are also sometimes described as “fully autonomous weapons”. Whatever the label, no lethal, fully autonomous weapons system has yet been deployed, but we need urgent action now, before further technological development and investment make a race toward killer robots impossible to stop. Make no mistake: technological know-how is widespread, and it is estimated that more than 70 countries have military robotics programmes. The United Kingdom is a leader in the field of sophisticated high-tech military industries, and is therefore at the forefront of development of the types of technology that could be used in LARs.
Inevitably, much of the development of LARs worldwide is shrouded in secrecy, including development in the UK. What we do know is that weapons technology is developing at an ever-increasing pace, and it is therefore very difficult to determine how close we are to the production of LARs that are ready to be used. Weapons systems with various degrees of autonomy and lethality are already being developed. One is the UK’s Taranis system, a jet-propelled combat drone prototype that can search for, identify and locate enemies autonomously, and can defend itself against enemy aircraft without human intervention. It is clear that LARs are not a fantasy of science fiction, or a technology belonging to the distant future; they are a real possibility for our time.
The considered, comprehensive and balanced report by Christof Heyns, which was published on 9 April, raised a plethora of concerns about LARs. First, it drew attention to the moral dilemmas presented by them.
Does my hon. Friend agree that the gravest danger posed by these weapons is their perpetuation of the philosophy that might is right? Is it not the case that, while the use of sophisticated technology in certain countries against other, unsophisticated countries may secure victories in the short term, huge resentments will be built up because of that difference in technology, and will leave a legacy of continuing conflict?
My hon. Friend is right. There will be a huge imbalance between countries that have these technologies and the potential to use them, and countries that do not.
LARs increase the distance, physical and emotional, between weapons users and the lethal force that they inflict. Drones already offer the states that deploy them the military advantage of being able to carry out operations without endangering their own military personnel, and thus distance the operators from the action. LARs would take that a crucial step further by lessening the weight of responsibility felt by humans when they make the decision to kill. They could lead to a vacuum of moral responsibility for such decisions.
Secondly, LARs give rise to legal issues. Given that they would be activated by a human being but no human being would make the specific decision to deploy lethal force, it is fundamentally unclear who would bear legal responsibility for the actions performed by them. If the legal issues are not tackled, an accountability vacuum could be created, granting impunity for all LARs users. Furthermore, robots may never be able to meet the requirements of international humanitarian law, as its rules of distinction and proportionality require the distinctively human ability to understand context and to make subjective estimates of value. The open-endedness of the rule of proportionality in particular, combined with complex circumstances on a battlefield, could result in undesired and unexpected behaviour by LARs. It is clear that existing law was not written to deal with LARs.
Thirdly comes a multitude of terrifying practical concerns. The lowered human cost of war to states with LARs, as my hon. Friend the Member for Newport West (Paul Flynn) pointed out, could lead to the “normalisation” of armed conflict. A state with LARs could choose to pit deadly robots against human soldiers on foot, presenting the ultimate asymmetrical situation. States could be faced with the temptation of using LARs outside of armed conflict, finding themselves able to eliminate perceived “troublemakers” anywhere in the world at the touch of a button. LARs could be hacked or appropriated, possibly for use against the state, and they could malfunction, with deadly consequences.
This report corroborates the revolutionary difference between LARs and any previous weapons system, and proves the following: that our current understanding of the nature of war cannot support them; that our existing legislation cannot regulate them; and that we cannot predict the effects that they may have on our future world.
What is called for worldwide in response is both an urgent course of action, and a mutual commitment to inaction: immediate action to ensure transparency, accountability and the rule of law are maintained; and agreement to inaction in the form of a global moratorium on the testing, production, assembly, transfer, acquisition, deployment and use of LARs until an international consensus on appropriate regulations can be reached.
Will the Minister explain the Government’s position on the recommendations in the UN report? It calls on all states to do the following: put in place a national moratorium on lethal autonomous robotics; participate in international debate on lethal autonomous robotics, and in particular to co-operate with a proposed high-level panel to be convened by the UN High Commissioner for Human Rights; commit to being transparent about internal weapons review processes; and declare a commitment to abide by international humanitarian law and international human rights law in all activities surrounding robotic weapons.
At the UN Human Rights Council in Geneva, a large number of states expressed the need to ensure legal accountability for LARs and pledged support for a moratorium. The UK was the only state to oppose a moratorium. Did the UK really consider existing law to be sufficient to deal with fully autonomous weapons, and was it completely dismissing the idea of national moratoriums on the development and deployment of LARs? What evaluation of the recommendations for an international moratorium, for transparency over weapons review processes, for discussion of the limits of international humanitarian law and international human rights law, and for engagement in international dialogue did the Government carry out in advance of the debate in Geneva two weeks ago? I believe the UK should take a leading role in limiting the use of LARs, and use our considerable standing on the world stage to bring nations together to negotiate.
The UN report recommended a “collective pause”—time to reflect and examine the situation with open eyes, before the demands of an arms race, and of heavy investment in the technology, make such a pause impossible. Only with multilateral co-operation can an effective moratorium be achieved. As Christof Heyns observes, if nothing is done,
“the matter will, quite literally, be taken out of human hands.”
Turning to the UK’s own policy, in answers given by Lord Astor in the other place and a Ministry of Defence note, the UK has stated that
“the operation of weapons systems will always—always—be under human control”—[Official Report, House of Lords, 26 March 2013; Vol. 744, c. 960.]
and that
“no planned offensive systems are to have the capability to prosecute targets without involving a human.”—[Official Report, House of Lords, 7 March 2013; Vol. 743, c. WA411.]
This could form the positive basis of a strong policy, but further clarification and explanation are urgently required, and there has been no mention of a moratorium.
In November 2012 the USA outlined its policy and committed itself to a five-year moratorium. In a Department of Defence directive, the United States embarked on an important process of self-regulation regarding LARs, recognising the need for domestic control of their production and deployment, and imposing a form of moratorium. The directive provides that autonomous weapons
“shall be designed to allow commanders and operators to exercise appropriate levels of human judgement over the use of force.”
Specific levels of official approval for the development and fielding of different forms of robots are identified. In particular, the directive bans the development and fielding of LARs unless certain procedures are followed. The UN report notes that this important initiative by a major potential LARs producer should be commended and that it may open up opportunities for mobilising international support for national moratoriums.
During a Westminster Hall debate on 11 December 2012, my hon. Friend the Member for North Durham (Mr Jones), the Opposition Defence spokesman, expressed support for the move by the United States to codify the use of UAVs. He suggested that the UK examine whether it should, in addition to existing law, have a code covering: the contexts and limitations of usage; the process for internal Government oversight of deployments; command and control structures; and acceptable levels of automation. The Minister who responded to the debate rejected that suggestion on the grounds of operational security—this may be one of the big stumbling blocks.
However, now that we are talking about the development of LARs, we do need greater clarity, both in respect of UK policy and on the international stage. Existing international humanitarian law and international human rights law never envisaged weapons making autonomous decisions to kill. Deciding what to do about LARs is not like simply banning a chemical agent—it is far more complex than that. We are talking about technological know-how that can be used in so many different ways, so we need to sit down with other countries to look at the limitations of international humanitarian law and international human rights law.
The basis of the Government’s argument, made by me and by my noble Friend in the other place, is that the system of law and conventions that govern the development of weapons would prevent anyone from developing the weapon in such a manner as the hon. Member for Llanelli has suggested. It would not fit export criteria, so I do not think that we are at odds on that. The issue is whether the legal framework is sufficiently robust to prevent that. The United Kingdom, having made its own decision that it is not developing these weapons, believes that the basis of the legal system on weaponry is such as to prevent that development.
Will the Minister explain the distinction that he makes? In a meeting held in this place, one of the noble Lords with great experience in the Navy gave an example of a weapon that is used now which, once the parameters have been set, would work entirely automatically without any human intervention. What is the difference between that and the prospect of fully autonomous weapons?
My understanding, having discussed this with officials, is that it is the setting of the parameters that is the human element. For example, once the parameters of an existing weapons system that seeks to identify and defend itself against missiles coming at one of our ships in a situation of conflict have been set, plainly an operator would not be needed to press the button each second to fire off the missiles—the system would do that automatically. That is an automatic system where the parameters have been set. What is envisaged through lethal autonomous robotics is a step beyond that, which no one has reached. To use the definition that the hon. Member for Llanelli gave right at the beginning and which I cited, that would be weapons systems which, once activated, could select and engage targets without any further human intervention. Those are not drones; they are a step beyond.
The hon. Lady has rightly observed that this is a complicated area, where further international discussion would help to clarify the legal and political implications of the possible future development of this technology. Like others, we think that the Human Rights Council is not the right forum for the discussion, but we stand ready to participate in the international debate and we agree that the convention on certain conventional weapons seems the right place for this important issue.
The hon. Lady asked why the UK was the only country to resist the call for a moratorium. I have set out our willingness to adopt a more restrictive policy than the legal freedom afforded, and our commitment to uphold international humanitarian law and to encourage others to do the same. I do not believe that our approach is so different from that of the United States and our European allies.
We did not interpret the discussion in Geneva in quite the same way as the hon. Lady. We believe that French and US attitudes are very similar to our own. Although some states spoke in favour of some sort of regulation or control, many did not, and we should not take that as universal support for a moratorium, given the number of states that did not express a view. Our sense is that support for a moratorium is far less than indicated by the hon. Lady. That does not in any way negate the concerns, but we are not quite sure that people are where she suggests they are in relation to a moratorium.
The law of armed conflict already addresses the ethical and moral aspects of these weapons systems to ensure adherence to principles of discrimination, proportionality, military necessity and humanity to protect people from unnecessary suffering. The selection and prosecution of all targets is always based on rigorous scrutiny which complies with international humanitarian law, UK rules of engagement and targeting policy.
The hon. Lady also asked me to elaborate on what the Government mean by human control and what level of human control they believe is sufficient, which is also the point behind the question asked by the hon. Member for Newport West (Paul Flynn). Targets will always be positively identified as legitimate military objectives with an appropriate level of command authority and control in their selection and prosecution. The UK is legally obliged to ensure that all weapons and associated equipment that it obtains or plans to acquire or develop comply with the UK’s treaty and other obligations in accordance with international humanitarian law. We do this through legal weapons review. For equipment to be procured, it must satisfy those key legal principles. The policy on the necessity, responsibility and conduct of article 36 reviews will be placed in the Library of the House.
International humanitarian law was designed to withstand future changes in technology. Although we have been discussing matters that are still far beyond the present technology, we believe that the legal system has in mind such future developments. We encourage all states to meet their obligations under international humanitarian law. We believe that the development and use of weapons should always be fully compliant with international law, including the Geneva conventions. We are working closely with the Government of Switzerland and the International Committee of the Red Cross on an initiative to strengthen compliance with international humanitarian law, and one of our primary objectives for the arms trade treaty was that it should put compliance with international humanitarian law at the heart of Governments’ decisions about the legitimate arms trade. We have voiced, and will continue to voice, our concerns with those states that do not live up to their obligations.
As I mentioned earlier, the United Kingdom does not have fully autonomous weapon systems, and the Ministry of Defence’s science and technology programme does not fund research into fully autonomous weapons. No planned offensive systems are to have the capability to prosecute targets without involving a human in the decision-making process.
There are a number of areas where United Kingdom policy is currently more restrictive than the legal freedoms allowed. We consider that to be entirely prudent. However, we cannot predict the future; we cannot know now how this technology will develop. Given the challenging situations in which we expect our armed forces personnel to operate now and in the future, it would be wrong to deny them legitimate and effective capabilities that can help them to achieve their objectives as quickly and safely as possible. We have a responsibility to the people who protect us, and must therefore reserve the right to develop and use technology as it evolves in accordance with established international law. Our current position on the development of these weapons is very clear, and I thank the hon. Member for Llanelli for giving me this opportunity to explain that to the House.
Question put and agreed to.