Monday 17th June 2013


Commons Chamber
Nia Griffith

My hon. Friend is right. There will be a huge imbalance between countries that have these technologies and the potential to use them, and countries that do not.

LARs increase the distance, physical and emotional, between weapons users and the lethal force that they inflict. Drones already offer the states that deploy them the military advantage of being able to carry out operations without endangering their own military personnel, and thus distance the operators from the action. LARs would take that a crucial step further by lessening the weight of responsibility felt by humans when they make the decision to kill. They could lead to a vacuum of moral responsibility for such decisions.

Secondly, LARs give rise to legal issues. Given that they would be activated by a human being but no human being would make the specific decision to deploy lethal force, it is fundamentally unclear who would bear legal responsibility for their actions. If the legal issues are not tackled, an accountability vacuum could be created, granting impunity to all users of LARs. Furthermore, robots may never be able to meet the requirements of international humanitarian law, as its rules of distinction and proportionality require the distinctively human ability to understand context and to make subjective estimates of value. The open-endedness of the rule of proportionality in particular, combined with complex circumstances on a battlefield, could result in undesired and unexpected behaviour by LARs. It is clear that existing law was not written to deal with LARs.

Thirdly comes a multitude of terrifying practical concerns. The lowered human cost of war to states with LARs, as my hon. Friend the Member for Newport West (Paul Flynn) pointed out, could lead to the “normalisation” of armed conflict. A state with LARs could choose to pit deadly robots against human soldiers on foot, presenting the ultimate asymmetrical situation. States could be faced with the temptation of using LARs outside of armed conflict, finding themselves able to eliminate perceived “troublemakers” anywhere in the world at the touch of a button. LARs could be hacked or appropriated, possibly for use against the state, and they could malfunction, with deadly consequences.

This report corroborates the revolutionary difference between LARs and any previous weapons system, and proves the following: that our current understanding of the nature of war cannot support them; that our existing legislation cannot regulate them; and that we cannot predict the effects that they may have on our future world.

What is called for worldwide in response is both an urgent course of action, and a mutual commitment to inaction: immediate action to ensure transparency, accountability and the rule of law are maintained; and agreement to inaction in the form of a global moratorium on the testing, production, assembly, transfer, acquisition, deployment and use of LARs until an international consensus on appropriate regulations can be reached.

Will the Minister explain the Government’s position on the recommendations in the UN report? It calls on all states to do the following: put in place a national moratorium on lethal autonomous robotics; participate in international debate on lethal autonomous robotics, and in particular to co-operate with a proposed high-level panel to be convened by the UN High Commissioner for Human Rights; commit to being transparent about internal weapons review processes; and declare a commitment to abide by international humanitarian law and international human rights law in all activities surrounding robotic weapons.

At the UN Human Rights Council in Geneva, a large number of states expressed the need to ensure legal accountability for LARs and pledged support for a moratorium. The UK was the only state to oppose a moratorium. Did the UK really consider existing law to be sufficient to deal with fully autonomous weapons, and was it completely dismissing the idea of national moratoriums on the development and deployment of LARs? What evaluation of the recommendations for an international moratorium, for transparency over weapons review processes, for discussion of the limits of international humanitarian law and international human rights law, and for engagement in international dialogue did the Government carry out in advance of the debate in Geneva two weeks ago? I believe the UK should take a leading role in limiting the use of LARs, and use our considerable standing on the world stage to bring nations together to negotiate.

The UN report recommended a “collective pause”—time to reflect and examine the situation with open eyes, before the demands of an arms race, and of heavy investment in the technology, make such a pause impossible. Only with multilateral co-operation can an effective moratorium be achieved. As Christof Heyns observes, if nothing is done,

“the matter will, quite literally, be taken out of human hands.”

Turning to the UK’s own policy, in answers given by Lord Astor in the other place and a Ministry of Defence note, the UK has stated that

“the operation of weapons systems will always—always—be under human control”—[Official Report, House of Lords, 26 March 2013; Vol. 744, c. 960.]

and that

“no planned offensive systems are to have the capability to prosecute targets without involving a human.”—[Official Report, House of Lords, 7 March 2013; Vol. 743, c. WA411.]

This could form the positive basis of a strong policy, but further clarification and explanation are urgently required, and there has been no mention of a moratorium.

In November 2012 the USA outlined its policy and committed itself to a five-year moratorium. In a Department of Defense directive, the United States embarked on an important process of self-regulation regarding LARs, recognising the need for domestic control of their production and deployment, and imposing a form of moratorium. The directive provides that autonomous weapons

“shall be designed to allow commanders and operators to exercise appropriate levels of human judgement over the use of force.”

Specific levels of official approval for the development and fielding of different forms of robots are identified. In particular, the directive bans the development and fielding of LARs unless certain procedures are followed. The UN report notes that this important initiative by a major potential LARs producer should be commended and that it may open up opportunities for mobilising international support for national moratoriums.

During a Westminster Hall debate on 11 December 2012, my hon. Friend the Member for North Durham (Mr Jones), the Opposition Defence spokesman, expressed support for the move by the United States to codify the use of UAVs. He suggested that the UK examine whether it should, in addition to existing law, have a code covering: the contexts and limitations of usage; the process for internal Government oversight of deployments; command and control structures; and acceptable levels of automation. The Minister who responded to the debate rejected that suggestion on the grounds of operational security—this may be one of the big stumbling blocks.

However, now that we are talking about the development of LARs, we do need greater clarity, both in respect of UK policy and on the international stage. Existing international humanitarian law and international human rights law never envisaged weapons making autonomous decisions to kill. Deciding what to do about LARs is not like simply banning a chemical agent—it is far more complex than that. We are talking about technological know-how that can be used in so many different ways, so we need to sit down with other countries to look at the limitations of international humanitarian law and international human rights law.

Tessa Munt (Wells) (LD)

I have listened to what the hon. Lady has said. Does she agree that the only way forward is an explicit ban, because international humanitarian law was, as she said, written before anyone could contemplate fully autonomous weapons? Does she agree that the most important thing is for human beings to make morally based decisions to stay within the law and the only way forward is a full ban?

Nia Griffith

The hon. Lady makes a very valid point, which shows why the negotiations are so crucial. We need to define exactly what is meant by LARs and examine that international law to see what we can do to regulate all the appropriate weapons. If we are to make progress on banning LARs, nations need to be clear about exactly what we mean.

We then need to look at what mechanisms could be used. One suggestion would be to use the convention on certain conventional weapons, the mechanism used to make an international legal agreement on a pre-emptive ban on blinding lasers. Another option would be to use the process that led to 107 states adopting the convention on cluster munitions, five years ago last month. That treaty was groundbreaking for three main reasons: first, it banned an entire category of weapons; secondly, it brought a ban into existence before the use of cluster munitions had become widespread; and, thirdly, the treaty process was multilateral, shaped through the initiative and sustained leadership of the Norwegian Government, with a strong partnership between states and organisations working together towards a clear common goal.

The UK needs to be at the forefront of the debate on LARs. Now is the time for further international discussion. Now is the time to encourage a wide range of states to adopt a moratorium on the development and deployment of LARs until a new international legal framework has been developed that takes account of the potential of LARs and lays the basis for discussion on their future regulation or prohibition. I very much hope that we will see the UK taking a lead on this matter.

The Parliamentary Under-Secretary of State for Foreign and Commonwealth Affairs (Alistair Burt)

I thank the hon. Member for Llanelli (Nia Griffith), not only for bringing a very serious matter to the House and explaining it clearly, but for her immense courtesy this afternoon in sending us a copy of her speech, which enabled me to discuss it with officials and therefore answer the four key questions that she has raised.

I thank the hon. Lady for bringing the issue of lethal autonomous robotics before Parliament. It is clear from this debate and the one recently at the UN Human Rights Council in Geneva that this is an important subject which will inevitably become ever more so as technology develops. Let me clarify the scope of today’s debate. I agree with her that LARs are weapon systems which, once activated, can select and engage targets without any further human intervention. Her definition was correct and it is clearly one step on from drones, which have a human component—I will come back to discuss that in a moment.

Let me be very clear and back up the comments made by my noble Friend Lord Astor in the other place and quoted by the hon. Lady. He stated that

“the operation of weapons systems will always…be under human control”—[Official Report, House of Lords, 26 March 2013; Vol. 744, c. 960.]

and that

“no planned offensive systems are to have the capability to prosecute targets without involving a human”.—[Official Report, House of Lords, 7 March 2013; Vol. 743, c. WA411.]

Let me reiterate that the Government of the United Kingdom do not possess fully autonomous weapon systems and have no intention of developing them. Such systems are not yet in existence and are not likely to be for many years, if at all. Although a limited number of defensive systems can currently operate in automatic mode, there is always a person involved in setting the parameters of any such mode. As a matter of policy, Her Majesty’s Government are clear that the operation of our weapons will always be under human control as an absolute guarantee of human oversight and authority and of accountability for weapons usage.

By putting that information on the record I hope to make it clear that we share the concern that the hon. Lady has brought before the House, which others share, about possible technological developments. My argument is that the UK believes that the basis of international law governing weapons systems would prevent the development of weapons in the way that she suggests, but whether or not that is the case, the UK’s position on wishing to develop such weapons is absolutely clear.

The United Kingdom always acts fully in accordance with international humanitarian law and international standards. We are committed to upholding the Geneva conventions and their additional protocols and encourage others to do the same. We always ensure that our military equipment is used appropriately and is subject to stringent rules of engagement. I shall discuss that in more detail later.

I thank the hon. Lady for her summary of the report presented by Christof Heyns, the special rapporteur on extrajudicial, summary or arbitrary executions, which was discussed in Geneva on 30 May. Let me summarise the report. Mr Heyns highlighted that the “possible” use of lethal autonomous robotics raises far-reaching concerns about the protection of life during war and peace. In his findings, he recommended that states establish national moratoriums on aspects of lethal autonomous robotics and called for the establishment of a high-level panel to produce a policy for the international community on the issue.

The hon. Lady asked whether the Government were willing to accept the four recommendations made in the report. I believe the point she particularly wanted to discuss was the question of why, as she said, the UK was the only state that did not support a moratorium. Let me make things a little clearer, if I may. The UK has unilaterally decided to put in place a restrictive policy whereby we have no plans at present to develop lethal autonomous robotics, but we do not intend to formalise that in a national moratorium. We believe that any system, regardless of its level of autonomy, should only ever be developed or used in accordance with international humanitarian law. We think the Geneva conventions and additional protocols provide a sufficiently robust framework to regulate the development and use of these weapon systems.

As I had the chance to read the hon. Lady’s speech before the debate, I noticed that she used the phrase “Furthermore, robots may never be able to meet the requirements of international humanitarian law”. She is absolutely correct; they will not. We cannot develop systems that would breach international humanitarian law, which is why we are not engaged in the development of such systems and why we believe that the existing systems of international law should prevent their development.

Tessa Munt

What is in place to protect against the development of such weapons systems by UK or UK-based companies, whether that is for export or to be taken to another destination, not to be used by us?

Alistair Burt

The basis of the Government’s argument, made by me and by my noble Friend in the other place, is that the system of law and conventions that govern the development of weapons would prevent anyone from developing the weapon in such a manner as the hon. Member for Llanelli has suggested. It would not fit export criteria, so I do not think that we are at odds on that. The issue is whether the legal framework is sufficiently robust to prevent that. The United Kingdom, having made its own decision that it is not developing these weapons, believes that the basis of the legal system on weaponry is such as to prevent that development.