Lords Chamber
My Lords, I am delighted to follow on from the noble Baroness, Lady Chakrabarti, who always seems to be a great source of common sense on complex moral issues. I am similarly delighted to support the amendment in the name of my one-time boss, the noble Lord, Lord Browne of Ladyton. I will not seek to repeat his arguments as to why this amendment is important, but rather to complement his very strong justification with my own specific thoughts and nuances.
I will start with some general comments on the Bill, as this is my only contribution at this stage. At Second Reading I made my own views on this Bill quite clear. I felt that it missed the main issues regarding the challenges of Lawfare. Specifically, I felt that the better route to reducing the problem of vexatious claims was not through resort to legal exceptionalism, but rather rested on a series of more practical measures relating to such things as investigative capacity, quality and speed; better training; improved operational record keeping; more focused leadership, especially in the critical area of command oversight; and a greater duty of care by the chain of command. On this latter, I wholly support the amendment of my noble friend Lord Dannatt.
Having listened to the arguments deployed in Committee, I am struck by the seeming inability of even this sophisticated Chamber to reach a common view as to whether the many provisions of this Bill offer enhanced protections or increased perils for our servicemen and women. This causes me grave concern. How much more likely is it that our servicemen and women—those whose primary desire is to operate within the law—will be confused; and how much more likely is it that our enemies—those who want to exploit the law for mischief—will be encouraged?
I hold to the view that the law, in any formulation, cannot be fashioned into a weapon of decisive advantage in our bid to rid our people of vexatious claims. Rather, the law will increasingly be exploited by our enemies as a vector of attack, both to frustrate our ability to use appropriate force and to find novel ways of accusing our servicemen and women of committing illegal acts. The solution to this problem is a mixture of functional palliatives and better legal preparedness. This amendment addresses one element of this preparedness.
As we have already heard, one area of new legal challenge will undoubtedly be in the realm of novel technologies, particularly those which employ both artificial intelligence and machine learning to give bounded autonomy to unmanned platforms, which in turn have the ability to employ lethal force. We are currently awaiting the imminent outcome of the integrated review, and we understand that a defence command paper will herald a new era of technological investment and advancement: one that will enable a significant reduction in manned platforms as technology permits elements of conflict to be subordinated to intelligent drones and armed autonomous platforms.
However—and this is the basic argument for this amendment—the personal liability for ensuring that action in conflict is legal will not cease, although it may become considerably more opaque. We must therefore ask whether we have yet assessed the moral, legal, ethical and alliance framework and protocols within which these new systems will operate. Have we yet considered and agreed the command and control relationships, authorities and delegations on which will rest the legal accountability for much new operational activity?
Personally, I have a separate and deep-seated concern that a fascination with what is technically feasible is being deployed by the Government, consciously or unconsciously, primarily as the latest alchemy by which defence can be made affordable. It is being deployed without properly understanding whether its true utility will survive the moral and legal context in which it will have to operate. I therefore offer my full support to this amendment, in the hope that it will assist us in getting ahead of the problem. The alternative is suddenly waking up to the fact that we have created Armed Forces that are both exquisite and unusable in equal measure.
My Lords, I thank my noble friend Lord Browne, the noble Lord, Lord Clement-Jones, and the noble and gallant Lord, Lord Houghton, for bringing forward this important amendment and debate. I understand my noble friend Lord Browne’s concerns about the mismatch between the future-focused integrated review, which has been much delayed but will, we hope, be published next week, and the legislation we have in front of us.
Technology is not only changing the kinds of threats we face but changing warfare and overseas operations in general. In Committee in the other place, Clive Baldwin of Human Rights Watch neatly summed this up by suggesting that
“we are seeing a breakdown in what is the beginning and the end of an armed conflict, what is the battlefield and what decisions are made in which country … The artificial distinction of an overseas operation with a clear beginning, a clear theatre and a clear end is one that is very much breaking down.”—[Official Report, Commons, Overseas Operations (Service Personnel and Veterans) Bill Committee, 6/10/20; col. 67.]
How is this reflected in the Bill?
When the Prime Minister gave his speech on the integrated review last year, he rightly said that “technologies …will revolutionise warfare” and announced a new centre dedicated to AI and an RAF fighter system that will harness AI and drone technology. This sounds impressive but, as my noble friend Lord Browne said, as military equipment gets upgraded, we do not know how the Government plan to upgrade legal frameworks for warfare and what this means in terms of legal protection for our troops.
We must absolutely tackle vexatious claims and stop the cycle of reinvestigations, but how will claims against drone operators or personnel operating new technology be handled? Do those service personnel who operate UAVs not deserve to be protected? And how will legal jeopardy for our troops be avoided?
As new technology develops, so too must our domestic and international frameworks. The final report of the US National Security Commission on Artificial Intelligence stated that the US commitment to international humanitarian law
“is longstanding, and AI-enabled and autonomous weapon systems will not change this commitment.”
Do the Government believe the same?
I would also like to highlight the serious impact on troops who might not be overseas, but who are operating drones abroad. A former drone pilot told the Daily Mirror:
“The days are long and hard and can be mentally exhausting. And although UAV pilots are detached from the real battle, it can still be traumatic, especially if you are conducting after-action surveillance.”
The RUSI research fellow Justin Bronk also said that, as drone operators switch daily between potentially lethal operations and family life, this can be extremely draining and psychologically taxing. What mental health and pastoral support is currently given to these troops? Drone operators may not be physically overseas, but they are very much taking part in overseas operations. With unmanned warfare likely to be more common in future conflicts, I would argue that failing to include those operations in the Bill may cause issues for service personnel down the line.
I would like to hear from the Minister how this legislation will keep up to date with how overseas operations operate, and whether she is supportive of a review along the lines of Amendment 32—and, if not, why not?