Armed Forces Bill Debate

Department: Ministry of Defence
Moved by
59: After Clause 18, insert the following new Clause—
“Liability for using novel technologies: review
(1) Within three months of this Act being passed, the Secretary of State must commission a review of the implications of increasing autonomy associated with the use of artificial intelligence and machine learning, including in weapons systems, for legal proceedings against armed forces personnel that arise from overseas operations, and produce recommendations for favourable legal environments for the United Kingdom’s armed forces operating overseas, including instilling domestic processes and engaging in the shaping of international agreements and institutions.
(2) The review must consider—
(a) what protection and guidance armed forces personnel need to minimise the risk of legal proceedings being brought against them which relate to overseas operations in response to novel technologies,
(b) how international and domestic legal frameworks governing overseas operations need to be updated in response to novel technologies, and
(c) what novel technologies could emerge from the Ministry of Defence and the United Kingdom’s allies, and from the private sector, which could be used in overseas operations.
(3) Within the period of one year beginning on the day on which the review is commissioned, the Secretary of State must lay a report before Parliament of its findings and recommendations.”
Lord Browne of Ladyton (Lab)

My Lords, this amendment is also in the names of the noble Lord, Lord Clement-Jones, and the noble and gallant Lords, Lord Houghton of Richmond and Lord Craig of Radley. I am very grateful to them for joining me in this amendment, and I convey the apologies of the noble Lord, Lord Clement-Jones, who is unable to be present today because he had a prior, immovable commitment to be abroad representing your Lordships’ House in a meeting.

Amendment 59 focuses on the protection and guidance that Armed Forces personnel engaged in the deployment and use of new technologies will need to ensure that they comply with the law, including international humanitarian law, and that will explain how international and domestic legal frameworks need to be updated—all because of the predicted increased use of novel technologies that could emerge from or be deployed by the Ministry of Defence, UK allies or the private sector.

Today the private sector is often deployed with our Armed Forces on overseas operations as part of a multinational force. The amendment imposes an obligation on the Secretary of State to commission a review of the relevant issues, sets out what that review must consider and obliges the Secretary of State to lay before Parliament a report of the review’s findings and recommendations.

That is the focus of the amendment but underlying it is a much broader issue about the duties of the Government for our Armed Forces in respect of the development, deployment and use of these technologies, and a complementary obligation on the Government to ensure that they are accountable to Parliament for these developments—to the extent, of course, that they can be.

Noble Lords will recall that the same amendment was tabled and debated during the passage of the overseas operations Bill but was not pressed to a vote. Separately, on behalf of those noble Lords who supported it, I told the Minister that it was our intention to bring it back in this context, which is perhaps a more appropriate and broader context for the amendment.

I thank the Minister and pay tribute to her and to the MoD officials who are wrestling with the complex legal challenges posed by the development and deployment of these weapons systems for their work on that, and for their repeated engagement with me and other noble and noble and gallant Lords, including those who have put their names to this amendment. As a result of that engagement, I am very aware that the Ministry of Defence continues, and has continued over recent months at pace, both domestically and internationally, to work hard on this, and is making progress with these complex challenges.

I do not want to take unnecessary time going over again all the arguments made in support of the measure in the overseas operations Bill context. I take them as read. There are still unanswered questions, but I hope that, over time, they may be answered. I shall refer to some of them, and more recent developments, for another purpose, which is to set the context, and reinforce the importance, of addressing these challenges—so I shall repeat a few points that I made in earlier debates.

First, the integrated review, published in March, was the third defence and security review since 2010, which alone is an indication of the pace at which these developments are taking place. It was described as forward-facing, recognising both current and future threats against the UK, and set out the capabilities that will need to be developed to deter and engage them. It does do that—imperfectly, I have to say, but it does do it.

When the Prime Minister made a Statement on the review in November last year, he said that

“now is the right time to press ahead”

with the modernisation of the Armed Forces because of

“emerging technologies, visible on the horizon”.—[Official Report, Commons, 19/11/20; col. 488.]

The Prime Minister said that these would “revolutionise warfare” and I think he was right. The CGS, General Sir Mark Carleton-Smith, said that he foresees the army of the future as

“the integration of boots and bots”.

The noble and gallant Lord, Lord Houghton of Richmond, who is with us today, has repeatedly warned your Lordships about the risks posed by the intersection of artificial intelligence and human judgment, and has spoken wisely about the dangers of technology interacting with human error.

These risks are with us now and they are very real. Last month, the retired General Stanley McChrystal, who led the coalition forces in Afghanistan for two years, said that artificial intelligence will inevitably come to make lethal decisions on the battlefield. However, he acknowledged the “frightening” risks of potential malfunction or mistake. He said:

“People say, ‘We’ll never give control over lethal strike to artificial intelligence.’ That’s wrong. We absolutely will. Because at a certain point, you can’t respond fast enough, unless you do that. A hypervelocity missile, hypersonic missile coming at the United States aircraft carrier, you don’t have time for individuals to do the tracking, you don’t have time for senior leaders to be in the decision loop, or you won’t be able to engage the missile.”


Now, at a less strategic level, military-grade autonomous drones can fly themselves to a specific location, pick their own targets and kill without the assistance of a remote human operator. A UN report about a March 2020 skirmish in the military conflict in Libya records that such a drone made its wartime debut. The report states that retreating forces

“were subsequently hunted down and remotely engaged by the unmanned combat aerial vehicles”,

but does not say explicitly that this lethal autonomous weapon system killed anyone. But it certainly tried to.

The very real fear is that autonomous weapons will undermine the international laws of war. These laws are premised on the idea that people can be held accountable for their actions even during wartime and that the right to kill during combat does not confer the right to murder civilians. But how can autonomous weapons be held accountable? Who is to blame for a robot that commits war crimes? Who would be put on trial: the weapon, the soldier, the soldier’s commanders, the corporation that made the weapon, or the person who wrote the code that gave the weapon the ability to do this?

In a world without regulations that compel meaningful human control of autonomous weapons, there will be war crimes with no war criminals to hold accountable, and the laws of war, along with their deterrent value, will be weakened significantly. I say “deterrent value” because I think, from my experience, that the laws of war and international humanitarian laws work because they are observed, not because they are enforced. It is important that we find some way of collectively reviewing these laws so that they can continue to be observed in this more complicated—and, in many ways, terrifying—new world that we are moving rapidly into.

On 21 October 2021, NATO Defence Ministers agreed to NATO’s first ever strategy for artificial intelligence—AI—which states:

“At the forefront of this Strategy lie the NATO Principles of Responsible Use for AI in Defence, which will help steer our transatlantic efforts in accordance with our values, norms, and international law. The NATO Principles of Responsible Use … are based on existing and widely accepted ethical, legal, and policy commitments under which NATO has historically operated and will continue to operate. These Principles do not affect or supersede existing obligations and commitments, both national and international.”


Our Government must have agreed to these principles. When will the Minister make a Statement to Parliament on them, allow them to be debated and allow Ministers to be questioned on their sufficiency or their breadth and depth? The provisions of Article 36 of Protocol 1, additional to the 1949 Geneva conventions, commit states, including our own, to ensure the legality of all new weapons, means and methods of warfare by subjecting them to a rigorous and multidisciplinary review. I have no reason to believe that we have not complied with our legal obligations in that respect but, unfortunately, as we are not one of the only eight nations in the world, including the United States of America, that publish a review of legal compatibility, I have had no ministerial reassurance in that regard. When will we get that assurance or transparency?

--- Later in debate ---
Baroness Goldie (Con)

My Lords, I have added to my choreography before standing at the Dispatch Box: can I get a Polo mint in before the noble Lord, Lord Coaker, concludes? The answer is no. That is the first question I am able to answer.

I thank the noble Lord, Lord Browne, for tabling Amendment 59, which is supported by the noble Lord, Lord Clement-Jones, and the noble and gallant Lords, Lord Houghton and Lord Craig, and engages with the subject of novel technologies. It is a significant issue that merits discussion, and I am grateful to the noble Lord for his kind remarks.

There is no doubt that the increasing adoption of innovative technologies is changing how military operations are conducted. The noble Lords’ analysis—that we need to be particularly mindful of the legal ramifications—is hard to dispute. From the engagement that I and the department have had with the noble Lords, I know that they understand very well the broader complexities likely to be created by Defence use of AI and are anxious that we should address these issues both purposefully and systematically. This scrutiny and challenge is welcome, because we are grappling with questions and subjects that are indeed very complex.

I hope to reassure your Lordships that the department is alert to these issues and has worked extensively on them over the course of the last 18 months. Noble Lords will understand that I cannot set out details until these positions have been finalised, but work to set a clear direction of travel for defence AI, underpinned by proper policy and governance frameworks, has reached an advanced stage. Key to this is the defence AI strategy, which we hope to publish in the coming months, along with details of the approaches we will use when adopting and using AI. This commitment, which is included in the National AI Strategy, reflects the Government’s broader commitment that the public sector should set an example through how it governs its own use of the technology. Taken together, we intend that these various publications will give a much clearer picture than is currently available, because we recognise that these are important issues that attract a great deal of interest, and we need to be as transparent and engaged as possible.

Noble Lords asked pertinent questions. I think the noble and gallant Lord, Lord Craig, asked some of these: where in the chain of command does responsibility for AI-related outcomes reside? When might the Government have an obligation to use AI to protect service personnel from harm? What are the military and moral consequences of machine-speed warfare? These are vital questions, and we recognise that we do not yet have all the answers.

Nor can we hope to arrive at these answers on our own. We have to persist in our engagement with our international partners and allies, and with our own public and civil society. It is perfectly legitimate for parliamentarians to take an interest in this subject, to ask questions and to table debates. I hope that our forthcoming publications will provide a solid platform for an ongoing effort of public engagement and efforts to enhance public understanding, subject to the usual caveats that may apply to the release of Defence information.

To turn to the subject of the proposed amendment, we are committed to ensuring that our Armed Forces personnel have the best possible care and protection, including protection against spurious legal challenges. I assure noble Lords that, regardless of the technologies employed, all new military capabilities are subject to a rigorous review process for compliance with international humanitarian law. Furthermore, we adjust our operating procedures to ensure that we stay within the boundaries of the law that applies at the time.

International and domestic frameworks provide the same level of protection around the use of novel technologies as for conventional systems because their general principle is to focus on the action, rather than the tool. These frameworks therefore offer appropriate levels of protection for our personnel. Earlier this year, we acted to bolster this protection in historical cases, for example, through the overseas operations Act.

In respect of artificial intelligence, I have mentioned our forthcoming AI strategy and our plan to publish details of the approaches we will use when adopting and using AI. This is really where we come to the nub of the issue. The noble Lord, Lord Browne, put his finger on it, as did the noble and gallant Lord, Lord Houghton, and the noble Lord, Lord Coaker. I want to try to encapsulate what I hope will be a substantive and reassuring response to them all.

These approaches will not affect or supersede existing legal obligations, but they will ensure coherence across defence. They will also drive the creation of the policy frameworks and systems that, in practical terms, are needed to ensure that personnel researching, developing, delivering and operating AI-enabled systems have an appropriate understanding of those systems and can work with and alongside them in compliance with our various legal and policy frameworks.

The noble Lord, Lord Browne, specifically referred to the NATO AI principles. Essentially, NATO’s position is that alliance members can sign up to these NATO-wide standards or they can produce their own to a similar standard. We support NATO’s leadership in the responsible use of artificial intelligence and, as I have indicated, we intend to publish details of our own approach in early course.

In addition, we will continue to engage internationally, including through the United Nations Conference on Certain Conventional Weapons, to promote consensus on international norms and standards for the use of new and emerging technologies on the battlefield, while continuing to act as a responsible leader in this area.

I think it was the noble Baroness, Lady Smith, who asked about the phrasing I used in response to her noble friend Lord Clement-Jones’s question last week. From memory, I said two things: first, the UK has no systems that could unilaterally employ lethal force without human involvement at some stage in the process. I think that I went on to say that, sharing the concerns of government, civil society and AI experts around the world, the UK opposes the creation and use of systems that would operate without context-appropriate human involvement. I think that is the phrase the noble Baroness sought clarification on.

The phrase means that a person is exercising some form of control over the effect of the use of the weapon in a way that satisfies international humanitarian law. This could be some form of control over the operation in real time, or it could be setting clear operational parameters for a system. I hope that that has been helpful to the noble Baroness in explaining what was behind the use of that phrase.

I have endeavoured to provide reassurance to noble Lords that the Ministry of Defence takes these matters very seriously, is already doing all that needs to be done, and is planning to be proactive in communicating our approach appropriately to Parliament and the public. On this basis, I suggest that the amendment is not needed.

I also say, with the greatest respect to the noble Lord, Lord Browne, and no sense of impertinence, that I do question the utility of requiring a review and a report. This will necessarily be only a snapshot; it will quickly become out of date when we are dealing with a rapidly evolving subject matter. Not to put too fine a point on it, the effort of staffing it risks reducing the capacity needed within the department for developing the extensive systems and frameworks that we need to ensure the proper handling of AI.

I must say that I have enjoyed this debate, as I always enjoy my engagement with the noble Lord, Lord Browne—but, for these reasons, I ask that he withdraw his amendment.

Lord Browne of Ladyton (Lab)

I thank the Minister for her response to this debate and, with the indulgence of the Committee, I will refer to parts of her response. I was greatly appreciative of it all, but some parts I welcomed more than others.

I will start with the last point. The criticisms the Minister made of the vehicle that I tabled in order to have this debate were correct. It is implicit in the way I debate these issues that they are moving so fast that there is probably no point in time at which we could publish a report that would not quickly go out of date. I accept that. In fact, for that reason I wish that people, and sometimes senior military officers—but thankfully no British ones—would stop talking about a “race” for this technology. A race requires a line, and the development of this technology has no winning line that we know of.

In fact, the likelihood is that when we move to AGI, which is a hypothetical but likely development, whereby an intelligent agent understands or learns any intellectual task that a human being can, it may well be that we think we are at the line, but the machine does not think we are at the line and runs on and looks back at us and laughs. So I accept all of that but, at some point, we need to find a framework in which we in Parliament can connect with these issues—a methodology for the Government to report to Parliament, to the extent that they can, and for all of us to take responsibility, as we should, for asking our young people to go into situations of conflict, with the possibility that these weapons will be used, with all the implications.

So that is what I am seeking to get. I want a 24 year-old who is asked to take some responsibility in an environment in which these weapons are deployed to know with confidence that he or she is acting within the law. That is my shared responsibility with the Government; we need to find a way of doing that. This may be an imperfect way, but we may always be in an imperfect situation with a moving target. So I thank all noble Lords for their contributions to this debate. None of these debates answers any questions fully, but they all add to our collective knowledge.

I thank the noble and gallant Lord, Lord Houghton, for his unqualified support. He took me slightly by surprise with the deployment of his eloquence to make the case for deploying the law as a weapon of war. I fear that I agree with him—I used to be a lawyer—but I will have to think long and carefully before I give him my unqualified support for that. However, I suspect that, as always, I will end up supporting what he said.

--- Later in debate ---
Lord Browne of Ladyton (Lab)

My Lords, I shall speak to both amendments. I thank my noble friend Lady Massey of Darwen for tabling them. My noble friends Lady Massey and Lady Lister and I are doing our level best, in his absence, to do justice to our recently deceased, much-loved and greatly missed noble friend Lord Judd, who was a person of the greatest integrity and enormous kindness, in the context of an issue which was very dear to his heart. But that is not why I want to speak to these amendments.

When I was Secretary of State for Defence, I attended passing-out parades for young recruits and, on occasion, spent time with the young recruits themselves and those who were training them. I invariably enjoyed a morning of meeting recruits, their families and the Army training and welfare staff. Among other matters, we talked about some of the social challenges that these young people faced. On each occasion—this was some time ago—I left with an overwhelming feeling that the Army offers many young people an accessible alternative at a time when some could quite easily drift down another path; a point which the noble Lord, Lord Lancaster, made repeatedly and which I think is not lost on your Lordships’ Committee.

Of course, the discussion was almost exclusively about how the Army had provided for these young people, often from very poor socioeconomic backgrounds, an opportunity to find meaning in their lives and to develop comradeship and interpersonal skills, as well as training them for a variety of trades—opportunities which may have been difficult for them to obtain otherwise. I admit all of that. I wish I had access then to the research I have now read because I would not have asked the young people these questions. I would have asked the people who were training them and responsible for them, and who had recruited them, many different questions. I now have access to this research, which, I regret, the noble Lord, Lord Lancaster, dismisses with a wave of his hand, saying that it clearly is being done by people who have a vested interest—as he does, of course.

Frankly, I have much experience of personal experiences which have been contradicted by the truth. I would, in the face of this peer-reviewed research, not be conceited enough to make the case that my short experience, which has never been peer reviewed or tested properly, was a better basis for public policy than that research. That is the point I want to make in this debate.

My attention has been drawn to the work of King’s College, which found that violent, sexual and drug-related offending increases after enlistment and then rises again before first deployment. My attention has been drawn to two recent studies by the University of Glasgow—my alma mater and hardly an institution which has some grudge against the Army or its practice of recruiting young people, but which has, like King’s College, an enviable academic record and an insistence that before any work is published it is properly and rigorously peer reviewed—which found that the mental health outcomes of junior entrants give further cause for concern. The Glasgow study found that PTSD among veterans who enlisted before 1995 was between two and three times more common than among civilians from the same social background.

In the face of these recent reports, it is hardly surprising that many people are calling for an end to the UK’s policy of permitting 16 year-olds to join the military, but I am asking for an urgent rethink. I press this upon the Minister. I will not rehearse all the many good arguments as to why this reconsideration ought to conclude with a termination of the policy, but my conclusion is that the case for consideration of raising the minimum age is comprehensive. It is built on medical evidence, sound logic and, much more importantly, ethical standards.

Beyond those recruited to the Army, adolescence is known to be a time when the brain and the ability to make well-reasoned decisions are still developing. Why would we ask young people to make a decision of this importance when their brain is still developing? Of course we ask young people to make all sorts of decisions that affect what they do in the rest of their life, but this is a very special decision because of what the Army does. It means that teenagers recruited to the Army are more likely to be acting on impulse than making a fully informed decision about their future. I say no more; I do not say that every one of them is, but they are more likely to be. That is enough to make me hesitate. It means that they are also less likely, although it is not impossible, to withstand the physical and emotional strains of military life and training. Young people who have experienced childhood adversity are also more likely to develop mental health problems in the Army.

There is credible research on all of this. The noble Lord, Lord Lancaster, invites each of us to visit a particular institution. I invite him to read the research with an open mind. I will be confounded if he does not come to the conclusion that there is a serious issue. One study found that three-quarters of military personnel have suffered two or more instances of childhood adversity and that factors such as younger age, lower educational attainment and serving in the Army were all linked with higher vulnerability to depression and anxiety. I understand that that might be because of what we ask these people to do and what we subject them to in order to keep us secure. That is their service to us and it has consequences for them. We have to ask ourselves, however: at what point in their maturity is it more likely that they will make the right decision to commit their lives to do that? All I ask is that we consider what that time is.

There are, of course, logical flaws in the policy of 16 year-olds joining the Army. It is inconsistent with other legal age limits. Supposedly 16 year-olds are not mature enough to vote, but they can still make life-changing decisions about their future. They cannot purchase knives but they can learn to use lethal weapons. Perhaps the greatest irony is that the sale of certain military video games is prohibited to under-18s. That is not at the heart of my argument, but there are these inconsistencies. This is not the only case where an age limit that we apply to an activity appears arbitrary and illogical.

In answer to the question from the noble Lord, Lord Lancaster, about what age we should choose: any age we choose is arbitrary because each of these young people—these children—is an individual. If we could find some way to measure their maturity and their ability to go through what they will go through, that would be a far greater way to decide whether they were ready to be recruited to the Army, but we cannot. It was tried and it proved to be ineffective.

Surely, if we are satisfied, on the incontrovertible evidence, that we are far less likely to expose young people who are not fit for this if we wait until they are 18, rather than recruiting them at 16, that is a very compelling reason for moving the age from 16 to 18. I am not suggesting that those arguments alone ought to convince the Government to go back on this policy; there are many others. But surely the time has come, now that we have this knowledge, to do what noble Lords in this Committee have repeatedly asked the Minister to do—to expand on the research until we can make the best judgment possible with what we have available to us. The preponderance of the evidence suggests that the conclusion should be to stop recruiting young people at scale into the Army at 16 years of age.

Lord Browne of Ladyton (Lab)

I apologise, but I had not finished—it was a dramatic pregnant pause that misled the noble Baroness.

Baroness Goldie (Con)

Your preface is a long one.

Lord Browne of Ladyton (Lab)

It is not a preface. I want to turn to Amendment 62, for a couple of paragraphs. The amendment would ensure that

“soldiers aged under 18 are not required to serve for a longer period than adult personnel.”

In my view, the amendment addresses an issue that is simply wrong—we should not be keeping people who signed up at 16 in the Army for longer than people who signed up at 18, just because of their age. There is no justification for that discrimination, in my view. It is an abuse of their rights; they should be treated the same as everybody else, and we should simply get rid of this distinction. I have finished now.

--- Later in debate ---
In conclusion, our Armed Forces provide challenging and constructive education, training and employment opportunities for young people, as well as fulfilling and rewarding careers. I hope that, following the provision of further information, and following those assurances, the noble Baroness will agree to withdraw her amendment.
Lord Browne of Ladyton (Lab)

On Amendment 62, can the Minister answer this deceptively simple question? Why does the Army, in its regulations regarding the minimum service period, discriminate against younger recruits? On the issue of whether this is legal, I am not arguing that it is illegal—but will the Minister confirm for the record that the only reason why this discrimination, which would be unlawful in civilian life, is lawful is that the Armed Forces benefit from an exemption from the Equality Act 2010, which was put there to allow them to continue to discriminate?

Baroness Goldie (Con)

I think I can add nothing more to what I have already provided by way of an explanation for how that system works and why it is there, and why we do not believe that it is as discriminatory as the noble Lord indicates. However, I am happy to look at his remarks in Hansard and see whether I can provide him with a fuller response.

In conclusion, I thank your Lordships for all their contributions. I genuinely thought that it was an extremely interesting debate, and I have welcomed the thoughts from contributors all around the Room.