Queen’s Speech Debate
Lord Browne of Ladyton (Labour - Life peer)
Lords Chamber

My Lords, in less than three months of fighting, Russia has lost one-third of the ground combat force it committed to the invasion of Ukraine. Every wrecked Russian tank—taken out by light anti-tank weapons that require minimal training, are unslung from the shoulder in seconds and are deadly accurate—is further evidence that traditional armies can no longer expect to dominate simply because they have more troops, weapons and money. As weapons have become smaller, more effective and more widely dispersed, it has become harder and harder for traditional militaries to achieve their aims through brute force, as they meet resistance at every turn. Resistance in this case includes information warfare, hacking and cyber-attacks, as well as social media, at which President Zelensky excels, casting the conflict in terms of good and evil and projecting an aura of invincibility.
Our own experience in Iraq and Afghanistan underscores the reality of contemporary warfare: that invasion and occupation are more expensive and temporary than they are quick and permanent. As always, the future belongs to those who embrace it and, in this context, to those who empower the decentralisation of weapons technology, information currency and individual ingenuity and courage.
Less has been said about the use of artificial intelligence in the Ukraine war than about anti-tank missiles, but in April a senior Defense Department official said that the Pentagon is quietly using AI and machine-learning tools to analyse vast amounts of data, generate useful battlefield intelligence and learn about Russian tactics and strategy. Just how much of this the US is passing to Ukraine is a matter of conjecture, and I shall not speculate. A powerful Russian drone with AI capabilities has been spotted in Ukraine. Meanwhile, Ukraine has itself employed a controversial facial recognition technology. Vice Prime Minister Fedorov told Reuters that Ukraine had been using Clearview AI—software that uses facial recognition—to discover the social media profiles of deceased Russian soldiers, which the authorities then use to notify their relatives and offer arrangements for their bodies to be recovered.
If the technology can be used to identify live as well as dead enemy soldiers, it could also be incorporated into systems that use automated decision-making to direct lethal force. This is not a remote possibility; last year, the UN reported that an autonomous drone had killed people in Libya in 2020. There are unconfirmed reports of autonomous weapons already being used in Ukraine. We are seeing a rapid trend towards increasing autonomy in weapons systems. AI and computational methods are allowing machines to make more and more decisions themselves.
Our Government see AI as playing an important role in the future of warfighting. The integrated review, presenting AI and other scientific advances as “battle-winning technologies”, set out their priority for
“identifying, funding, developing and deploying new technologies and capabilities faster than our potential adversaries”.
There is an urgent need for strategic leadership by government and for scrutiny by Parliament, as AI plays an increasing role in the changing landscape of war. We need UK leadership to establish, domestically and internationally, when it is ethically and legally appropriate to delegate to a machine autonomous decision-making about when to take an individual’s life.
The development of LAWS is not inevitable, and an international legal instrument would play a major role in controlling their use. In the absence of an international ban, it is inevitable that these weapons will eventually be used against UK citizens or soldiers. Advocating international regulation does not mean abandoning the military potential of new technology; regulation of AWS is needed to give our industry the guidance to be a sci-tech superpower without undermining our security and values. Weapons that are not aligned with our values must not be used to defend our values. We should not be asking our honourable service personnel to use immoral weapons.
The war in Ukraine has brought home the tragic human consequences of ongoing conflict. The use of LAWS in future conflicts and the lack of clear accountability for the decisions made pose serious complications and challenges for post-conflict resolution and peacebuilding. The way in which these weapons might be used and the human rights challenges they present are novel and unknown; the existing laws of war were not designed to cope with such situations and, on their own, are not enough to control the use of future autonomous weapons systems.
The integrated review pledged to
“publish a Defence AI strategy”.
More than a year later, there is still no sign of it. The Government’s delay in publishing the strategy while the technology is outpacing us means that the UK is unprepared to deal with the ethical, legal and practical challenges presented by autonomous weapons systems today.