Committee launches inquiry on the use of AI in weapons systems
Call for evidence focuses on artificial intelligence in the defence sector
The Artificial Intelligence in Weapons Systems Select Committee published a call for evidence on 6 March, which aims to gather the public’s views on the use of artificial intelligence (AI) in weapons systems.
The Committee was appointed on 31 January 2023 specifically to consider the use of AI in weapons systems and to advise the government on how autonomous weapons systems (AWS) should be developed and utilised.
AWS, also known as lethal autonomous weapons systems (LAWS), are weapons systems that can select, detect and engage targets with little to no human intervention. Such systems range from fully autonomous weapons that can operate without any human involvement to semi-autonomous weapons that require human action to launch an attack.
The Committee cites advances in robotics and digital technologies, as well as ethical concerns surrounding AWS, as the reasons for the inquiry. These concerns include compliance with international humanitarian law, whether such systems can be used reliably and safely, and whether there is a risk that they might cause wars to escalate more quickly.
The Committee’s inquiry will focus on a range of issues associated with AWS, including the challenges, risks and benefits of AWS; the technical, legal and ethical safeguards that are needed to ensure that they are used safely, reliably and accountably; and the sufficiency of current UK policy and the state of international policymaking on AWS. According to the Committee, international policymaking concerning AWS has mostly been directed towards restricting their use, either through an outright prohibition or specific limitations.
Chair of the Committee on Artificial Intelligence in Weapons Systems, Lord Lisvane, said: “Artificial intelligence features in many areas of life, including armed conflict. One of the most controversial uses of AI in defence is the creation of autonomous weapon systems that can select and engage a target without the direct control or supervision of a human operator. We plan to examine the concerns that have arisen about the ethics of these systems, what are the practicalities of their use, whether they risk escalating wars more quickly, and their compliance with international humanitarian law. Our work relies on the input of a wide range of individuals and is most effective when it is informed by as diverse a range of perspectives and experiences as possible. We are inviting all those with views on this pressing and critical issue, including both experts and non-experts, to respond to our call for evidence by 10 April 2023.”