The fourth session of the newly created Special Committee on Artificial Intelligence in Weapon Systems heard evidence from experts on ethical and human rights issues.
On April 27, 2023, the fourth evidence session of the Artificial Intelligence Weapons inquiry took place. The focus was on gathering evidence regarding human rights and ethical concerns related to autonomous weapon systems. The session included Verity Coyle, Senior Campaigner and Advisor from Amnesty International, a member of the global Stop Killer Robots campaign; Mariarosaria Taddeo, Associate Professor and Senior Research Fellow at the Oxford Internet Institute, University of Oxford; and Alexander Blanchard, Digital Ethics Research Fellow at the Alan Turing Institute.
This marked the first participation of UK-based members of Stop Killer Robots in this forum, advocating for an international treaty to regulate and establish limits on the use of autonomous weapon systems. During the public evidence session, the expert witnesses provided their insights on the technical, legal, moral, and societal concerns surrounding autonomous weapons and advocated for action to address this significant emerging security threat.
The UK Campaign to Stop Killer Robots hopes the Committee’s report will reflect the compelling arguments put forth, which highlight the need for strict legislation to rein in the potential harm that AWS could cause. The most effective way to do this will be for the UK to work with the international community to urgently negotiate a binding legal framework, including prohibitions and positive obligations to regulate these weapons.
Some key points participants raised in their evidence included:
Verity Coyle, Senior Campaigner and Advisor from Amnesty International, member of the UK Campaign to Stop Killer Robots:
“The use of AWS, whether in armed conflict or in peacetime, implicates and threatens to undermine fundamental elements of international human rights law, including the right to life, the right to remedy, and the principle of human dignity. The threat to human dignity posed by delegating life and death determinations also presents ethical problems and raises concerns under the principle of humanity under IHL’s Martens clause.”
Mariarosaria Taddeo, Associate Professor and Senior Research Fellow at the Oxford Internet Institute, University of Oxford:
“Autonomous weapon systems raise other serious human rights concerns outside of situations of armed conflict, threatening the right to life, the prohibition of torture and other cruel, inhumane or degrading treatment or punishment, and the right to security of person. We are currently undergoing a technological revolution in many fields driven by AI, machine learning, miniaturization, and automation.”
Alexander Blanchard, Digital Ethics Research Fellow at the Alan Turing Institute:
“In terms of whether the systems have been used, the United Nations identified, in a report, Kargu-2 by STM as the first use of an autonomous weapons system. As Verity also highlights, the problem is that it is not clear whether these are being used in an autonomous setting. What tends to happen is that an arms manufacturer creates a system and markets it as fully autonomous, because this is good for sales. It gets used and draws publicity, and the manufacturer then changes the specifications that it announced around that system to say, “This isn’t actually a fully autonomous system. It still has levels of human control”.
There is a real question there not only of definition but of transparency of the use. I am not sure how you address that, but it is about knowing whether this system was indeed used as a fully autonomous, or autonomous, weapons system.”
The recorded session is available here:
The inquiry accepted written evidence until the extended deadline of 4.00 pm on Monday, 8 May 2023.
As UK Campaign to Stop Killer Robots (UKCSKR) members, we work to secure the UK Government’s support for an international treaty that:
a) Prohibits autonomous weapons that cannot be meaningfully controlled and those that target humans
b) Regulates other autonomous weapons to ensure meaningful human control over the use of force
We are a member of the global Stop Killer Robots campaign – a coalition of over 180 organisations working across more than 60 countries.