An investigation into the role of UK universities in the development of autonomous weapons systems.

Government investment in technology relevant to the creation of autonomous weapons systems (AWS) is booming. AWS could reshape how wars are fought, but the ethical, legal and practical challenges posed by the technology make it highly controversial. The UN Secretary-General has declared that machines with the power and discretion to take lives without human involvement are “politically unacceptable, morally repugnant and should be prohibited by international law”.

This report presents the findings of a UK Campaign to Stop Killer Robots investigation into the role of UK university research in aiding the development of AWS. While the authors acknowledge that a vast range of technology could contribute indirectly to AWS, the study uncovers a much more direct relationship between the research being undertaken, its funding sources, and the risk that its outputs may be incorporated into AWS.

The study examined 13 institutions that have received significant funding from the UK’s Ministry of Defence and that have links with the developers of AWS-relevant technology – typically UK-based military or arms manufacturers – and found 65 research programmes whose outputs are at risk of either being incorporated into AWS or facilitating their development, even if indirectly.

More broadly, the investigation found a disturbingly close relationship between the defence sector and UK universities; inadequacies and contradictions in universities’ ethics frameworks; a lack of safeguards against the end-use and dual-use risks associated with AWS-relevant technology; and a concerning absence of transparency, accountability and ethical consideration around research with potential military applications.
