UK Campaign to Stop Killer Robots in Universities


This workstream seeks to identify the nature and scope of research on autonomous systems and AI taking place in UK universities, and to assess the risk that such work could contribute to the development of autonomous weapons systems. It also aims to ensure that researchers are fully aware of the intended and unintended potential end-uses of their technology and understand the possible implications of their work, enabling open discussion of any related concerns. Finally, the UK Campaign works to encourage universities to establish clear policies to guide such research and to guarantee that the university will not contribute to the development or production of autonomous weapons systems.

UK CSKR at Uni: background

British universities play a key role in the government’s pursuit of autonomous systems: academics and research departments maintain numerous well-established research projects with the Ministry of Defence and private arms companies. Yet many institutions have failed to put in place the safeguards needed to ensure that the technology they produce will not be incorporated into autonomous weapons systems or used in harmful ways.

Against this background, in 2020 the UK Campaign to Stop Killer Robots launched a university workstream. Led by students across UK universities, the initiative raises awareness of the absence of transparency, accountability and substantive ethical consideration around research with potential military applications. It also calls on UK academic institutions to address gaps and inadequacies in their ethics frameworks, and to establish adequate safeguards against the end-user and dual-use risks associated with AWS-relevant technology developed at their premises.

Awareness-raising actions

The campaign conducts a series of actions, including seminars and roundtables, to raise awareness of the links between UK universities and the development of autonomous systems. Successful events have taken place at the universities of Edinburgh, Manchester, Strathclyde and Bristol.

We have also held screenings of the ‘Immoral Code’ documentary, which examines the impact of killer robots in an increasingly automated world, at various universities, including Newcastle, Cambridge and Warwick.

Our report

Between July 2021 and March 2022, a network of student fellows, supported by a team of experts, conducted an investigation into UK universities’ research programmes with potential implications for the development of autonomous weapons systems.

The report, published in September 2022, found:

  • At least 65 recent and ongoing projects in the fields of sensor technology, AI, robotics, mathematical modelling and human-machine pairing. Of these, we judged 17 to pose a higher risk of use in the development of autonomous weaponry in the absence of safeguarding policies.
  • In some of the institutions investigated, technology that could advance the development of AWS is produced via spinout companies created by university academics, which develop AWS-relevant technologies directly in partnership with defence bodies.
  • While often not exposed to ethics modules, students at some of our target institutions are enlisted in activities and initiatives linked to manufacturers of AWS-relevant technologies, including postgraduate curricula developed with defence-company advisors in AWS-relevant disciplines and associated recruitment channels to AWS developers.
  • There is a significant lack of transparency around AWS-related research and its funding sources.
  • Interviews with individual staff corroborated the need for institutional policies to safeguard universities’ research from being incorporated into AWS. Without such policies, staff deliberating over funding from AWS-related bodies are often forced to choose between their ethics (where they may support the regulation of, or a prohibition on, AWS) and the continuation of their work.

Students’ attempts to interact with universities

The UK Campaign’s student movement has repeatedly tried to engage with universities to press them to establish clear policies preventing their research from contributing to the development of autonomous weapons systems. We passed motions in Student Union councils, spoke to Ethical Affairs Officers, wrote open letters, created petitions and publicised our concerns in university newspapers (e.g. The Rattlecap and Varsity Cambridge). However, there has been a general lack of willingness to engage or to take action to address our concerns.

Our campaign’s Advocacy Letter

On 12 October 2022, we sent an advocacy letter to the Vice-Chancellors/Principals of 13 UK academic institutions, calling on them to develop safeguards against the risk that their university’s research will be misused, including by taking the following modest steps:

  • Making a public statement outlining their opposition to autonomous weapons systems that operate without meaningful human control or that target people, and committing to steps that minimise the risk that research carried out at their institution could unintentionally facilitate the creation or use of these weapons;
  • Signing the well-established Future of Life Pledge, which is endorsed by an increasing number of UK and international institutes, universities, tech leaders and academics;
  • Increasing the transparency of their funding sources and research projects, particularly those in controversial areas such as partnerships with military or defence-related actors;
  • Reviewing their ethics frameworks and policies to ensure that research with possibly controversial dual-use implications is rigorously scrutinised and that measures are taken to increase awareness of the risks around autonomous weapons systems.

As demonstrated in the table below, which lists the universities we tried to correspond with, our concerns have gone largely unaddressed.

Only three universities responded to our letters. While these responses recognised the potential risks of dual-use technologies, we were not reassured that the mechanisms already in place are sufficient to assess or mitigate the risks associated with the research proposals these universities consider. None of the universities contacted committed to following our recommendations, such as endorsing the Future of Life Pledge, which calls for stronger norms and regulations on autonomy in weapons systems; taking steps to increase the transparency of their funding sources in partnerships with military or defence-related actors; or reviewing their ethics frameworks and policies to ensure that research with dual-use implications is rigorously scrutinised.

We will continue to seek opportunities for dialogue while supporting student campaigning on campus aimed at ensuring that university research takes place in a more transparent and safer environment.
