This article was originally published on SGR.

A new report has uncovered the “disturbingly close” relationship between the military-industrial sector and UK higher education, especially in areas related to Autonomous Weapons Systems. Nico Edwards, Sussex University, summarises the findings.


After years of global campaigning, on 21 October 2022, 70 states delivered the first-ever joint statement on Autonomous Weapons Systems (AWS) – better known as killer robots – to the UN General Assembly. [1] The statement recognises AWS as a serious humanitarian, ethical, legal and security threat, emphasising the need to maintain human control over, and accountability for, the use of force. Still, military powers like the USA, Russia, China, Israel and the UK continue to block UN talks on international legislation to regulate the development and use of killer robots. [2] The use of so-called loitering munitions and missiles in the ongoing war in Ukraine is complicating the issue further: both Russia and Ukraine have reportedly deployed drones with a variety of autonomous functions, such as artificial intelligence for visual identification, recognition and targeting, or autonomous flight and navigation capabilities. [3] While global civil society highlights this as immediate proof of the need for an international ban, many states instead see it as a reason to continue or even increase investment in AWS. The current Europe-wide explosion in military spending is likely to entrench this logic further. [4]

Against this background, the UK branch of the Campaign to Stop Killer Robots (CSKR) – to which Scientists for Global Responsibility is affiliated – published a report [5] documenting the vital role played by universities in facilitating the UK Government’s pursuit of killer robots. It captures how such state encouragement of autonomy in weapons systems undermines Britain’s stated commitment to humanitarian protection and its respect for the ethics of both war and research.

Since 2020, the CSKR has supported student groups across the country working to increase student and staff awareness of AWS and their links with university settings. This led to a comprehensive investigation into the nature and scope of 13 UK universities’ [6] research on artificial intelligence (AI) and autonomous systems, evaluating the risk that such research could be used in the production of AWS. The report identified 65 research programmes – relating to sensor technology, AI, robotics, mathematical modelling and human-machine pairing – at risk of being directly incorporated into AWS or of indirectly enabling their development. On a scale of lower, medium and higher risk, 17 of these were judged higher risk: the purpose of the research being to contribute directly to the production of weaponry or the development of military capabilities, such as targeting and swarming.

This AI-related military-university nexus manifests itself not only through long-running research partnerships between individual academics, university departments and the Ministry of Defence (MOD), but also through arms companies directly funding research programmes (military and dual-use) and providing students with internships and recruitment opportunities. Cranfield University offers an illustrative example, taking pride in its role as a “trusted and influential partner to the UK [MOD] for over 30 years”, having worked on “classified projects for government and industry”. [7] The university’s comprehensive track record of past and present military collaborations confirms this, with a focus on developing autonomy in aerospace, such as through DARTeC [8] – a research centre receiving funding from Boeing UK, Thales, Aveillant, Saab and Raytheon UK. Notably, Cranfield is one of BAE Systems’ ‘Strategic University Partners’, and the two are jointly developing a Master’s degree in Applied Artificial Intelligence – a relationship that further strengthens the influence of military manufacturers over curricula, research agendas and thinking around ethics in the university environment. Despite the deep roots of this nexus, the report shows that university boards obstruct transparency, particularly around research programmes with links to the military-industrial sector.

State-sanctioned evasion of ethics

The report makes an important point by adopting a wide definition of the kinds of dual-use technologies at risk of ending up in military settings, broadening the number of research settings – and so the number of researchers – that should be protected by ethical safeguards. In contrast, the investigation confirms the extent to which the targeted universities lack robust, university-wide ethics policies with which to properly scrutinise the potential end-uses of AWS-relevant research, and evidences a near-total lack of openness around the nature of AWS-related research and its funding sources.

Interviewing students and staff, CSKR found that researchers are often made to choose between their ethics – such as personal support for national and international regulation of AWS – and the continuation of their projects. Equally often, researchers admitted to being unaware of the risks associated with dual-use technologies, particularly regarding their potential use in the development of AWS. As the investigators note, this substantiates the worrying reality “that the absence of adequate safeguarding policies place a disproportionate burden on individual academics to assess the moral and practical implications of their work.” [9] This, in turn, helps further remove university boards – as well as MOD and arms industry staff – from the accountability that should accompany research with possible military applications.

The report also observed that all of the publicly funded target institutions specialising in sensitive technology for military purposes perpetuate a “complacent and permissive research environment”, significantly increasing the risk that “the proceeds of UK publicly funded research could be incorporated into AWS.” [10] Notably, even where a university hosts research on the ethics of AI and AWS – like Bristol’s Robot Ethics or Southampton’s DRONETHICS programmes – these run in parallel with projects at high risk of facilitating the development of killer robots.

To follow up on the report’s findings, the CSKR has attempted myriad ways of engaging each university board – to little avail. Whether passing motions in Student Union councils, raising awareness among Ethical Affairs Officers, writing open letters, [11] creating petitions [12] or discussing the findings in university newspapers, [13] the student movement has been met either with silence or with referrals to existing ethical frameworks – adherence to which has minor impact at best, as the report makes clear. In an advocacy letter sent in October to the Vice-Chancellors or Principals of the 13 institutions, the campaign offered examples of modest actions each university could take to start developing proper safeguards against the misuse of research with potential military applications. Actions range from improving transparency around research aims and funding, to developing rigorous ethical frameworks and training for scrutinising dual-use projects, to signing the Future of Life Pledge. [14] The few responses received to date are telling of the complacent research environment seemingly favoured by university boards at the expense of their commitment to socially responsible research.

UK universities at a crossroads: humanitarianism or militarism?

Against the ongoing push to use the British military to tackle all sorts of non-military issues – from healthcare to migration to climate change – the CSKR report’s findings carry an essential message: that an “over-representation of military interests in the university sector is problematic because it distorts focus away from humanitarian aims and facilitates the use of force in addressing global conflicts.” [15] As corporate and government influence over university research grows, UK researchers’ control over both research directions and end-use shrinks, and with it their ability to pursue socially responsible careers. In the particular context of AWS-relevant research, this erosion of commitment to ethics and humanitarian concerns has life-and-death consequences far beyond Britain’s borders. But universities’ reliance on military grants also has consequences for lived (in)security in Britain, incentivising ever more research into the means of war instead of helping to resolve the many social crises facing the British people. [16]

UK universities still have the chance to ensure the independence, transparency and accountability of British research, and its dedication to ethics, by offering researchers opportunities to put their skills to use for science and technology with responsible rather than violent applications. Sadly, in a country that prides itself on the independence and excellence of its research institutions, CSKR’s work instead shines a light on the secretive relations between government and higher education, favouring the monetary interests of a military-state-industrial complex and the political daydreaming of British military grandeur [17] over more ethical forms of scientific research. As such, universities risk aiding and abetting Britain’s continued international legacy as a state with a well-honeyed tongue and bloodied hands: a state willing to go to war in the name of human rights and humanitarian protection while simultaneously undermining international law and the global movements working for their preservation.

Nico Edwards is a PhD student at the School of Global Studies, Sussex University. She is also SGR’s liaison to the UK branch of the Campaign to Stop Killer Robots.

References

[1] CSKR (2022). https://www.stopkillerrobots.org/news/70-states-deliver-joint-statement-on-autonomous-weapons-systems-at-un-general-assembly/ 

[2] Al Jazeera (2021). https://www.aljazeera.com/news/2021/12/18/un-talks-fail-to-open-negotiations-on-killer-robots

[3] Automated Decision Research (2022). https://automatedresearch.org/news/weapons-systems-with-autonomous-functions-used-in-ukraine/; Trager R, Luca L (2022). https://foreignpolicy.com/2022/05/11/killer-robots-lethal-autonomous-weapons-systems-ukraine-libya-regulation/

[4] Breaking Defense (2022). https://breakingdefense.com/2022/03/seven-european-nations-have-increased-defense-budgets-in-one-month-who-will-be-next/; European Defence Agency (2022). https://eda.europa.eu/news-and-events/news/2022/12/08/european-defence-spending-surpasses-200-billion-for-first-time-driven-by-record-defence-investments-in-2021; Qureshi M (2022). https://demilitarize.org.uk/2-trillion-for-war-versus-100-billion-to-save-the-planet/ 

[5] Griffiths E, Sánchez A, Ventura E, Curran-Ross E, Richards J, Sobral M (2022). An investigation into the role of UK universities in the development of autonomous weapons systems. UK Campaign to Stop Killer Robots. p.4 https://una.org.uk/sites/default/files/2022-09/stop_killer_robots_in_uk_universities_-_report_and_annex_-_20_sept_2022.pdf

[6] The Alan Turing Institute, Cranfield University, Imperial College London, University College London, University of Birmingham, University of Bristol, University of Cambridge, University of Edinburgh, University of Manchester, University of Oxford, University of Southampton, University of Strathclyde, University of Warwick. The universities were chosen based on their receiving significant MOD funding and their direct ties to British military technology manufacturers.

[7] Griffiths et al. (2022), p.30.

[8] Digital Aviation Research and Technology Centre, see: https://www.cranfield.ac.uk/centres/digital-aviation-research-and-technology-centre; https://www.thalesgroup.com/en/taxonomy/term/7822

[9] Griffiths et al. (2022), p.24.

[10] Griffiths et al. (2022), p.4.

[11] Gunde K (2021). https://felixonline.co.uk/issue/1786/comment/autonomous-killer-robots-be-very-afraid

[12] Future of Life Institute (2018). https://futureoflife.org/open-letter/lethal-autonomous-weapons-pledge/. See also: https://www.change.org/p/petition-for-warwick-university-to-sign-the-future-of-life-pledge?redirect=false 

[13] The Rattle Cap (2022). https://www.therattlecap.com/post/stop-killer-robots; Varsity (2022). https://www.varsity.co.uk/news/24350

[14] Conn A (2018). https://futureoflife.org/aws/laws-pledge/

[15] Griffiths et al. (2022).

[16] Peace Pledge Union (2022). https://www.ppu.org.uk/news/ministers-face-protests-over-military-spending-during-cost-living-crisis 

[17] For examples of such policy thinking, see these MOD strategies published in 2021:
Climate Change and Sustainability: Strategic Approach:  https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/973707/20210326_Climate_Change_Sust_Strategy_v1.pdf;
Defence and Security Industrial Strategy: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/971983/Defence_and_Security_Industrial_Strategy_-_FINAL.pdf;
Defence in a Competitive Age: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/974661/CP411_-Defence_Command_Plan.pdf