
At a recent Westminster roundtable, parliamentarians, civil society leaders and legal experts gathered to examine the UK’s position on autonomous weapons systems (AWS) in the wake of the 2025 Strategic Defence Review (SDR). The event, chaired by Labour MP Markus Campbell-Savours and co-hosted by the APPG for Human Rights and the Stop Killer Robots campaign, highlighted a growing cross-party and cross-sector consensus on the need for greater regulation and transparency.
The session opened with a clear challenge from Markus Campbell-Savours MP: while the UK acknowledges that AWS will play a role in future military strategy, it has so far failed to fully engage with the legal and ethical risks. The SDR states that “The UK’s competitors are unlikely to adhere to common ethical standards” in developing such weapons but does not outline what steps the UK will take to ensure it does.
The Minister for the Armed Forces, Luke Pollard MP, has asserted that “compliance with international humanitarian law is absolutely essential” and that the UK is committed to upholding “the values of the society that we serve” through international dialogue on lethal autonomy (Hansard, 30 June 2025). However, panellists at the event argued that meaningful action and stronger international leadership were urgently required.
Defining Autonomy in Modern Warfare
Noah Sylvia from the Royal United Services Institute (RUSI) outlined the operational complexity of modern warfare, where AI-enabled systems, uncrewed vehicles and autonomous technologies function as part of an “integrated force” rather than in isolation. He cautioned that the boundaries of autonomy are often unclear, with decision-making sometimes distributed across systems from onboard software to cloud-based data processing.
Sylvia highlighted the MOD’s current ethical guidance, Joint Service Publication 936, which addresses the use of AI in defence. While it frames this use in terms of compliance with international humanitarian law, it has been criticised for paying insufficient attention to civilian protection and for limited transparency regarding its consultation processes.
Civilian Protection and Human Rights
Andy Bailey from the UK arm of the Stop Killer Robots campaign warned of the increasing risk AWS pose to civilians, particularly in systems where facial recognition or social scoring is used to identify targets. He cited the reported use of systems such as “Gospel” and “Lavender”, which utilise personal data to assign “strike scores” to individuals, as examples of how machine-led decisions could result in unlawful harm to civilians.
Bailey argued that such capabilities must be subject to strict regulation and that the UK risks falling behind global best practice. “We need the kind of proactive leadership seen in previous arms control efforts, such as the landmine and cluster munitions treaties,” he said.
Bonnie Docherty, of Human Rights Watch and Harvard Law School, presented findings from her recent co-authored report, A Hazard to Human Rights: Autonomous Weapons Systems and Digital Decision-Making. She emphasised that machines lack the human judgement necessary to apply core legal principles such as proportionality, distinction and precaution in attack.
“Without meaningful human control, these systems cannot meet existing legal standards,” Docherty said. “The UK should support a legally binding international treaty to address these gaps.”
Legal Frameworks and the UN Process
There was broad consensus among panellists that current frameworks—particularly the Convention on Certain Conventional Weapons (CCW)—are inadequate. As Bailey noted, the CCW has made little progress due to consistent blocking by certain states. The return of AWS discussions to the UN General Assembly has created a new opportunity for action, with over 120 states now backing treaty negotiations.
The UK’s position remains that the CCW is sufficient, even though the convention does not explicitly address AWS. Several speakers urged the UK to reconsider and align with countries like Austria, Ireland, Mexico and Costa Rica, which have emerged as leaders in multilateral negotiations.
The Importance of Clear Terminology
Much of the discussion focused on the need for consistent and clear definitions. Phrases such as “meaningful human control” and “lethality” are frequently used in defence policy but lack legal clarity. Rachael Cox of the International Committee of the Red Cross called for a more precise understanding of how terms like “lethality” are used in UK strategic documents.
Docherty proposed a principles-based approach to future-proofing legislation: rather than listing prohibited technologies, regulation should focus on maintaining human control, ensuring accountability, and preserving moral judgement in the use of force.
Bridging the Gap Between Civil Society and the Military
Participants also discussed how to foster dialogue between military leaders and international humanitarian law experts. Sylvia warned that many military commanders currently have limited engagement with legal and humanitarian communities. Bailey added that local engagement with MOD bases and through constituency channels could help open conversations.
Markus Campbell-Savours MP stressed the importance of parliamentary engagement and encouraged campaigners to write personally to MPs: “A handwritten or original email from a constituent is still one of the most effective ways to be heard,” he said.
Domestic Use and Future Risks
Concerns were raised over potential domestic applications of AWS in policing and surveillance. As Docherty noted, the same systems that threaten civilians in armed conflict could pose serious risks to civil liberties in peacetime if used by law enforcement without robust safeguards.
The question of how to regulate “human in the loop” scenarios, where an operator has only seconds to approve a target, was also debated. All panellists agreed that such cases fail to meet the threshold of meaningful control and must be addressed in future policy.
The session revealed a growing recognition that AWS pose serious challenges, not only in terms of international security, but also to human rights, democratic accountability, and the rule of law. The UK has a window of opportunity to play a leading role in shaping a global regulatory framework. The question now is whether it will rise to the occasion.
For more on this issue, see:
- Hansard, 30 June 2025: Autonomous Weapons Systems Debate: https://hansard.parliament.uk/commons/2025-06-30/debates/7DB819A8-C24C-4B2F-8466-8FF6A8FC5338/AutonomousWeaponsSystems
- Human Rights Watch: “Killer Robots Threaten Human Rights” (April 2025): https://www.hrw.org/news/2025/04/28/killer-robots-threaten-human-rights-during-war-peace
- Stop Killer Robots Campaign: https://ukstopkillerrobots.org.uk