Opaque Investments, lethal autonomous weapons and the decay of democracy
January 25, 2022 @ 12:00 - 13:30
Urgent questions need to be asked about the transparency, accountability and governance of our financial ecosystem. From terrorism, human trafficking, drugs and corruption to money-laundering breaches by major banks, dirty money facilitates a wide range of illegal activities, and the UK plays a key enabling role.

Economic crime is rife, but so too is the legal yet murky funding behind an explosion of AI-enabled emerging technologies that is turning the science fiction of killer robots into a frightening reality. This technology is often developed by companies, universities and research institutes unaware that their creations may have unintended military applications. Meanwhile, the UK Government awards contracts to developers who are funding and lobbying parliamentary activity, including on issues relating to the regulation of this very technology.

Against this background, in January 2022 the UK Campaign to Stop Killer Robots, Transparency Taskforce, United Nations Association – UK (UNA-UK) and Women’s International League for Peace and Freedom UK (WILPF-UK) held an event with UK parliamentarians from across the political spectrum to shine a light on the insidious flow of legal money funding potentially harmful tech development and influencing UK policy.

We are delighted that the event saw the participation of multiple members of our network of parliamentary champions, as well as broad cross-party calls for action to address the threat of lethal autonomous weapons. The UK Campaign to Stop Killer Robots, together with UNA-UK, WILPF UK and partners, will continue to raise awareness of the technical, legal, moral and societal concerns relating to autonomous weapons and to advocate for action on this major emergent security threat.

Please email [email protected] if you are interested in finding out more about the campaign, and follow @UK_robots on Twitter to stay in touch.