

Social

What can AI researchers do to help prevent Lethal Autonomous Weapons?

Luisa Zintgraf, Jakob Foerster, Maximilian Igl, Christian Schroeder de Witt


Abstract:

Global challenges like climate change, human rights, and the ongoing pandemic urgently require international cooperation and coordination. Instead, the world is currently being destabilized by a new arms race fuelled by automation and remote-controlled weapons such as drones and robots. Recent advances in AI research have drastically reduced the technological barrier to fully automating these weapons, blurring the line between armed drones and lethal autonomous weapon systems (LAWS). LAWS have been recognised as a substantial danger to humanity by many institutions and individuals in the AI community (see this open letter).

Despite considerable campaign efforts, the automation of warfare is progressing quickly, with disastrous consequences. As AI researchers, we have the opportunity to make a difference: the military sector depends on civilian research, and we can make our voices heard in public. But we need to do more, and do it fast.

The aim of this Social is to provide information and discuss how we, as a community, can come together to prevent the further development and proliferation of LAWS. To that end, we also want to discuss the necessity of internationally controlling, disarming, and banning remote-controlled weapons such as drones. The Social will consist of talks and a panel discussion with researchers and activists, followed by breakout discussions.

Our invited speakers and panelists are Stuart Russell, Mary Wareham, Ousman Noor, and Meredith Whittaker.

Note: We start in Zoom (for the talks and panel) and move to gather.town (for group discussions) afterwards.
