Universities have a role to play in stopping the development of Killer Robots
Keeping Ctrl

Universities must not be part of a Killer Robots pipeline
Universities are hugely important in shaping society. They train future generations, pass on knowledge and play a key role in driving innovation. Many important innovations used in everyday life, from seatbelts to touchscreens, come from university research, illustrating the many positive impacts and applications university research can have. University collaboration with the military is not necessarily problematic, but institutions must safeguard their innovations to prevent them from being used for inhumane technology such as lethal autonomous weapons.

Killer Robots
Lethal autonomous weapon systems are weapons that can select and attack individual targets without meaningful human control.

This means the decision on whether a weapon should deploy lethal force is delegated to a machine, a development that would have enormous negative effects on the way war is conducted.
On the one hand, universities play an important role in the development of new technologies that can have significant implications for international security. This includes developing technologies that could play a key role in lethal autonomous weapons. On the other hand, scientists can play an important part in preventing this from happening.

Do you know whether your university is involved in killer robots technology?

Would you speak out to stop them?

What Universities Can Do

  • Commit publicly to not contributing to the development of lethal autonomous weapons.

  • Establish a clear policy stating that the university will not contribute to the development or production of lethal autonomous weapon systems, and include measures for implementing it.

  • Ensure university staff and researchers are fully aware of what precisely their technology is being used for, understand the possible implications of their work, and are able to openly discuss any related concerns.
LETHAL AUTONOMOUS WEAPONS PLEDGE
"We the undersigned agree that the decision to take a human life should never be delegated to a machine..."
Some of the scientists among the over 247 organizations and 3,253 individuals who have signed this pledge to date:
Max Tegmark
Professor of Physics, MIT; Future of Life Institute
Allison M. Okamura
Professor of Mechanical Engineering, Stanford University, Fellow, Institute of Electrical and Electronics Engineers (IEEE)
Mustafa Suleyman
Co-Founder, DeepMind
Meredith Whittaker
Co-founder, AI Now Institute; Research Scientist, New York University
Sign up to be part of this campaign to stop killer robots.

PAX cares about your privacy. We use your information for our administration, to keep you informed on the progress of this campaign, to ask for your support for future and urgent campaigns and actions, and occasionally to ask you for other means of support. You can opt out of the mailing list through a link in each email. You can retrieve, modify, or delete your information at any time through your PAX profile, or via info@paxforpeace.nl. Your information will not be shared with third parties.

The new PAX report 'Conflicted Intelligence' warns of the dangers of university AI research and partnerships, and outlines how universities can help prevent the development of lethal autonomous weapons.
Download the Report
Raise awareness
Download and share these images on social media:
Interested in our earlier reports on what states and tech companies are doing?

Read the States Report

Read the Tech Companies Report

Read the Weapons Producers Report
About PAX
PAX works on a wide range of disarmament issues, including the arms trade, nuclear weapons, armed drones, and the link between the financial sector and arms producers. PAX is a co-founder and steering committee member of the Campaign to Stop Killer Robots.
PAX and Killer Robots