18th July 2018

Leading AI companies and researchers pledge to not develop lethal autonomous weapons

More than 2,400 researchers, scientists, engineers, entrepreneurs and others have signed a pledge – organised by the Future of Life Institute (FLI) – promising not to develop lethal autonomous weapons.


In addition to many prominent individuals, the list of signatories also includes over 160 AI-related firms and organisations from around the world – such as Google DeepMind, XPRIZE Foundation, University College London, the European Association for AI (EurAI), Swedish AI Society (SAIS), ClearPath Robotics and OTTO Motors.

The pledge is being announced today at the annual International Joint Conference on Artificial Intelligence (IJCAI) in Sweden, which draws over 5,000 of the world’s leading AI researchers. It states:

Artificial intelligence (AI) is poised to play an increasing role in military systems. There is an urgent opportunity and necessity for citizens, policymakers, and leaders to distinguish between acceptable and unacceptable uses of AI.

In this light, we the undersigned agree that the decision to take a human life should never be delegated to a machine. There is a moral component to this position, that we should not allow machines to make life-taking decisions for which others – or nobody – will be culpable. There is also a powerful pragmatic argument: lethal autonomous weapons, selecting and engaging targets without human intervention, would be dangerously destabilizing for every country and individual. Thousands of AI researchers agree that by removing the risk, attributability, and difficulty of taking human lives, lethal autonomous weapons could become powerful instruments of violence and oppression, especially when linked to surveillance and data systems. Moreover, lethal autonomous weapons have characteristics quite different from nuclear, chemical and biological weapons, and the unilateral actions of a single group could too easily spark an arms race that the international community lacks the technical tools and global governance systems to manage. Stigmatizing and preventing such an arms race should be a high priority for national and global security.

We, the undersigned, call upon governments and government leaders to create a future with strong international norms, regulations and laws against lethal autonomous weapons. These currently being absent, we opt to hold ourselves to a high standard: we will neither participate in nor support the development, manufacture, trade, or use of lethal autonomous weapons. We ask that technology companies and organizations, as well as leaders, policymakers, and other individuals, join us in this pledge.

Max Tegmark, president of the Future of Life Institute (FLI), commented: “I’m excited to see AI leaders shifting from talk to action, implementing a policy that politicians have thus far failed to put into effect. AI has huge potential to help the world – if we stigmatise and prevent its abuse. AI weapons that autonomously decide to kill people are as disgusting and destabilising as bioweapons, and should be dealt with in the same way.”

“We cannot hand over the decision as to who lives and who dies to machines. They do not have the ethics to do so,” explains Toby Walsh, Professor of Artificial Intelligence at the University of New South Wales in Sydney, another key organiser of the pledge. “I encourage you and your organisations to pledge to ensure that war does not become more terrible in this way.

“We need to make it the international norm that autonomous weapons are not acceptable,” he adds. “A human must always be in the loop. We cannot stop a determined person from building autonomous weapons, just as we cannot stop a determined person from building a chemical weapon. But if we don’t want rogue states or terrorists to have easy access to autonomous weapons, we must ensure they are not sold openly by arms companies.”

A UN meeting on Lethal Autonomous Weapons Systems (LAWS) is being held next month, and signatories hope the pledge will encourage lawmakers to commit to a formal international agreement between countries.
