Elon Musk and 115 robotics experts pen letter to the UN “raising the alarm” on killer robots

Back in 2015, more than 1,000 academics sent an open letter calling for a ban on “offensive autonomous weapons beyond meaningful human control.” A ban has yet to materialise, so just as “Do They Know It’s Christmas?” gets sporadically covered every few years, another open letter has gone out to tie in with UN talks to discuss the future of autonomous weapons.


It’s shorter, and there are fewer signatories – perhaps because of the time pressures involved for publication to match the UN’s own schedules, or perhaps because getting the talks started in the first place was seen as victory enough for many academics. Either way, there are 116 signatories from 26 countries this time, once again including Tesla CEO Elon Musk (who is outspoken on the potential dangers of AI) and Google DeepMind’s Mustafa Suleyman – both veterans of the last letter.

While the original 2015 letter called for an outright ban on autonomous weapons, the new letter stops short, leaving a lot to the imagination. “We do not have long to act,” the letter reads. “Once this Pandora’s box is opened, it will be hard to close. We therefore implore the High Contracting Parties to find a way to protect us all from these dangers.” Though it’s hard to think of a way of protecting us from the dangers of autonomous weapons that doesn’t involve an outright ban, it’s slightly odd that the letter doesn’t explicitly call for one.

The 123 nations that make up the UN’s Convention on Certain Conventional Weapons voted to formally discuss the issue of autonomous weapons last December, although the British government has previously been resistant to the idea of a complete ban, stating as recently as 2015 that “we do not see the need for a prohibition on the use of LAWS [lethal autonomous weapons systems], as international humanitarian law already provides sufficient regulation for this area.”

There are very few parallel universes where such a justification would be classed as “reassuring”, but there we are. The debate is pushing onward, and the robotics experts will hope that their views are taken on board – otherwise, they’ll likely be contributing to the “third revolution in warfare” whether they like it or not.

The letter is published in full below.


As companies building the technologies in Artificial Intelligence and Robotics that may be repurposed to develop autonomous weapons, we feel especially responsible in raising this alarm. We warmly welcome the decision of the UN’s Conference of the Convention on Certain Conventional Weapons (CCW) to establish a Group of Governmental Experts (GGE) on Lethal Autonomous Weapon Systems. Many of our researchers and engineers are eager to offer technical advice to your deliberations.

We commend the appointment of Ambassador Amandeep Singh Gill of India as chair of the GGE. We entreat the High Contracting Parties participating in the GGE to work hard at finding means to prevent an arms race in these weapons, to protect civilians from their misuse, and to avoid the destabilizing effects of these technologies. We regret that the GGE’s first meeting, which was due to start today (August 21, 2017), has been cancelled due to a small number of states failing to pay their financial contributions to the UN. We urge the High Contracting Parties therefore to double their efforts at the first meeting of the GGE now planned for November.

Lethal autonomous weapons threaten to become the third revolution in warfare. Once developed, they will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend. These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways. We do not have long to act. Once this Pandora’s box is opened, it will be hard to close. We therefore implore the High Contracting Parties to find a way to protect us all from these dangers.

