AI experts boycott South Korean university over “killer robots”
Artificial intelligence experts from all over the world have called for a boycott of the Korea Advanced Institute of Science and Technology (KAIST), claiming its actions could lead to a “third revolution in warfare”.
The South Korean institute has partnered with defense manufacturer Hanwha Systems, the country’s leading arms company. A letter signed by more than 50 academics decries the opening of a “Research Center for the Convergence of National Defense and Artificial Intelligence”, warning that it will accelerate the arms race towards autonomous weapons, or “killer robots”.
The researchers and engineers say they will boycott all collaboration with KAIST until the university’s president provides assurances that the new center will not develop autonomous weapons.
“If developed, autonomous weapons will be the third revolution in warfare,” the letter reads. “They will permit war to be fought faster and at a scale greater than ever before. They have the potential to be weapons of terror. Despots and terrorists could use them against innocent populations, removing any ethical restraints.
“This Pandora’s box will be hard to close if it is opened. As with other technologies banned in the past like blinding lasers, we can simply decide not to develop them. We urge KAIST to follow this path, and work instead on uses of AI to improve and not harm human lives.”
READ NEXT: Will killer robots make us safer?
Speaking to The Guardian, Toby Walsh, the organiser of the boycott and a professor at the University of New South Wales, says that autonomous weapons “would make the security situation on the Korean peninsula worse, not better”.
“If these weapons get made anywhere, eventually they would certainly turn up in North Korea, which would have no qualms about using them against the South,” he warns.
The letter comes ahead of a United Nations meeting on autonomous weapons, due to take place next week in Geneva. At the convention, experts from across the globe will discuss how best to contain the threat posed to international security by the development of AI-controlled weapons, with fears that the creation of “killer robots” will create new forms of warfare.
South Korea has so far been a leader in developing these controversial systems. Lethal autonomous weapons like the Samsung SGR-A1, for example, are deployed along the demilitarised zone between North and South Korea. Russia has also made a number of advances in unmanned weaponry in recent years, such as with its grenade launcher-carrying Nerehta ground unit.
KAIST’s president Sung-Chul Shin responded to the boycott with a denial that the institute is developing lethal autonomous systems: “I would like to reaffirm that KAIST does not have any intention to engage in development of lethal autonomous weapons systems and killer robots,” he said in a statement.
The academics’ letter is not the first warning from researchers over the threat of autonomous weapons. In 2015, a group of AI experts – including the late Stephen Hawking – similarly called for a “third revolution in warfare” to be stopped before it went too far: “Starting a military AI arms race is a bad idea, and should be prevented by a ban on offensive autonomous weapons beyond meaningful human control,” the open letter concluded.