
There are generally two broad views on the matter:

- LAWS should be put in the same category as biological and chemical weapons and comprehensively and pre-emptively banned.
- LAWS should be put in the same category as precision-guided weapons and regulated.

The Campaign to Stop Killer Robots (CSKR) argues for a ban on LAWS similar to the ban on blinding lasers in Protocol IV of the CCW and the ban on anti-personnel landmines in the Ottawa Treaty. They argue that killer robots must be stopped before they proliferate and that tasking robots with human destruction is fundamentally immoral.

Others disagree, such as Professor Ron Arkin of Georgia Tech in the US, who argues that robots should be regarded more as the next generation of “smart” bombs. They are potentially more accurate, more precise, completely focused on the strictures of International Humanitarian Law (IHL) and thus, in theory, preferable even to human war fighters, who may panic, seek revenge or just plain stuff up. Malaysian Airlines flight MH17, after all, appears to have been shot down by “meaningful human control”.

Only five nations currently support a ban on LAWS: Cuba, Ecuador, Egypt, Pakistan and the Holy See. None are known for their cutting-edge robotics. Japan and South Korea, by contrast, have big robotics industries. South Korea has already fielded the Samsung SGR-A1 “sentry robots” on its border with North Korea.

Not everyone is thrilled about the idea of allowing autonomous weapons systems loose on, or off, the battlefield. Global Panorama/Flickr, CC BY-SA

Definitions

At the end of last year’s meeting, most nations were non-committal. There were repeated calls for better definitions and more discussions, such as from Sweden, Germany, Russia and China.

Few nations have signed up to the CSKR’s view that “the problem” has to be solved quickly before it is too late. Most diplomats are asking: what exactly would they like to ban, and why?

The UK government has suggested that existing international humanitarian law provides sufficient regulation. The British interest is that BAE Systems is working on a combat drone called Taranis, which might be equipped with lethal autonomy and replace the Tornado.

LAWS are already regulated by existing International Humanitarian Law. According to the Red Cross, no expert disputes this. LAWS that cannot comply with IHL principles, such as distinction and proportionality, are already illegal. LAWS are already required to go through Article 36 review before being fielded, just like any other new weapon.

As a result, the suggestion by the CSKR that swift action is required is not, as yet, gaining diplomatic traction. As their own compilation report shows, most nations have yet to grasp the issue, let alone commit to policy.

The real problem for the CSKR is that a LAWS is a combination of three hard-to-ban components:

- Sensors (such as radars), which have legitimate civilian uses
- Computer software that targets humans, which is not much different from “non-lethal” cognition (i.e. computer software that targets “virtual” humans in a video game)
- Weapons (such as Hellfire missiles), which can also be directly controlled by a human “finger on the button” and are not banned per se

Autonomous smart sentry gun software

The problem is that everything in a LAWS is dual-use – the “autonomy” can be civilian and the lethal weapons can be human-operated, for example. Japan has already indicated it will oppose any ban on “dual-use” components of a LAWS.

Close-In Weapon Systems already autonomously react to and shoot down incoming missiles without requiring a human to pull the trigger. What has to be regulated or banned is a combination of components, not any one core component.

The phrase “meaningful human control” has been articulated by numerous diplomats as a desired goal of regulation. There is much talk of humans and “loops” in the LAWS debate:

Human “in the loop”: the robot makes decisions according to human-programmed rules, a human hits a confirm button and the robot strikes. Examples are the Patriot missile system and Samsung’s SGR-A1 in “normal” mode.
