Daily Bulletin

The Conversation

  • Written by The Conversation Contributor

Diplomats from around the world met in Geneva last week for the United Nations' third Informal Expert Meeting on lethal autonomous weapons systems (LAWS), commonly dubbed “killer robots”.

Their aim was to make progress on deciding how, or if, LAWS should be regulated under international humanitarian law.

A range of views was expressed at the meeting, from Pakistan favouring a full ban, to the UK favouring no new regulation for LAWS, with several positions in between.

Despite the range of views on offer, there was some common ground.

It is generally agreed that LAWS are governed by international humanitarian law. For example, robots cannot ignore the principles of distinction between civilians and combatants, or proportionality in the scale of attack.

Human commanders would also have command responsibility for their robots, just as they do for their service men and women. Robots cannot be lawfully used to perpetrate genocide, massacres and war crimes.

Beyond that, there are broadly four positions that the various nations took.

Position 1: Rely on existing laws

The UK’s position is that existing international humanitarian law is sufficient to regulate emerging technologies in artificial intelligence (AI) and robotics.

The argument is that international humanitarian law was sufficient to regulate aeroplanes and submarines when they emerged, and it will cope with many kinds of LAWS too. This would include Predator drones with an “ethical governor” – software designed to determine whether a strike conforms with the specified rules of engagement and international humanitarian law – or autonomous anti-submarine warfare ships, such as the US Navy’s experimental autonomous Sea Hunter.
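The idea of an “ethical governor” – a software check that vets a proposed strike against encoded rules before release is permitted – can be illustrated with a minimal sketch. All names, fields and thresholds below are hypothetical simplifications; real systems are classified and vastly more complex.

```python
# Minimal, hypothetical sketch of an "ethical governor": software that
# vets a proposed strike against encoded rules of engagement before
# weapon release is permitted. Fields and logic are illustrative only.
from dataclasses import dataclass

@dataclass
class ProposedStrike:
    target_is_combatant: bool      # principle of distinction
    expected_civilian_harm: float  # estimated incidental harm
    military_advantage: float      # anticipated military gain

def governor_permits(strike: ProposedStrike) -> bool:
    """Return True only if the strike passes every encoded rule."""
    if not strike.target_is_combatant:
        return False               # fails distinction
    if strike.expected_civilian_harm > strike.military_advantage:
        return False               # fails proportionality
    return True
```

The point of such a governor is that it can only veto, not originate, an attack: the rules it applies are fixed by humans in advance, which is why Position 1 treats such systems as already covered by existing law.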

Position 2: Ban machine learning

The French delegation said a ban would be “premature” and that they are open to accepting the legality of an “off the loop” LAWS with a “human in the wider loop”. This means the machine can select targets and fire autonomously, but humans still set the rules of engagement.

However, they were open to regulating machine learning in “off the loop” LAWS (which do not yet exist). Thus, they might support a future ban on any self-learning AI – similar to AlphaGo, which recently beat the human world Go champion – in direct control of missiles without humans in the wider loop. The main concern is that such AIs might be unpredictable.

Position 3: Ban ‘off the loop’ with a ‘human in the wider loop’

The Dutch and Swiss delegations suggested “off the loop” systems with a “human in the wider loop” could comply with international humanitarian law, exhibit sufficiently meaningful human control and meet the dictates of the public conscience.

The UK, France and Canada spoke against a ban on such systems.

Advocates of such robotic weapons claim they could be morally superior to human soldiers because they would be more accurate, more precise and less prone to bad decisions caused by panic or revenge.

Opponents argue they could mistarget in cluttered or occluded environments and are morally unacceptable.

For example, the Holy See and 13 other nations think a real-time human intervention in the decision to take life is morally required, so there must always be a human in the loop.

This position requires exceptions for already fielded “defensive” weapons such as the Phalanx Close-In Weapon System, and long-accepted “off the loop” weapons such as naval mines, which have existed since the 1860s.

Position 4: Ban ‘in the loop’ weapons

Pakistan and Palestine would support any measure broad enough to ban telepiloted drones. However, most nations see this as beyond the scope of the LAWS debate, since humans make the decisions to select and engage targets, even though many agree drones have been a human rights disaster.

[Image: The Northrop Grumman X-47A Pegasus drone, being trialled by the US Navy. Credit: DARPA]

Defining lines in terms of Turing

Formally, an AI is a Turing machine that mechanically applies rules to symbolic inputs to generate outputs.

A ban on machine learning LAWS is a ban on AIs that update their own rule book for making lethal decisions. A ban on “wider loop” LAWS is a ban on AIs making lethal decisions from a human-written rule book. A ban on “in the loop” LAWS is a ban on using human-piloted robots as weapons at all.
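The rule-book distinction drawn above can be sketched in code. This is a hypothetical toy, not any real weapon system: the first function stands in for a fixed, human-written rule book, while the class stands in for a machine-learning system that rewrites its own rule – the kind of self-updating behaviour Position 2 would ban.

```python
# Hypothetical sketch of the "rule book" distinction.

# A fixed, human-written rule book: the AI only applies it, and its
# behaviour never drifts from what humans specified.
def fixed_rulebook(target: str) -> bool:
    return target == "combatant"

# A self-updating rule book (machine learning): the system adjusts its
# own decision rule from feedback, so its behaviour can diverge from
# anything a human originally wrote down.
class LearningRulebook:
    def __init__(self) -> None:
        self.threshold = 0.5

    def decide(self, score: float) -> bool:
        return score > self.threshold

    def update(self, feedback: float) -> None:
        # The machine rewrites its own rule.
        self.threshold += 0.1 * feedback
```

The concern about unpredictability follows directly: after enough `update` calls, no human can say in advance what the learning system will decide, whereas the fixed rule book remains auditable.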

Opinions also differ as to whether control of decisions by Turing computation qualifies as meaningful or human control.

Next steps

The Geneva meeting was an informal expert meeting to clarify definitions and gain consensus on what (if anything) might be banned or regulated in a treaty. As such, there were no votes on treaty wording.

The most likely outcome is the setup of a panel of government experts to continue discussions. AI, robotics and LAWS are still being developed. As things stand, the world is at Position 1: relying on existing international humanitarian law.

Provided an AlphaGo in charge of missiles complied with principles like discrimination and proportionality, it would not be clearly illegal, just arguably so.


Read more http://theconversation.com/world-split-on-how-to-regulate-killer-robots-57734
