Daily Bulletin

The Conversation

  • Written by Toby Walsh, Professor of AI at UNSW, Research Group Leader, Data61

Leading researchers in robotics and artificial intelligence (AI) from Australia and Canada have today published open letters calling on their respective Prime Ministers to take a stand against weaponising AI.

The letters ask that Australia and Canada be the next countries to call for a ban on lethal autonomous weapons at the upcoming United Nations (UN) disarmament conference, the strangely named Conference on the Convention on Certain Conventional Weapons (CCW) to be held in Geneva later this month.

To date, 19 countries have called for a pre-emptive ban on autonomous weapons: Algeria, Argentina, Bolivia, Chile, Costa Rica, Cuba, Ecuador, Egypt, Ghana, Guatemala, Holy See, Mexico, Nicaragua, Pakistan, Panama, Peru, State of Palestine, Venezuela and Zimbabwe.

Read more: No problem too big #1: Artificial intelligence and killer robots

Before Terminator

Lethal autonomous weapons are often described as “killer robots”. This paints a deceptive picture in most people’s minds.

We’re not talking about a movie-style Terminator, but rather much simpler technologies that are potentially only a few years away. Think of a Predator drone flying in the skies above Iraq, but with the human pilot replaced by a computer. The computer would then make the final life-or-death decision to fire the drone’s Hellfire missile.

I’m most worried not about smart AI but stupid AI. We will be giving machines the right to make such life-or-death decisions, but current technologies are not capable of making such decisions correctly.

In the longer term, autonomous weapons will become more capable. But my concern then shifts to how such weapons will destabilise the geopolitical order and ultimately become another weapon of mass destruction.

The Australian letter was released simultaneously with one signed by hundreds of AI experts in Canada, including two pioneers of deep learning, Geoffrey Hinton and Yoshua Bengio. The Canadian letter urges Prime Minister Justin Trudeau to support such a ban.

In the interest of full disclosure, I organised the Australian letter. It is signed by a dozen or so deans and heads of schools, as well as dozens of professors of AI and robotics. In total, 122 faculty members working in AI and robotics in Australia have signed the letter.

The letter says lethal autonomous weapons lacking meaningful human control sit on the wrong side of a clear moral line. It adds:

To this end, we ask Australia to announce its support for the call to ban lethal autonomous weapons systems at the upcoming UN Conference on CCW. Australia should also commit to working with other states to conclude a new international agreement that achieves this objective.

In this way, our government can reclaim its position of moral leadership on the world stage as demonstrated previously in other areas like the non-proliferation of nuclear weapons.

With Australia’s recent election to the UN’s Human Rights Council, the issue of lethal autonomous weapons is even more pressing for Australia to address.

Support is growing

The AI and robotics communities have sent a clear and consistent message over the past couple of years about this issue. In 2015, thousands of AI and robotics researchers from around the world signed an open letter released at the start of the main AI conference calling for a ban.

Most recently, industry joined the call: in August this year, more than 100 founders of AI and robotics companies warned of opening “the Pandora’s box” and asked the UN to take urgent action.

The UN is listening and taking action, though like all things diplomatic, progress is not rapid. In December 2016, after three years of informal talks, the UN decided to begin formal discussions within a Group of Governmental Experts. As the name suggests, this is a group of technical, legal and political experts chosen by the member states to make recommendations about autonomous weapons that could contribute to but not negotiate a treaty banning their use.

This group meets for the first time in Geneva next Monday. It will discuss questions such as whether autonomous weapons should always have “meaningful human control”, and what this means in practice.

An AI arms race

The international non-governmental organisation Human Rights Watch has invited me to the meeting, where I will speak about the dangers of not taking action to ban autonomous weapons. Without a ban, there will be an arms race to develop increasingly capable autonomous weapons.

Read more: Why we signed the open letter from scientists supporting a total ban on nuclear weapons

This has rightly been described as the third revolution in warfare. The first revolution was the invention of gunpowder. The second was the invention of nuclear bombs. This third revolution would be another step change in the speed and efficiency with which we could kill.

For these will be weapons of mass destruction. One programmer will be able to control a whole army. Every other weapon of mass destruction has been banned or is in the process of being banned: chemical weapons and biological weapons are banned, and a nuclear weapons treaty recently reached the 50 signatures required to become law. We must add autonomous weapons to the list of weapons that are morally unacceptable to use.

We cannot stop AI technology being developed. It will be used for many peaceful purposes like autonomous cars. But we can make it morally unacceptable to use to kill, as we have decided with chemical and biological weapons.

This, I hope, will make the world a safer and better place.

Authors: Toby Walsh, Professor of AI at UNSW, Research Group Leader, Data61

Read more http://theconversation.com/dear-prime-minister-wed-like-you-to-join-the-call-for-a-ban-on-killer-robots-86758
