Daily Bulletin


The Conversation

  • Written by The Conversation
[Image: (Potentially) killer AI tech is already here, built into many less ominous-sounding everyday objects. zen_warden, CC BY-NC-ND]

When the discussion of “autonomous weapons systems” inevitably prompts comparisons to Terminator-esque killer robots, it is perhaps little surprise that a number of prominent academics, technologists and entrepreneurs – including Stephen Hawking, Noam Chomsky, Elon Musk, Google DeepMind’s Demis Hassabis and Apple co-founder Steve Wozniak – signed a letter calling for a ban on such systems.

The signatories wrote of the dangers of autonomous weapons becoming a widespread tool in larger conflicts, or even in “assassinations, destabilising nations, subduing populations and selectively killing a particular ethnic group”. The letter concludes:

The endpoint of this technological trajectory is obvious: autonomous weapons will become the Kalashnikovs of tomorrow. The key question for humanity today is whether to start a global AI arms race or to prevent it from starting.

It’s hard to quibble with such concerns. But it’s important not to reduce the debate to science-fiction Terminator imagery, narcissistically assuming that AI is out to get us. The debate has more important human and political aspects that deserve critical scrutiny.

The problem is that this is not the endpoint, as the signatories write; it is the starting point. The global artificial intelligence arms race has already begun, and its most worrying dimension is that it doesn’t always look like one. The difference between offensive and defensive systems is as blurred as it was during the Cold War, when the doctrine of the pre-emptive strike – attack as the best defence – essentially merged the two. Autonomous systems can be reprogrammed from one to the other with relative ease.

Autonomous systems in the real world

The Planetary Skin Institute and Hewlett-Packard’s Central Nervous System for the Earth (CeNSE) project are two approaches to creating a network of intelligent remote sensing systems that would provide early warning for such events as earthquakes or tidal waves – and automatically act on that information.

Launched by NASA and Cisco Systems, the Planetary Skin Institute strives to build a platform for planetary eco-surveillance, capable of providing data for scientists but also of monitoring extreme weather, carbon stocks and actions that might break treaties, and of identifying all sorts of potential environmental risks. It’s a good idea – yet the hardware, software, design and principles of these autonomous sensor systems and of autonomous weapons are essentially the same. Technology is indifferent to its use: the internet, GPS satellites and many other systems in wide use today were military in origin.

As an independent non-profit, the Planetary Skin Institute’s goal is to improve lives through its technology, claiming to provide a “platform to serve as a global public good” and to work with others to develop other innovations that could help in the process. What it doesn’t mention is the potential for the information it gathers to be immediately monetised, with real-time information from sensors automatically updating worldwide financial markets and triggering automatic buying and selling of shares.

The Planetary Skin Institute’s system offers remote, automated sensing that provides real-time tracking data worldwide – its slogan is “sense, predict, act” – the same principle, in fact, on which an autonomous AI weapons system would work. The letter describes AI as a “third revolution in warfare, after gunpowder and nuclear arms”, but the capacity to build such AI weapons has existed since at least 2002, when drones transitioned from remote-controlled aircraft to smart weapons able to select and fire upon their own targets.

The future is now

Instead of speculating about the future, we should deal with the legacy of autonomous systems inherited from the World War II and Cold War-era complexes linking university, corporate and military research and development. DARPA, the US Defense Advanced Research Projects Agency, is one such Cold War legacy: founded in 1958, it still pursues a very active high-risk, high-gain model of speculative research.

Research and development innovation spreads to the private sector through funding schemes and competitions – essentially the continuation of Cold War programmes through private-sector development. The “security industry” is already tightly and structurally tied to government policy, military planning and economic development. To consider banning AI weaponry is therefore to raise wider questions about political and economic systems that favour military technologies because they are economically lucrative.

Placing the nuclear bomb in its historical context, the author E.L. Doctorow said: “First, the bomb was our weapon. Then it became our foreign policy. Then it became our economy.” We must critically evaluate the same trio as it applies to autonomous weapons development, so that we discuss this trajectory not by obsessing over the technology but over the politics that allow and encourage it.

The authors do not work for, consult for, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointments.

Authors: The Conversation

Read more http://theconversation.com/the-autonomous-killing-systems-of-the-future-are-already-here-theyre-just-not-necessarily-weapons-yet-45453
