Researchers call for ‘global stand’ against AI militarisation

Researchers have sent an open letter to governments calling for a “firm global stand” against the weaponisation of AI.

History shows that humans will find ways to turn technological advances to destructive ends. The obvious example, and one that will hopefully be remembered by global powers as they make their decisions, is the nuclear bomb.

AI could become the ‘nuclear arms race’ of the 21st century as nations compete to prove their supremacy. There’s great potential for AI in a defensive capacity, such as shooting down incoming missiles, but there should be concern about its use for direct offence or automated retaliation.

Researchers from Australia and Canada have issued an open letter to the respective prime ministers of each country, Malcolm Turnbull and Justin Trudeau, urging their support in taking a global stand against the weaponisation of AI.

“Our government can reclaim its position of moral leadership on the world stage as demonstrated previously in other areas like the non-proliferation of nuclear weapons,” reads the letter.

The United Nations is due to hold a conference on the Convention on Certain Conventional Weapons (CCW), which was concluded in Geneva on October 10th, 1980, and entered into force in December 1983. The convention seeks to prohibit or restrict the use of certain conventional weapons which are considered excessively injurious or whose effects are indiscriminate.

The letter is signed by more than 300 robotics and AI experts and warns that removing human control from lethal autonomous systems would cross a “clear moral line”. Some people criticise remote-controlled military drones for the risk that targets are reduced to pixels on a screen, but removing human input altogether could be catastrophic.

“As many AI and robotics corporations—including Australian companies—have recently urged, autonomous weapon systems threaten to become the third revolution in warfare. If developed, they will permit armed conflict to be fought at a scale greater than ever before, and at timescales faster than humans can comprehend.”

“The deadly consequence of this is that machines—not people—will determine who lives and dies. Australia’s AI community does not condone such uses of AI. We want to study, create and promote its beneficial uses.”

The UN’s conference on the CCW was initially scheduled to take place in August, but was rescheduled due to a lack of funding. Talks will now begin from November 14th.

Are you concerned about AI becoming militarised? Share your thoughts in the comments.
