Stephen Hawking, Elon Musk warn of AI robot war


Famed physicist Stephen Hawking, SpaceX and Tesla Motors CEO Elon Musk and more than 1,000 other artificial intelligence experts published a letter Monday warning of the deadly consequences if an "autonomous weapons" arms race is sparked.

Weapons controlled by artificial intelligence (AI) would select and engage targets without direct human involvement, unlike cruise missiles or unmanned drones, which still require humans to make targeting decisions. If a large military power began developing AI weapons – technology the letter's authors claim is merely years away, not decades – the consequences would be dire.

"Autonomous weapons have been described as the third revolution in warfare, after gunpowder and nuclear arms," according to the letter, which was presented Monday at the International Joint Conference on Artificial Intelligence in Buenos Aires.

Other notable signatories include Apple co-founder Steve Wozniak, DeepMind co-founder Demis Hassabis, political philosopher Noam Chomsky and Google's director of research, Peter Norvig.

Unlike nuclear, chemical or biological weapons, AI weaponry could be cheaply mass produced and accessible to state and non-state actors alike. The authors believe that autonomous weapons could easily end up on the black market and, from there, in the hands of terrorist groups.

"If any major military power pushes ahead with AI weapon development," the authors explain, "a global arms race is virtually inevitable, and the endpoint of this technological trajectory is obvious: autonomous weapons will become the Kalashnikovs of tomorrow."

The authors warn that AI weapons would be ideal for destabilizing tasks such as assassinations, genocide and subduing populations.

The authors argue that the world's militaries must commit to stopping an AI arms race before it starts. Military uses of AI, they add, would most likely provoke a massive public backlash that would tarnish the technology's beneficial applications.

"Starting a military AI arms race is a bad idea," the authors conclude, "and should be prevented by a ban on offensive autonomous weapons beyond meaningful human control."