If any major military moves ahead with developing artificial intelligence weapons, a global arms race of robotic killing machines is virtually inevitable, according to an open letter that Stephen Hawking, Elon Musk and hundreds of other robotics researchers signed.
Autonomous weapons, those not controlled by humans, include weapons such as armed quadcopters that can search for and kill people meeting at a certain place at a certain time. They do not include cruise missiles or remotely piloted drones, which humans control.
"It will only be a matter of time until [autonomous weapons] appear on the black market and in the hands of terrorists, dictators wishing to better control their populace, warlords wishing to perpetrate ethnic cleansing. ..." warned the letter from the Future of Life Institute, which cautioned against artificial intelligence being pursued as a means of violence.
In many ways, autonomous weapons are more dangerous than nuclear missiles, according to the signatories, and would likely become the weapon of choice for oppressive governments and terrorists.
Unlike nuclear weapons, autonomous weapons don't require expensive or hard-to-obtain materials, so they can easily be mass-produced, cheap and ubiquitous, according to the letter.
Their ability to be mass-produced at a cheap cost means that "autonomous weapons are ideal for tasks such as assassinations, destabilizing nations, subduing populations and selectively killing a particular ethnic group."
Musk, co-founder of Tesla and PayPal and founder of SpaceX, is an investor in artificial intelligence company Vicarious. But he has also been public about his fears of the possible dangers of AI. His signing of the open letter is only the latest in a series of statements, which include a conversation with CNBC on the issue and a later-deleted Reddit post that read, "I am not alone in thinking we should be worried."
Hawking, a physicist, cosmologist and author, also expressed his concerns publicly before signing the open letter. "Success in creating AI would be the biggest event in human history," he wrote in a co-authored column last year. "Unfortunately, it might also be the last, unless we learn how to avoid the risks."