A group of concerned scientists, researchers and academics, including Elon Musk and Stephen Hawking, has warned that a military artificial intelligence arms race could soon develop if preventative measures are not taken.
A global arms race is "virtually inevitable" if any major military power pushes ahead with AI weapons development, the group cautioned in an open letter presented at the International Joint Conferences on Artificial Intelligence in Buenos Aires.
"The stakes are high," the letter said. "Autonomous weapons have been described as the third revolution in warfare, after gunpowder and nuclear arms."
Other notable tech figures, including Apple co-founder Steve Wozniak and Google DeepMind CEO Demis Hassabis, are also signatories to the letter.
Autonomous weapons -- think pistol-toting Terminators, smart vehicles with mounted machine guns, and self-piloted bomber drones -- aren't just the stuff of science fiction. As the letter notes, some weapons systems are "feasible within years, not decades."
"[The weapons] require no costly or hard-to-obtain raw materials, so they will become ubiquitous and cheap for all significant military powers to mass-produce," the letter states.
"Autonomous weapons are ideal for tasks such as assassinations, destabilizing nations, subduing populations and selectively killing a particular ethnic group," it continues. "We therefore believe that a military AI arms race would not be beneficial for humanity."
Today's high-tech warfare is mostly waged via remote-controlled machines. The U.S. military currently has MADSS, a 1,400-pound rover that carries gear and fires a machine gun. It also has the Protector, a 1,000-pound rover that scans for bombs and fires a bazooka.
But militaries are already experimenting with automated systems. The Israeli "Iron Dome" system detects and shoots down incoming rockets. The "Phalanx CIWS" system used on U.S. naval combat ships does the same with a swiveling Gatling gun, and the truck-mounted C-RAM system provides similar defense on land.
In April, Harvard Law School and Human Rights Watch jointly published a report calling for a ban on autonomous weapons.
-- Jose Pagliery contributed reporting.