Under existing laws, computer programmers, manufacturers and military commanders all would escape liability for deaths caused by such fully autonomous weapons, according to a study published Thursday by Human Rights Watch and Harvard Law School.
“Fully autonomous weapons do not yet exist,” the report says. “But technology is moving in their direction, and precursors are already in use or development. For example, many countries use weapons defense systems — such as the Israeli Iron Dome and the U.S. Phalanx and C-RAM — that are programmed to respond automatically to threats from incoming munitions.
“The lack of meaningful human control places fully autonomous weapons in an ambiguous and troubling position. On the one hand, while traditional weapons are tools in the hands of human beings, fully autonomous weapons, once deployed, would make their own determinations about the use of lethal force.”
The report is being released in the run-up to an international meeting on lethal autonomous weapons systems at the U.N. in Geneva starting April 13.