there are many different arguments people make. one is a moral one: that it is just morally unacceptable to turn over to a machine the decision to kill a human being. you can basically launch weapons by the million, enough to kill half a city. the bad half. type in a rough description of the mission, you know, "wipe out everyone in this city between the ages of 12 and 60," just characterise them, release the swarm and rest easy. so you create this weapon of mass destruction that is more effective than nuclear weapons, cheaper, easier to build, easier to proliferate, and doesn't leave behind a huge radioactive smoking crater. is the answer to always keep a human in the loop? and the problem with that is: which human? i think the answer is yes, to disallow attacks