Not really in left field at all. I have seen CIA recruiting ads on TV very late at night that use a representation of a mechanical "dragonfly" to represent a "bug" (as in a listening device). I have seen some of the unclassified research and do not believe such a thing to be far-fetched at all. In fact, I would not be surprised to find out that some of this "near future" technology already exists.

You military folks here (especially Garand), what would be the effect of the detonation of, say, 6 oz. or so of Comp B or Comp C, or even a half stick of dynamite? Assume it is contained in a metallic or nonmetallic casing designed to fragment on detonation in a way that damages surrounding personnel. I chose 6 oz. arbitrarily, from what little I know of those types of explosives and with the idea that such a weight would be easily carried by a very small mobile device. It occurs to me that a remarkably small robotic vehicle could transport that amount (or possibly much more) into position to do a lot of damage to personnel, or decimate a forward position, without being noticed in the chaos of a combat action. Even a busy base camp situation might lend itself well to such a combat op.
I too have heard of dissociation via the instruments of war. There are parallels in everyday civilian life: "We didn't mistype that word, the computer screwed it up. It was the slick road, not the guy weaving in and out of traffic at 90 mph, that caused the accident." That sort of thing. It is human nature not to take responsibility for bad things happening if we can avoid it, and this is especially true in situations that result in the loss of life and limb.

Very few troops in a combat situation actually want to kill the enemy; they want to survive the combat. That puts their basic survival instinct in direct opposition to their moral imperative to avoid taking human life. Generally, the will to live must triumph if the soldier is to survive, though it does not always. The common, though not universal, outcome is for the soldier to transfer the "blame" for the casualties he inflicted onto whatever he can, including the enemy. The other, also common, result is for the soldier to come to terms with the situation and learn to live with the justified taking of life. That is not easy in the best, most clear-cut of circumstances, and it is usually not a fast process either. Setting aside texts that outright declare all taking of life wrong, I know of no place in any religious text where it is written that killing in declared combat is wrong, nor any place in those same texts where killing in self-defense is declared wrong. For all of that, however, our morality generally refuses to take the act lightly, nor will it allow us to.

This "terminator" I was theorizing about above would be unique in the arsenal of weapons. It would walk like a human, talk like a human, look like a human, and act like a human; but it would not be a human. It strikes me that these characteristics would combine to make it especially easy to blame the machine over the controller. If such a machine were programmed with a true AI that allowed it to act on its orders independently of active human control, it would be easier still, and even more compelling, to blame the "thinking machine."