Robo-Snipers, "Auto Kill Zones" to Protect Israeli Borders

Cooch

Active Member
The soldier was punished for his actions by an Israeli court. What he did was wrong; his comrades said so, and a judge confirmed it. ...

Which - in all fairness - should demonstrate to any fair-minded person that it is not Israeli policy to deliberately shoot children. We only wish that Hamas would demonstrate such a commitment to ethics.

Now getting back on topic, I suggest that this kind of system is little more than a form of minefield, with the advantage that it allows an increased degree of human control and a greatly decreased potential to leave buried or abandoned hardware in the ground that can destroy a non-combatant life a couple of decades after the need for such defenses has passed.
Is this a bad thing?

Peter
 

eaf-f16

New Member
Which - in all fairness - should demonstrate to any fair-minded person that it is not Israeli policy to deliberately shoot children. We only wish that Hamas would demonstrate such a commitment to ethics.
The article clearly says he was charged only with minor infractions. For a man who, without a minute's thought, murdered what he knew was a little girl who posed no threat to Israel whatsoever...:rolleyes:

Also, there was another shooting incident a couple of days ago where IDF soldiers opened fire on peaceful demonstrations and killed a little boy.

I'm not saying it is Israeli government policy to shoot children but they certainly don't seem to care if it happens.

Now getting back on topic, I suggest that this kind of system is little more than a form of minefield, with the advantage that it allows an increased degree of human control and a greatly decreased potential to leave buried or abandoned hardware in the ground that can destroy a non-combatant life a couple of decades after the need for such defenses has passed.
Is this a bad thing?

Peter
I don't agree. I think a system controlled by humans would be much safer than one that just kills anything it detects (like mines). The bad thing about mines is that you can't make them follow a set of ROEs.

I doubt you can program ROEs into a robot and expect it to follow them reliably without glitches or human intervention.

Even the human-controlled robots currently in service with the US Army regularly suffer glitches and "delayed" movements, which would pose serious risks if they were armed and introduced into populated areas.

Hence the US Army's decision not to field killer robots.
 

DarthAmerica

Defense Professional
Verified Defense Pro
The article clearly says he was charged only with minor infractions. For a man who, without a minute's thought, murdered what he knew was a little girl who posed no threat to Israel whatsoever...:rolleyes:

Also, there was another shooting incident a couple of days ago where IDF soldiers opened fire on peaceful demonstrations and killed a little boy.

I'm not saying it is Israeli government policy to shoot children but they certainly don't seem to care if it happens.
I'm not defending the death of any innocent. However, it happens. In war, if you get attacked by children, and you can google pics of young Arabs doing that, you start to view things from a survival point of view. It's one thing to debate it here from the moral high ground; quite another to live it. Just hope you have a good unit chaplain and a reasonable JAG!



I don't agree. I think a system controlled by humans would be much safer than one that just kills anything it detects (like mines). The bad thing about mines is that you can't make them follow a set of ROEs.
Neither is true. If you said it's more reassuring rather than safer, you would be right.

I doubt you can program ROEs into a robot and expect it to follow them reliably without glitches or human intervention.
The same is true of humans, which is why most of us are supervised while working. Machines are no different.

Even the human-controlled robots currently in service with the US Army regularly suffer glitches and "delayed" movements, which would pose serious risks if they were armed and introduced into populated areas.
Trivial engineering issues once identified. Trivial.

Hence the US Army's decision not to field killer robots.
Also, a very false assumption. They are out there.

-DA
 

eaf-f16

New Member
I'm not defending the death of any innocent. However, it happens. In war if you get attacked by children, and you can google pic of young Arabs doing that, you start to view things from a survival point of view. It's one thing to debate it here on a moral high ground. Quite another to live it. Just hope you have a good unit chaplain and a reasonable JAG!
I wasn't claiming the moral high ground. I can't, seeing as Egyptians aren't even militarily participating in the Palestinian-Israeli conflict.

If you think I meant to say that the Palestinians are any better, you're wrong. I just said the IDF can't claim to be moral either.

There are certain circumstances in which civilian deaths are understandable. Those circumstances weren't present in either shooting.

In one incident the target was identified as a non-threatening girl of about 10 years of age. The other was a demonstration by Westerners and Palestinians in which the IDF used live fire.

Both not true. If you said that it's more reassuring rather than safer you would be right.
I think you misunderstood what I said.

Wouldn't it be safer to have a person determine what to shoot, through a screen with sensor imagery and a set of ROEs, than a robot that just kills everything in or entering that particular zone?

The same is true of humans which is why most of us are supervised while working. Machines are just the same.
Read above.

Trivial engineering issues once identified. Trivial.
It was identified (otherwise you wouldn't be hearing of it), but it still hasn't been fixed by the US Army or its contractors, at least not to my knowledge.

Also, a very false assumption. They are out there.

-DA
So the US Army is using killer bots in Iraq and elsewhere now? On the battlefield, gunning down combatants?

UAVs don't count.

IIRC, the US Army (and possibly also the USMC) has them but doesn't use them, due to the glitches encountered with ordnance-disposal bots.
 

DarthAmerica

Defense Professional
Verified Defense Pro
I wasn't claiming the moral high ground. I can't, seeing as Egyptians aren't even militarily participating in the Palestinian-Israeli conflict.

If you think I meant to say that the Palestinians are any better, you're wrong. I just said the IDF can't claim to be moral either.
I didn't think you were saying Palestinians are better. And I agree Israel is no more moral or justified than the Arabs they fight with. To me, both sides have grievances, have mutually decided to resolve them through combat and neither side is perfect in execution. $hit happens.


I think you misunderstood what I said.

Wouldn't it be safer to have a person determine what to shoot, through a screen with sensor imagery and a set of ROEs, than a robot that just kills everything in or entering that particular zone?
No, it would not be safer. Drive through a USMC-controlled AO at night, blacked out, and see for yourself...;) Humans are no different from machines at the basic level: both use predetermined logic to make decisions. However, humans are vulnerable to emotions such as fear and anger, so either can "malfunction", and humans have done so repeatedly. The thing with a machine is that it may malfunction once, but after the problem is discovered you may never see it again once it's patched. The machines instantly get better, while humans have to be individually taught the ROE. The difference is multiple points of failure vs. a single point of failure.

With regard to "just kills everything in or entering that particular zone", that sounds an awful lot like what landmines and IEDs do. That certainly hasn't stopped their use! People, in my opinion, have Terminator phobia when it comes to machines. The reality is that it would take an awful lot for things to get that out of hand. Machines will only do what they are told, not what they want, because they aren't able to want things. Humans, on the other hand, are by far more dangerous and unpredictable.
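The supervised-machine setup being argued back and forth here can be sketched in a few lines of code. This is purely illustrative; the class names, the classification labels, and the confirmation callback are all hypothetical, not any real sentry system. The point it demonstrates is the single-point-of-failure idea: the machine applies a fixed ROE filter, but engagement authority stays with the human operator.

```python
# Illustrative sketch of a human-in-the-loop "auto kill zone":
# the machine filters detections against fixed ROE, but may only
# recommend; a human operator makes the final engage/hold call.
# All names and labels here are hypothetical.

from dataclasses import dataclass


@dataclass
class Detection:
    track_id: int
    classified_as: str   # e.g. "armed_person", "person", "animal"
    inside_zone: bool


def roe_allows_engagement(d: Detection) -> bool:
    """Machine-side ROE filter: only armed persons inside the zone pass."""
    return d.inside_zone and d.classified_as == "armed_person"


def request_engagement(d: Detection, operator_confirms) -> str:
    """Human-in-the-loop gate: the machine can only recommend."""
    if not roe_allows_engagement(d):
        return "ignore"
    # Final authority stays with the human operator.
    return "engage" if operator_confirms(d) else "hold"


# Example: the ROE filter flags the track, but the operator declines.
d = Detection(track_id=7, classified_as="armed_person", inside_zone=True)
print(request_engagement(d, operator_confirms=lambda det: False))  # hold
```

Under this model, a bad ROE filter is one patch away from being fixed fleet-wide, whereas (as argued above) every human gunner has to be trained on the ROE individually.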

It was identified (otherwise you wouldn't be hearing of it), but it still hasn't been fixed by the US Army or its contractors, at least not to my knowledge.
Those systems aren't the only machines out there. They just got the attention. There are others that are less public...

So the US Army is using killer bots in Iraq and else where now? On the battlefield, gunning down combatants?
Yes.

UAVs don't count.
I don't see why they wouldn't. But I'm not talking about UAVs.

IIRC, the US Army (and possibly also the USMC) has them but doesn't use them, due to the glitches encountered with ordnance-disposal bots.
Glitches are nothing more than the machine equivalent of you tripping and falling, except that it will only happen once to a machine. Again, trivial.


-DA
 