Hacker News

I like the way you challenge the framing, and I agree: the right first question is "should we develop lethal autonomous weapons at all, and if so, what kinds are acceptable, and where is the limit?"

The way it's asked looks like an attempt to shift the Overton window until autonomous weapons of all kinds are treated as a mundane inevitability not worth worrying about, with only the niggling details subject to ethical questioning.

But big shifts like that are exactly the sort of thing serious ethical codes should be used to watch for, not the niggling details afterwards.



Well, for one, autonomous weapons are already here. Landmines, for example. Even back in the stone age there were snares, meaning /rope/ was an autonomous weapon.

There is no human in the loop (no pun intended, in the case of snares). The weapon decides when to strike using physics, and the answer is always "yes" once it is triggered.

What makes the new "autonomous" weapons different is that they attempt target differentiation. Mobility becomes useful precisely when a weapons system can say "no" to a presented target. Even the military-industrial complex, purveyor of unneeded bullshit that wantonly takes lives, would find it impossible to sell a drone that fires missiles at every target it encounters after launch.





