It's weird to pick one point in the abstraction chain of human technological progress and draw the line of unethical behavior there.
Technological progress will guarantee this scenario of military use. If not BD, it will be another company. If not now, it will be later. If not the US, it will be someone else - China, Russia, etc.
I don't see any point in feeling sad about some geeks making robots, any more than we felt sad when the transistor was invented. The future is in our hands, and tech has always been at the center of military warfare. The focus should be on government policy, electing good leaders, and engaging in diplomacy over warfare.
I think it’s important to talk about the concerns around military robots well before they reach deployment, so the public is primed to criticize leaders who would claim their use is justified. When should we talk about it, if not on the release of amazing new capabilities sure to catch the eye of military officials?
Ah, the "if not us, it will be someone else" argument.
Let's substitute a few other things to show how flawed this logic is:
"If we don't enslave these people, someone else will"
"If we don't bomb these schools, by golly, someone else will!"
The hypothetical existence of others willing to act immorally simply isn't a valid ethical argument for acting immorally yourself.
In the same way that the response there is "how about instead nobody bombs the schools," the response here is "how about instead nobody builds armies of killer robots."
"no death cyborgs" sounds like a pretty easy ask for humanity. This should be well within reach.
That’s not even a remotely close analogy, no offense meant. Robots can be used for many good purposes. Intel makes processors; they can end up riding on a missile or inside a CT scanner. Bombing schools is only ever horrific; how do you come up with this comparison?
I kind of wish we could just enjoy this stupid video of robots dancing. It’s getting tiring to fend off the AI-doomsday crowd.
The statement I responded to specifically focused on military application
"Technological progress will gaurantee this scenario of >> military use << . If not BD, it will be another company. If not now, it will. >> If not the US, it will be someone else << "
This argument is
"Technological progress will (always) lead to military use. If not us, it will be someone else"
or:
"The future is guaranteed to be full of war because of decisions made by humans that they somehow have no agency over. Because of this false premise, we need to be eagerly building weapons as fast as possible"
It's a classic argument and it stands up to no scrutiny whatsoever.
So instead the standard response is to deflect by generalizing: "this is just general forward-motion progress" - which is exactly what happened here.
That's not the point. The point is the idea that the fundamental, inescapable nature of humans is to be as violent and brutal as possible - which is not true. It's a choice, an act of agency, a matter of policy, a decision that is freely made.
Just reaching the simple realization that barbaric, brutal self-destruction takes choices, planning, and intentional action - actions we can simply choose not to take - would be a groundbreaking epiphany for most people.
I'm pretty confident I'm going to die without murdering anyone, just like almost every human who has ever lived. Murder isn't natural in the slightest.