I feel that people become too entrenched in the way things are (people need jobs to make money to live) and lose sight of the bigger picture: machines doing the work that humans have to do now should be a good thing. That it would not be, because in the current system it would result in a small number of people having great power and wealth while the majority of people have little, means the system should change, not that we must not develop the technology.
If you think that the development of AI to take people's jobs and concentrate power is coming and is bad, then you should want to change the system, because that is what the system is encouraging. If you think that the development of AI to do people's jobs for them and unburden humanity is coming and good, then you should want to change the system because it is not set up to gracefully handle massive unemployment due to automated efficiency gains.
If you think this whole AI thing is a bit of a nothingburger and not going to have the broad impacts that are being speculated, well carry on then.
The curiosity of humans and drive to create new things and uncover new knowledge is universal, and stronger than any society or culture has proven to be. Technologies destroy societies that don't adapt to them, societies don't destroy technologies that they don't like.
People view handouts as "socialism and bad", even UBI.
Under the current system, in order to get money, you have to do work so useful to some client that they will pay you for it. Half of all Americans work for corporations. They don't want to be out in the market trying to sell their services. They want stability so they can feed their families. And women want men to have a stable career, etc.
The cascading effect of people who get laid off and are told "learn to X, LOL" will overwhelm X; it's like rats fleeing a sinking ship.
Geoffrey Hinton said it the other day -- your utopian vision should help people, but we live under capitalism. So it will do the opposite.
And any historical analogies to what humans did in the past to adapt to challenges and competition are not really applicable, because now AI will be far smarter than humans, and better at organizing. And it will be deployed by governments and corporations, which already have most of the power. Individual humans adapting could be as quaint as, say, horses adapting when cars were invented, or oxen adapting when tractors and combines were invented. The adaptation was to breed fewer horses and oxen. How's the horse population doing today?
I'm not making any analogies to the capabilities of previous technology. I believe it's going to seriously shake things up. I'm saying that the correct response to seeing such power on the horizon is to prepare society to harness it, so that as many benefit from it as possible, rather than be destroyed by it. There is no evidence that people are capable of stopping the technology train, it has never happened.
I am saying that if you worry about the dangers of AI, the rational course of action is to spend your efforts orienting society to have the best chance of benefiting from it, rather than spending your effort trying to prevent its development.
Now you've opened the "comparison to previous technologies" can, not me.
For chemical and nuclear weapons, it makes more sense that they are restricted, because they are explicitly weapons. They have no benefit in themselves. The technology that underpins nuclear weapons does have benefits, and there are many nuclear reactors in the world. I don't know very much about chemical weapons, but I am guessing that the same chemistry discoveries that enabled chemical weapons have also gone into making useful chemicals or medicines.
For CFCs, we realized the negative impacts and successfully coordinated internationally to stop using them. This happened after it became clear that the real harms actively occurring were not worth the benefits.