Yeah, you go off the rails around step 5. "Something being much more intelligent than us means that, in effect, it has almost absolute power over what happens in the world" makes no sense. Since when does intelligence get you power? Are the smartest people you know also in positions of power? Are the most powerful people all highly intelligent?

"whatever the goals of the AI will be, it will achieve them". Dude, if intelligence meant you could achieve your goals, Hacker News would be a much less whiny place.



"Since when does intelligence get you power?" You hit the nail on the head there. Its about I/O. (Just as its about I/O in the original article - garbage in, garbage out). Jaron Lanier makes this point in.

http://edge.org/conversation/jaron_lanier-the-myth-of-ai

"This notion of attacking the problem on the level of some sort of autonomy algorithm, instead of on the actuator level is totally misdirected. This is where it becomes a policy issue. The sad fact is that, as a society, we have to do something to not have little killer drones proliferate. And maybe that problem will never take place anyway. What we don't have to worry about is the AI algorithm running them, because that's speculative. There isn't an AI algorithm that's good enough to do that for the time being. An equivalent problem can come about, whether or not the AI algorithm happens. In a sense, it's a massive misdirection."


As I've said before, the singularity theorists seem to sit somewhere between computer scientists, who think in terms of software, and philosophers, who think in terms of mindware, and they tend to forget about hardware entirely.

There seems to be a leap from 'superintelligent AI' to 'omnipotent, omniscient deity' that is accepted as inevitable by what I'm calling, for shorthand, the 'lesswrong' worldview. That leap ignores the limited resources, the limited energy, and the limits imposed by the laws of physics and information that stand between a superintelligent AI and the ability to actuate changes in the world.


You're not engaging with the claim as it was meant. In context, no human being has ever been "much more intelligent" than me, not in the way that I am "much more intelligent" than a monkey. That includes von Neumann.

You might decide that this means edanm goes off the rails at step four, instead. But you should at least understand where you disagree.


I'm still not sure you could assume ultimate power and achieve everything you desired if you were the only Hacker News reader on a planet of 8 billion monkeys.


> I'm still not sure you could assume ultimate power and achieve everything you desired if you were the only Hacker News reader on a planet of 8 billion monkeys.

I would think it relatively easy for a human armed with today's knowledge and a reasonable yet limited resource infrastructure (for comparison to the situation of an unguarded AI) to engineer the demise of primate competitors in the neighborhood. Setting some strategic fires and burning down jungles would be the first step. "Fire" might be a metaphor for some technology that an AI masters before humans quite have the hang of it, and that can be turned against them. For example, a significant portion of Americans seem far too easily manipulated by religion and fear; an AI-generated televangelist or Donald Trump figure is a frightening thought.



