They're members of a religion which says that if you do math in your head the right way you'll be correct about everything, and so they think they're correct about everything.

They also believe everyone has an IQ, which works like a DBZ power level; they believe anything they see that has math in it, and IQ is math, so they believe anything they see about IQ. So if you avoid trying to find out your own IQ, you can just believe it's really high and then you're good.

Unfortunately this led them to the conclusion that computers have a higher IQ than they do and so would automatically win any intellectual DBZ laser-beam fight against them / enslave them / take over the world.

If only I could +1 this more than once! I have occasionally learned valuable things from people in the rationalist community, but this overall lack of humility, and the strangely blinkered view of the humanities and of important topics like, say, the history of science relevant to "STEM", ultimately turned me off the movement as a whole. And I love science and math! It just shouldn't belong to people with this (imo) childish model of people, IQ, etc.


According to rationalists, humans don't work together, so you can't add up their individual intelligence to get more intelligence. Meanwhile, building a single giant super-AI is technologically feasible, so they weigh the intelligence of a single person against that of all AIs operating as a collective hivemind.


> According to rationalists, humans don't work together, so you can't add up their individual intelligence to get more intelligence.

An actual argument would be that intelligence doesn't work like that. Two people with IQ 100 cooperating do not produce an IQ 200 solution.

There is the "wisdom of crowds" (formally, the Condorcet jury theorem): if each member of a group is independently more than 50% likely to be correct on a yes/no question, the majority vote of the group is more likely to be correct than any individual member, and the advantage grows with group size. But that rests on a few assumptions, for example that each member tries to figure things out independently (as opposed to everyone waiting for the highest-status member to express their opinion and then agreeing with it; in that case the entire group is only as smart as its highest-status member).

But you cannot leverage this by simply inviting 1000 random people into your group and asking them to invent a Theory of Everything, because the assumption that each member is more than 50% likely to be correct does not hold there. That is one of the limits of people working together.

(And this already conveniently ignores many other real-life problems, such as conflicts of interest, etc.)
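
To make the crowd math concrete, here is a minimal simulation (my own sketch, not anyone's published model; it assumes independent voters answering a single yes/no question, i.e. the jury-theorem setting, with made-up accuracies):

    import random

    # Condorcet jury theorem sketch: n independent voters each answer a
    # binary question correctly with probability p; the group answers by
    # strict majority vote. Use odd n to avoid ties.
    def majority_accuracy(n, p, trials=100_000):
        correct_majorities = 0
        for _ in range(trials):
            correct_votes = sum(random.random() < p for _ in range(n))
            if correct_votes > n / 2:
                correct_majorities += 1
        return correct_majorities / trials

    print(majority_accuracy(1, 0.55))    # ~0.55: a single voter
    print(majority_accuracy(101, 0.55))  # ~0.84: the crowd amplifies p > 0.5
    print(majority_accuracy(101, 0.45))  # ~0.16: the crowd also amplifies p < 0.5

The last line is the 1000-random-strangers failure mode: when individual accuracy is below 50%, majority vote makes the group more confidently wrong, not less.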
