
Not meaning to be too direct, but you are misinterpreting a lot about rationalists.

In my view, rationalists are often "Bayesian" in that they are constantly looking for updates to their models. Consider that the default approach for most humans is to believe a variety of things and to feel indignant if someone holds differing views (hence the adage "never discuss religion or politics"). If one adopts the perspective that one's own views might be wrong, one must find a balance between confidently acting on a belief and being open to having that belief overturned or debunked (by experience, by argument, etc.).
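(To make "updating" concrete, here is a toy Bayesian calculation with invented numbers -- purely illustrative, not something the community prescribes: if a piece of evidence is twice as likely when your belief is false as when it is true, your confidence should drop, but not collapse to zero.)

    # Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E)
    def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
        """Posterior probability of a hypothesis after seeing one piece of evidence."""
        numerator = p_evidence_if_true * prior
        return numerator / (numerator + p_evidence_if_false * (1 - prior))

    # Start 70% confident; the evidence is twice as likely if the belief is false.
    print(bayes_update(prior=0.7, p_evidence_if_true=0.2, p_evidence_if_false=0.4))
    # ~0.538 -- confidence drops meaningfully, but the belief isn't simply discarded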

Most rationalists I've met enjoy the process of updating or discarding beliefs in favor of ones they consider more correct. But to be fair to one's own prior attempts at rationality, one should try reasonably hard to defend one's current beliefs, so that if they must be replaced, they are replaced fully and soundly, leaving no doubt that they were insufficiently supported.

To many people (the kind of people who never discuss religion or politics) all this is very uncomfortable and reveals that rationalists are egotistical and lacking in humility. Nothing could be further from the truth. It takes tremendous humility to assume that one's own beliefs are quite possibly wrong. The very name of Eliezer's blog "Less Wrong" makes this humility quite clear. Scott Alexander is also very open with his priors and known biases / foci, and I view his writing as primarily focusing on big picture epistemological patterns that most people end up overlooking because most people are busy, etc.

One final note about the AI-dystopianism common among rationalists -- we really don't know yet what the outcome will be. I personally am a big fan of AI, but we as humans do not remotely understand the social/linguistic/memetic environment well enough to know for sure how AI will impact our society and culture. My guess is that it will amplify rather than mitigate differences in innate intelligence in humans, but that's a tangent.

I think to some, the rationalist movement feels like historical "logical positivist" movements that were reductionist and socially Darwinian. While it is obvious to me that the rationalist movement is nothing of the sort, some people view the word "rationalist" as itself full of the implication that self-proclaimed rationalists consider themselves superior at reasoning. In fact they simply employ a heuristic for assessing their own rationality over time and attempting to maximize it -- this includes listening to "gut feelings" and hunches, etc., in case you didn't realize.



My impression is that many rationalists enjoy believing that they update their beliefs, but in practice they're human and just as attached to preconceived notions as anyone else. But if you go around telling everyone that updating is your super-power, you're going to be a lot less humble about your own failures to do so.

If you want to see how human and tribal rationalists are, go criticize the movement as an outsider. Or try to write a mildly critical NYT piece about them and watch how they react.


Yes, I've never met anyone who stated they have "strong opinions, weakly held" who wasn't A) some kind of arsehole and B) lying.


I’ve met a few people who walked that walk without being assholes … to others. They tended to have a fairly intense amount of self criticism/self hatred, though. That was more palatable than ego, to be sure, but isn’t likely broadly applicable.


Out of how many such people that you have met?


Well, yeah, I think it's a pretty socially unaware thing to say about yourself out loud, so that's a pretty strong filter there.

It's rather different for a community to say that's a standard they aspire to, which is a far less grandstanding position, IMO.


Not to be too cynical here, but I would say the most apt description of the rationalists is that they are people who would say they are constantly looking for updates to their models, but who are not necessarily doing it appreciably more than anyone else. They will do it freely on unimportant things: they tend to be smart people who view the world intellectually, so they can toss or keep factual beliefs, of which they have many, with little fanfare, and sure, they get points for that. But they are as rooted in their moral beliefs as anybody else is. Maybe more so than other people, since they have such a strong intellectual edifice justifying not changing their minds, because they believe their beliefs follow from nearly irrefutable calculations.


You're generalizing that all self-proclaimed rationalists are hypocrites and heavily biased? I mean, regardless of whether or not that is true, what is the point of making such a broad generalization? Strange!


Um... because I think it's true and relevant? I'm describing a pattern I have observed over many years. It is of course my opinion (not a universal statement, just what I believe to be a common phenomenon).


It seems that you are conflating theoretical rationalists with the actual real-life rationalists that write stuff like

>The quantum physicist who’s always getting into arguments on the Internet, and who’s essentially always right

“Guy Who Is Always Right” as a role in a social group is a terrible target, yet it somehow seems like what rationalists are aiming for every time I read any of their blog posts


The rationalist community are fish in barrels convinced that they’re big fish in a small pond because that’s what those who moved them from the pond told them to convince them to enter the barrel. Once in the barrel, the fish are told that they will be moved to a big pond so that they can be big fish in a big pond together. If the fish/fishmonger telling you things is bigger than you, they may not share your preferences about where you fit in to the food chain, and they may not even perceive you at all. You are chum.



