I'm not affiliated with the rationalist community, but I always interpreted "Less Wrong" as word-play on how "being right" is an absolute binary: you can either be right, or not be right, while "being wrong" can cover a very large gradient.
I expect the community wanted to emphasize how people employing the specific kind of Bayesian iterative reasoning they were proselytizing would arrive at slightly lesser degrees of wrong than the other kinds that "normal" people would use.
If I'm right, your assertion wouldn't be totally inaccurate, but I think it might be missing the actual point.
> I always interpreted "Less Wrong" as word-play on how "being right" is an absolute binary
Specifically (AFAIK) a reference to Asimov’s description[1] of the idea:
> [W]hen people thought the earth was flat, they were wrong. When people thought the earth was spherical, they were wrong. But if you think that thinking the earth is spherical is just as wrong as thinking the earth is flat, then your view is wronger than both of them put together.
"Less wrong" is a concept that has a lot of connotations that just automatically appear in your mind and help you. What you wrote "It's very telling that some of them went full "false modesty" by naming sites like "LessWrong", when you just know they actually mean "MoreRight"." isn't bad because of Asimov said so, or because you were unaware of a reference, but because it's just bad.
> I'm not affiliated with the rationalist community, but I always interpreted "Less Wrong" as word-play on how "being right" is an absolute binary: you can either be right, or not be right, while "being wrong" can cover a very large gradient.
I know that's what they mean at the surface level, but you just know it comes with a high degree of smugness and false modesty. "I only know that I know nothing" -- maybe, but they ain't no modern day Socrates, they are just a bunch of nerds going online with their thoughts.
Sometimes people enjoy being clever not because they want to rub it in your face that you're not, but because it's fun. I usually try not to take it personally when I don't get the joke and strive to do better next time.
Very rational of you, but that's the problem with the whole system.
If you want to avoid thinking you're right all the time, it doesn't help to be clever and say the logical opposite. "Rationally" it should work, but it's bad because you're still thinking about it! It's like the "don't think of a pink elephant" thing.
>If you want to avoid thinking you're right all the time, it doesn't help to be clever and say the logical opposite.
I don't understand how this is supposed to be relevant here. You seem to be falsely accusing me of doing such a thing, or of being motivated by simple contrarianism.
Again, your claim was:
> but you just know it comes with a high degree of smugness and false modesty
Why should I "just know" any such thing? What is your reason for "just knowing" it? It comes across that you have simply decided to assume the worst of people that you don't understand.
I don't think I'm more clever than the average person, nor have I made this my identity or created a whole tribe around it, nor do I attend nor host conferences around my cleverness, rationality, or weird sexual fetishes.
Rationalism is not about trying to be clever; it's very much about trying to be a little less wrong. Most people are not even trying, which includes myself. I don't write down my predictions, I don't keep a list of my errors. I just show up to work like everyone else and don't worry about it.
I really don't understand all the claims that they are intellectually smug and overconfident when they are the one group of people trying to do better. It really seems like all the hatred is aimed at the hubris of even trying to do better.