To summarise, photosynthesis increases the amount of molecular oxygen in the atmosphere. It doesn't create oxygen atoms. Those come, in the recent geological era, from inside the Earth in the form of volcanic oxides. The paper suggests that something about the magnetic field influences the rate at which those oxides are belched into the atmosphere.
It is. But the Rationalists, by taking that name as a label, are claiming that they are what the GP said. They want the prestige/respect/audience that the word gets, without actually being that.
(The rationalists never took that label, it is falsely applied to them. The project is called rationality, not rationalism. Unfortunately, this is now so pervasive that there's no fixing it.)
Sure! Rationality is what Eliezer called his project of teaching people to reason better (more empirically, more probabilistically) in the events I described over here: https://news.ycombinator.com/item?id=44320919 .
I don't know rationalism too well, but I think it was a historical philosophical movement asserting that you could derive knowledge by reasoning from axioms rather than from observation.
The primary difference here is that rationality mostly teaches "use your reason to guide what to observe and how to react to observations" rather than doing away with observations altogether; it's basically an action loop alternating between observation and belief propagation.
A prototypical/mathematical example of a pure LessWrong-type "rational" reasoner is Hutter's AIXI (a definition of the "optimal" next step given an input tape and a goal), though it has certain known problems of self-referentiality. Of course, reasoning in this way does not work for humans; a large part of the Sequences is an attempt to port mathematically correct reasoning to human cognition.
You can kind of read it as a continuation of early-2000s internet atheism: instead of defining correct reasoning by enumerating incorrect logic, i.e. "fallacies", it attempts to construct it positively, by describing what to do rather than just what not to do.
Shortly: believing what is true, and choosing the actions that lead to the things you value.
If the sky is blue and you can verify that by looking at the sky, it is reasonable to believe that sky is blue, and it is unreasonable to believe that the sky is green just because some authority or your favorite political party said so. If you know that eating poison would kill you, and if you want to live a long life, it is reasonable to avoid the poison, and unreasonable to eat the poison.
These are separate skills, because many people know what to do and yet don't do it, or are good at following their beliefs, but the beliefs happen to be wrong. Studying rational beliefs is called "epistemology"; studying rational actions is called "decision theory".
In internet discussions, someone inevitably mentions the 17th-century definition of "rationalism", as opposed to "empiricism", as the only historically valid meaning of the word. But for example, the approach of Karl Popper is often called "critical rationalism", and although the Less Wrong philosophy is different from Popper's, it is closer to him than to the 17th-century "rationalists".
(The difference between Popper and Less Wrong in a nutshell: Popper treats arguments in favor of a theory and arguments against a theory as two fundamentally different things that follow different rules. Less Wrong treats all arguments the same way, using Bayes' theorem. In practice, the difference is smaller than it might seem, because Popper's main concern was to never treat a theory as 100% true, especially when there is evidence against it, and Less Wrong agrees that you should never treat any theory as 100% likely. The advantage of the Less Wrong approach is that you can also apply it to probabilistic theories. For example, one person says that a coin is fair; another person says that the coin is actually 55% likely to come up heads and 45% likely to come up tails. I have no idea how you would decide this problem from Popper's perspective, because any experimental result is technically compatible with both theories; it's just that the more coin flips you make, the less likely some of those theories become, but there is no clear line at which you should call one of them "falsified".)
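To make the coin example concrete, here is a small sketch (my own illustration, not from the original comment) of how a Bayesian would compare the two hypotheses: instead of "falsifying" either one, you track the ratio of how likely each hypothesis makes the observed flips.

```python
from math import prod

def posterior_odds(flips, p_fair=0.5, p_biased=0.55, prior_odds=1.0):
    """Odds of 'fair coin' vs '55% heads coin' after a sequence of
    flips ('H' or 'T'), by Bayes' theorem: posterior odds equal
    prior odds times the likelihood ratio."""
    def likelihood(p):
        # Probability this hypothesis assigns to the exact sequence.
        return prod(p if f == 'H' else 1 - p for f in flips)
    return prior_odds * likelihood(p_fair) / likelihood(p_biased)

# 60 heads in 100 flips shifts the odds toward the biased coin
# (odds below 1), but neither hypothesis is ever fully ruled out.
print(posterior_odds('H' * 60 + 'T' * 40))
```

Note how the evidence only ever shifts the odds gradually; there is no single flip count at which one hypothesis becomes "falsified", which is the point the comment makes against a pure Popperian framing here.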
I think this is the kind of thing where you simply can't make everyone happy, no matter what you do. As an analogy, imagine a world where every time you say e.g. "American president", at least three people in the thread remind you that actually Donald Trump is not the president of the entire continent, only of part of North America. So the next time you pay extra attention and carefully say "the president of the United States", and then everyone is like: "you mean the American president, why don't you speak simply?".
There is a community around Less Wrong. Whatever we call them, we should choose the name so that it is obvious that we refer to them, and not to the 17th century philosophers who believed that evidence is not necessary. Things are real, words are just labels.
At the beginning, it was just a few dozen people from different parts of the planet, reading the same blog. They referred to themselves as "aspiring rationalists". (As a label that applies to an individual who aspires to be more rational.) I would be okay with using this label, but apparently many people are too lazy to say "aspiring" all the time, and when you only say "rationalists", (1) it sounds smug, as if you believe that you already are perfectly rational rather than trying to become more so, and (2) inevitably, some outsider will mention the 17th-century philosophers. I think people used "Less Wrong community", "rationality community", "LessWrong-style rationalists", "x-rationalists", and maybe a few other words. But there was always pushback against using "rationalist" without any adjective.
As a long-time member of the community myself, I don't have a problem with anyone using any of these labels. I know what you mean, and I am not going to play dumb. It's the non-members who need to coordinate on a standard label for us, so that they are not confused about who they are talking about. And I am even okay with them choosing a label that we don't use. (I am doing the same to other groups, e.g. I say "Mormons" instead of "The Church of Jesus Christ of Latter-day Saints".) If the world decides to call us "rationalists", so be it... but then don't blame us for using the same label as the 17th century philosophers, because we don't. I think that "rationality community" or "Less Wrong community" are both nice options. (Just please don't call us TESCREAL because that's a crazy conspiracy theory.)
It's not "unreasonable" if you weren't anthropomorphizing CoT, equating it to thinking or "internal dialogue." The results aren't surprising to people in this camp, but I also wouldn't say that makes the work less impactful.
But it would be even more unreasonable to dismiss the fact that a significant portion of the research community (and an even greater portion of the public) was operating under these beliefs: that CoT was akin to thinking (it's literally there in the name...). It is possible to disagree with something while also not believing someone is being unreasonable for coming to different conclusions.
Trees work as shelter as long as there is water for the trees (evaporation cools, the same way sweating does). With increasing heat stress there is not enough water, and the trees will simply die. Or burn.
What is the weight of one of these earbuds? I guess too much weight can easily give you headaches. On the other hand, it seems from the pics that further miniaturization is possible.
My boss actually switched to wearing them all day, dogfooding the thing (disclosure: I work in the same lab). For a long time a downside was the audio, but the new version has a really nice driver and amazing DSP optimizations. My colleagues also conducted wearability studies, and I can confirm (having participated) that they are really wearable. I don't have the numbers, but from my impression they are not really heavier than other in-ears.
I see "Ear Canal Pressure Sensor" is listed on the website.
I know that NASA was doing some research on using the tympanic membrane to measure intracranial pressure. Ask around, please, and find out if anyone has considered that application. We really do need a better way to detect cerebrospinal fluid leaks (CSF leaks). A CSF leak was a significant contributor to my late wife's death.
I will forward that to my colleagues, who are constantly looking for new applications (particularly in health tech). The sensor has existed since Toby's paper on measuring tensor tympani muscle activity [0].