> (Longtermism is the belief that we should discount short-term harms to real existing human beings—such as human-induced climate change—if it brings us closer to the goal of colonizing the universe, because the needs of trillions of future people who don't actually exist yet obviously outweigh the needs of today's global poor.)
> Finally, I haven't really described Rationalism. It's a rather weird internet-mediated cult that has congealed around philosopher of AI Eliezer Yudkowsky over the past decade or so. Yudkowsky has taken on board the idea of the AI Singularity—that we will achieve human-equivalent intelligence in a can, [...] and terrified himself with visions of paperclip maximizers, AIs programmed to turn the entire universe into paperclips [...] with maximum efficiency.
Here's your shipment of cheap strawmen, where do you want them delivered?
Agree that the description of Rationalism is a little dramatic, but if anything, the description of Longtermism is generous. As a movement it's a lot closer to religion than philosophy.