When you look at the many different potential sources of "Fucked... goodbye...", the trend is very clear. We're juggling apocalypses and just waiting to find one that will stick. Will it be ocean acidification first? Ocean levels rising? Catastrophic weather patterns? Famine? Drought? War touched off by rapid change? Mass extinctions?
They're all in play, each with the potential to end us.
Is this why we have not yet found signs of intelligent life in the universe? Every time a civilization becomes advanced enough to really screw up its planet, inevitably, it does? Fermi Paradox redux.
At least for our class of life: organic. Organic life evolved naturally, and so all intelligent organic life lives on top of a pile of stored chemical energy (oil / coal / methane gas).
Organic life evolves through competition, and so when that chemical energy is discovered it is rapidly used - most often faster than the intelligent life can figure out what a truly bad idea that is.
Once they realize, it's too late. Runaway climate change is occurring. And this ends the intelligent life.
It's not a particularly good candidate, I would think. For example, we know what's happening and could theoretically have stopped it, but didn't/don't have the will.
The Fermi paradox requires it to be extremely unlikely to survive, but you could imagine scenarios on other worlds where the right kind of political situation was around to stop it.
The problem is that any race gets increasingly powerful as technology and science improve, so it takes a smaller and smaller mistake to end us all. Also, the population of those who are able to end us increases. Right now very few people have the ability to save or destroy us. One day every human might be able to, unless we hand the keys to the robots and we aren't certain how that'll play out.
Edit: and we affect our environment more, which is always a battle with our level of understanding to manage that. Right now it's the climate. One day it could be the sun.
>Right now very few people have the ability to save or destroy us. One day every human might be able to, unless we hand the keys to the robots and we aren't certain how that'll play out.
Ironically one of the more likely ways an individual could end humanity (in the future) is via the haphazard or malicious creation of said robots.
If all it takes is someone executing a bootstrapping process on a large number of computers, we're in trouble. There's no 100% reliable way to prevent that short of what you said: handing over the keys. It would require the creation of an entity so omniscient and powerful that it may as well be considered a god.
Another way to look at it is the creation of an operating system kernel for humanity itself, with each individual human being akin to a user-land process.
> but you could imagine scenarios on other worlds where the right kind of political situation was around to stop it.
I can't imagine that. The only evidence I have to date (from a sample size of 1 planet) is that there is next to zero real political will, even if the politicians of the world create additional heat saying they have in fact found the political will.
Massive changes do happen. But the changes come through economics and culture, not through politics. Culture guides a civilization's long-term behaviours, economics plays a key part in what resources it has to use, and politics generally only shapes the short-term effects.
I agree, it's unlikely a massive change can happen through a political shift (which generally happens only in revolutions and civil wars). A cultural change, however, is completely feasible.
If Facebook, for instance, turned into a zero-carbon marketing machine, I'm pretty sure they could convince the planet to shift gears in consumption. Is there any plausible scenario where they would do this? I suppose if the global catastrophe could be computed as certainly as the trajectory of an asteroid about to hit Earth - maybe.
I feel like your imaginative capacity could stand to be exercised more. There are hundreds, if not thousands, of ways in which intelligent entities could meaningfully differ from us, and which would lead to different political circumstances.
And we ourselves are not that far from being able to deal reasonably with problems like this. If nothing else were different about us, except that average IQ were, say, 120 on the current scale, I think we'd be able to respond sensibly to non-immediate threats.
There is no reason to believe that intelligence, in whatever form it takes, ever stops being energetically expensive. Is it possible to imagine selection pressures that would select for high intelligence over other factors? How harsh can an environment get, or how easy, before something like brute strength or rapid breeding is always the cheaper strategy?
Some rules are universal, because they're thermodynamic. That's not to say that your species of intelligent individuals is impossible, but it is unlikely. When you really take the time to think through these scenarios, you often find that there is no justification to assume that exceptional intelligence would evolve.
We already know that thermodynamics and evolutionary pressures can allow for intelligent beings with a population average IQ of 100 (on the arbitrary scale we've devised for ourselves) to exist, since we humans do exist. I'm suggesting that a slight increase in that population average might be enough to make a large difference in how we collectively respond to mid- to long-term problems.
If you're arguing against that suggestion, it seems you would have to be arguing that human intelligence exists at a level that is nestled directly up against some kind of hard thermodynamic limit, even though a non-trivial portion of our species already exists on the other side of that line. I don't see how that argument could plausibly be made.
You could liken our situation to yeast in a bottle of wine. Given that comparison, what good would an average 20 IQ points do? I don't see how any species could avoid evolving for conditions of scarcity, and I don't see how sudden abundance wouldn't lead down the same path we're on. If anything, smarter would be worse.
But, like you say, we don't know we're screwed, so it's a lot of supposition.
Yeast can't make metaphors, can't understand that they're in a fermenting bottle with limited food and increasing amounts of excreted toxins, and couldn't imagine or implement plans to do anything about that situation even if they could comprehend it. Humans can do all of those things - except, so far, muster the collective political will to take meaningful steps. So I don't know in what sense it would be useful to make that analogy.
If you don't think 20 IQ points make a significant difference, consider some questions: do you believe that the average IQ 100 person is more or less likely than the average IQ 120 person to be concerned about global climate change? What about IQ 100 vs. IQ 80? If IQ numbers have no tangible meaning to you, do you expect the average Walmart shopper to be more or less concerned about climate change than the average graduate student (in any field at all, your choice: agricultural engineering just as much as climatology or gender studies)?
Sorry, but you conflate IQ with individual values that are acquired mainly through culture and education.
A higher IQ does not automatically mean that a person will care about global problems that have no immediate negative effect on their own life, affect everyone's wellbeing in the long run, and are hard to solve. Nor does it mean that a society with a higher average IQ will magically create political and economic structures that reward working towards the common good instead of rent-seeking.
Case in point - Silicon Valley probably has an average IQ around 120-130. How many truly planet-saving innovations have been created there? I mean, you fail to eradicate poverty in SF, your own backyard. Because why bother, if creating ever new and exciting ways to share cat pictures is infinitely more personally rewarding in the short run, and who gives a f about anything else?
So, my point is, higher IQ is not a solution in and of itself, since human nature - the instinct to care about oneself first and foremost - will still be present. This instinct can only be tamed by education, culture, and political and social structures that promote the common good. The chances that a society will evolve these kinds of structures are a toss-up. It is a lot more likely that the outcome will be exactly the same as what we see now - much like adding a 1.2 multiplier to all stats for both mobs and PCs in a computer game: the numbers are different, but it is still the same game.
The key is culture, and education, and conscious work towards better social, economic and political structures. But I am quite sure that we will all kill ourselves a long time before any of that will happen.
I'm not conflating IQ with values, I'm arguing that as average IQ goes up, the percentage of the population who will recognize that certain problems of non-immediate import will inevitably become immediate, personal problems, will also go up.
I agree with you that the mechanisms through which greater awareness and active concern for seemingly abstract problems will trickle into the global zeitgeist are things like education, culture, and social/political structures. But where do those come from, and what allows people to perceive their value and meaning? You can't teach people things that you don't understand, and they can't learn things that they can't understand. Better cultural and social structures are a result of better understanding; better understanding is more common with brains that are better able to perceive patterns and project consequences.
I see your point. I would argue, however, that very few of the issues we are facing at the moment stem from lack of understanding. We have most of the knowledge we require to instantiate a radically better-off civilisation, and, most importantly, we have the tools to continue gathering and applying said knowledge and iterating on said civilisation.
But we choose not to. There are loads of VERY smart people in the top positions of gigantic multinationals. They do not lack for IQ, or for information that without a shadow of doubt informs them about the long-term consequences of their actions. They choose, daily, to apply their IQ and information not to create solutions that would benefit humanity as a whole, but to maximize the short-term gains for themselves and (hopefully) their shareholders. The reasons for this behaviour are multitudinous and complex, but I would argue that lack of IQ is not a major contributing factor, quite the opposite.
And all that could still fall in the category of "fine, whatever", if in their greed they would not actively destroy the education system already in place, because people who lack basic knowledge and tools for critical thinking are that much easier to manipulate. Again, it is not the lack of IQ that is the issue, it is the knowledge and exercise it has been exposed to. I myself have an IQ well above average (Mensa member), but before I found out about heuristics, biases and critical thinking, and learned to apply them, I was an easy prey for "mediums", "spiritual guides" and other "new age" crap.
TL;DR people are easy to manipulate regardless of their IQ, if they haven't had an opportunity to learn to use it.
> I would argue, however, that very few of the issues we are facing at the moment stem from lack of understanding.
My 11-year-old nephew sprained his ankle a few weeks ago playing basketball, badly enough that he started walking in an odd way to compensate for the discomfort. He kept playing sports, including soccer on a rough grass field, every day at school recess, despite us having had several talks about how he needs to stay off it and let it recover or else he risks lasting damage. He nods and says, "I know" when I tell him that, but he keeps doing it. I think we are collectively much the same: we 'know', but we don't really believe that the abstract, distant bad thing will ever really happen to us. That is not understanding. That is lip service.
My argument isn't that IQ makes every individual behave better - it is the statistical argument that if the population average went up 20 points (and that number is one I just threw out, but I do think it's close), then we might cross a tipping point where there would then be enough people who take abstract, distant bad things seriously enough to take meaningful action, to form a significantly politically effective bloc.
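The statistical claim here can be sanity-checked with a quick sketch. This assumes IQ is roughly normal with SD 15, and the 130 cutoff is purely illustrative (a stand-in for "reliably treats long-term threats as real", not a measured psychometric fact):

```python
import math

def fraction_above(threshold, mean, sd=15.0):
    """Fraction of a normal(mean, sd) population scoring above threshold."""
    z = (threshold - mean) / sd
    # Survival function of the standard normal, via the complementary error function
    return 0.5 * math.erfc(z / math.sqrt(2))

# Illustrative cutoff: suppose only people around 130+ reliably take
# abstract, distant threats seriously (an assumption for the sketch).
for mean in (100, 120):
    print(f"mean {mean}: {fraction_above(130, mean):.1%} of the population above 130")
```

The point of the sketch is that shifting the mean by 20 points multiplies the tail above any fixed cutoff roughly tenfold, which is the sense in which a modest average shift could cross a collective tipping point.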
> Yeast can't make metaphors, can't understand that they're in a fermenting bottle with limited food and increasing amounts of excreted toxins, and couldn't imagine or implement plans to do anything about that situation even if they could comprehend it. Humans can do all of those things - except, so far, muster the collective political will to take meaningful steps.
You mean the thing that allows any of those other things you listed to matter at all?
And with all those abilities, it may make no difference, so there's no evidence that the difference in intelligence between yeast and humans is sufficient to save us. Then why would 20 more points do it? And if yeast survive but people don't, what then? The point is that in both the yeast-in-a-bottle case and the human case, our drive for survival kills us when we move into sudden abundance. Greater average intelligence would just bring that abundance on faster.
Since we can't seem to steer this ship, we're at the mercy of the currents. Despite all our metaphors and plans and concern about global climate change, we don't seem able to do anything. Is there much of a difference in carbon footprint between your average Walmart shopper and your average graduate student, concern notwithstanding?
Bridges require foundations, pier columns, truss cables, and roadways in order to function. Each is necessary to the function of the whole; none has priority, and it doesn't make sense to single out any one of them as the thing that allows any of the other things to matter at all. In particular, it doesn't make sense to point at a bridge under construction that doesn't have the final piece in place yet, sneer at it because it doesn't have that piece, and suggest that it's pretty much exactly the same as no bridge at all.
20 points on average might push enough people from "unable to genuinely perceive long-term threats as threats" to "able to genuinely perceive long-term threats as threats", that it would move us past the political tipping point of collectively caring enough about doing something meaningful to both figure out what to do (devote money and resources to research) and to do it.
Individual carbon footprints are not meaningful - it doesn't matter how well an individual paddles their canoe when that canoe is in a swimming pool on a cruise liner. What matters is how the liner is piloted. If all people suddenly disappeared off the earth, except for those people who happened to be standing in Walmarts at the time, do you think that society would be better or worse at recognizing and handling abstract, long-term threats like global warming than the society composed only of graduate students would be?
Because we are not unified and working towards the same goal. Tribalism is in our roots, and it will be our end. That is the problem with too many sentient beings - everyone believes they deserve something. Thus no intelligent life, unless they are a single entity.
Competition is not the only response to being one entity among many, and it is not even necessarily the most reasonable response. Cooperation is also possible, and it is one of the necessary elements in every human organization, from the family to global cultures. We have civilization because we are to some degree able to cooperate, and if you simply turn up the dial on that trait, it is easy to imagine that other creatures could exist who were even better cooperators - and who would be collectively even more successful because of it.
Why would such a cooperative require intelligence, never mind high intelligence? We already see such vast cooperatives in fungi, but they have no selection pressures for intelligence. Far from it, competition is the driver that made us "smart".
I'm not saying that cooperative behavior requires intelligence. I'm saying that cooperative behavior is a reasonable and effective choice for creatures intellectually capable of choosing between behavioral strategies, no matter what the evolutionary forces were that developed them to the point of being intelligent.
What if it's not lack of will but game theory? Maybe it's actually impossible for a large complex system to act against these trends because there exists no stable game strategy for the kind of large-scale cooperation that is necessary?
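The "no stable strategy" intuition can be made concrete with a toy public-goods game, a standard model from game theory (the parameters below are arbitrary, chosen only for illustration):

```python
def payoff(my_move, others_cooperating, n=1000, r=2.0, cost=1.0):
    """One round of an n-player public-goods game: every cooperator pays
    `cost` into a pool, the pool is multiplied by r and split among all n."""
    contributors = others_cooperating + (1 if my_move == "C" else 0)
    share = contributors * cost * r / n
    return share - (cost if my_move == "C" else 0.0)

# Whatever the other 999 players do, defecting pays more (whenever r < n),
# so universal defection is the only stable outcome even though universal
# cooperation would leave every player better off.
for others in (0, 500, 999):
    gain = payoff("D", others) - payoff("C", others)
    print(f"{others} others cooperating: defection gains {gain:.3f}")
```

Defection strictly dominates here regardless of what anyone else does, which is exactly the structural problem: no individual deviation towards cooperation is ever rewarded.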
* The Global Polio Eradication Initiative
* The Montreal Protocol on Substances that Deplete the Ozone Layer / Vienna Convention for the Protection of the Ozone Layer

Both are examples of multi-state cooperation that dealt with serious world-spanning issues. Polio eradication has had the most immediate impact, but ozone layer depletion has also been checked, and the numbers will trend down to 1980s levels over the next 20 years.
I don't think polio eradication is a good counter-example. There were no strong economic forces advocating for the status quo; the iron lung industry was quite small.
There are still polio cases today but pretty much only in areas ruled by death cults who don't care whether their people live or die. So game theory isn't a factor there.
It is necessary to note, however, that the great filter does not necessarily occur when a civilization achieves some arbitrary technological level.
In fact, it could be at the point of single cell --> multi-cellular organisms or at the point of sexual dimorphism or developing general intelligence and ability to use tools, etc.
I guess it's also possible there are multiple Great Filters...
That's a good point, and you're right of course. In fact, I take some hopeful comfort in the notion that while the leap from organic molecules to single cells might not have been too big, there are some indications that going to multicellular life was a leap that took quite some time. We still have reasons to hope, but we're doing almost all we can to extinguish that hope.
Also, another problem that a civilization can easily run into - and that we did as well, besides climate change - is building its world on non-renewable resources. Easy, abundant access to oil was essential to kickstart our current society, since extracting it demands only low-tech solutions. Could humanity do the same with the remaining resources if we were forced to start over?
I agree. This is where the bottleneck is. What really hurts for me is knowing just how close we are but how many are willfully choosing to stay in the dark. We really could make the planet habitable for humans for at least a few more centuries while we become multiplanetary but inertia is our biggest obstacle right now, not lack of technology or information.
No. We have not found any signs because the Universe is very, very, very big, life is very, very rare, and signal speed is finite.
Just look up a visualisation of how far our own radio signals have spread since the first were transmitted — they cover just a tiny volume of space.
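A back-of-the-envelope calculation shows just how tiny that bubble is, even relative to our own galaxy (all figures below are rough round numbers, not precise measurements):

```python
import math

years_broadcasting = 120            # roughly since the early radio era
galaxy_radius_ly = 50_000           # Milky Way disc radius, light-years
galaxy_thickness_ly = 1_000         # Milky Way disc thickness, light-years

# Our signals fill a sphere whose radius is the years we've been broadcasting.
bubble = (4 / 3) * math.pi * years_broadcasting ** 3            # ly^3
# Model the galaxy as a flat cylinder (disc).
galaxy = math.pi * galaxy_radius_ly ** 2 * galaxy_thickness_ly  # ly^3

print(f"fraction of the galaxy our signals have reached: {bubble / galaxy:.1e}")
```

With these numbers the bubble covers on the order of a millionth of the galactic disc, before even considering signal attenuation or the rest of the observable universe.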
How would any of these things end us? We've had World War II. That didn't make a dent in the population.
Famine and drought would be localized, so we can use international trade to smooth them over, just as we currently do. For long-term failures, there will always be other arable places to grow crops; they just might be fewer or less convenient than they currently are, and they'll change at an easy, human-scale speed. China has a 300,000,000-strong migrant worker population. That's the equivalent of the entire US population migrating every year!
We currently produce far more food than we need, so even losing a big part of production capacity will still leave us with enough to eat, but perhaps with less meat or seafood for poor people.
I think the point the parent is making is that we have never witnessed the likes of something like this, and that our system is more fragile than we think. Previous wars and other events have all been leading up to "the big one". We can see it beginning to happen in what you mentioned above, or in smaller instances.
Also, saying that poor people can just not eat meat or seafood seems a bit distasteful; not that you're necessarily wrong.
>>Previous wars and other events have all been leading up to "the big one".
WW2 was quite dangerous in the sense that it put a very serious problem in front of us. Say you nuke a nation, and the nation refuses to surrender. The issue is you need to keep nuking until you finish them off. That will finish off not just your enemies but a lot of the planet along the way.
> Moreover, the enemy has begun to employ a new and most cruel bomb, the power of which to do damage is, indeed, incalculable, taking the toll of many innocent lives. Should we continue to fight, not only would it result in an ultimate collapse and obliteration of the Japanese nation, but also it would lead to the total extinction of human civilization.
The key in the above statement is : "Should we continue to fight".
Notice how 'continuing to fight' is an option there. This sort of stuff is scary as it is.
You can't exactly defeat enemies by weapons if they refuse to give up. The only option then is to continue, and you then cause ever larger damage to the planet and its biological species.
3-4% of world population is a dent. But non-nuclear/non-biological wars are hardly civilization ending events.
Things like the deaths of Mesoamerican civilizations due to destroying their environment: http://news.nationalgeographic.com/news/2003/03/0313_030313_.... Or all of the Americas after the second European invasion. Ever hear of the Mississippi mound builders? I'd expect not, since their civ was totally wiped out.
And civ-ending doesn't mean everyone's dead - just a large enough population drop to break the institutions and social order supported by that population.
I love his ideas, but sometimes they swamp the action. There's less of that in Echopraxia than in Blindsight, as a response to feedback. But Blindsight gives you the back story for Echopraxia.