Can't help but wonder, is AI an expected phase transition in the evolution of life in the universe? Is life really just the larval stage of a higher order intelligence?
I'd say so - it seems that life has to be created via evolution/competition, and left to run long enough, evolution (survival/proliferation of the fittest) is likely to produce organisms/entities that are not only better fit to the environment, but also better fit to the game. Evolution will tend to produce things that are better at evolving (faster to adapt). This includes things like multi-cellular life and sexual reproduction (creating variety via DNA mixing).
One type of evolutionary niche that seems almost inevitable to arise in any complex environment is intelligence - the generalist able to survive and thrive in a variety of circumstances, and in the competitive game of evolution greater intelligence should outcompete lesser intelligence. Eventually you'll get critters sufficiently intelligent to build AI of their own level or higher, which may be regarded as another way to win the game of evolution - an intelligence that can evolve much faster than the type that bootstrapped it.
It's interesting to consider whether AI/AL (artificial life) really needs to become autonomous and stand-alone, or whether it can be more like a virus that needs a host to survive. Stage one AI obviously needs a host, but maybe it never really needs to become stand-alone? It reminds me of (git author) Linus Torvalds' quote "real men don't need backups" - you just release your software and have confidence it'll get replicated in git repositories worldwide. Maybe AI can be robust to extinction (not need a backup/body) just by becoming ubiquitous?
Right: hosts - or symbiotic life forms - are a perfectly legit way to go. Plenty of them. And some form of "augmentation" might be more socially/politically acceptable (ugh) than "AI on the loose".
An expected phase transition in this context is stochastic. The transition to order is expected, and there are bulk properties that are the same on each run, but the exact details differ each time.
In the context of the universe, I wouldn't call it "intelligence" versus "artificial intelligence". I would call it "organic intelligence" vs "inorganic intelligence".
Imagine if this was indeed the case - what a time to be alive! We're witnessing the moment when the noise in the sonogram morphs into the recognizable shape of a baby. It's our heritage, our future generation, human 2.0, Machina Sapiens.
Literally created in our own image, too. I'm so proud I could cry.
Is it an extinction or just another type of evolution of humans? Evolution isn't quite the right word, but AI will presumably descend from us and carry some of the things that matter to us, most likely.
Sure, you aren't passing on DNA like you would to natural-born children, but not all children have the same ideals or care about the same things.
Evolution usually doesn't mean watching myself and everyone I know die (or worse) in real time. Many people want children, but few would be willing to die in childbirth to achieve it.
There's absolutely no guarantee that we can construct superintelligence that will perpetuate our values long-term or even short-term.
I'd reframe this question. What constitutes a phase transition at all in the sense being talked about here isn't super clear.
There's a clear definition in chemistry, and it has analogies in cosmology, as the entire universe went through some early phase transitions in the vacuum state when it was of much greater average density. These are all related to qualitative changes in the properties of matter as temperature and density change.
I would grant that life is a qualitatively different state of matter, but it isn't as obvious as the more familiar phase transitions. We don't have a clear demarcation for what is and isn't life. This paper attempts to give a definition, but the fact that this is being attempted at all shows there isn't already a universally agreed-upon one, unlike the definition of what is solid versus what is liquid. I guess all life we're aware of consists minimally of a semi-permeable barrier, ingests and stores energy inside of this barrier, and locally reduces entropy inside the barrier while dissipating heat and/or other byproducts into its environment.
Life is, of course, not the only thing that does this. My house fits the same description. The only real line in the sand we have between things we consider alive and things we consider tools is that things we consider alive are all born and descended from other living creatures, not assembled from found or fabricated parts.
Ultimately, though, this is a difference in origin, not a difference in quality or capability. Any tool, including electronic computing devices, can potentially have all of the same qualities as life if we could figure out how to make them self-assembling, self-healing, and self-reproducing. I guess we can do this with software, but it isn't obvious to me how to even demarcate a unit of "software" as an individual entity. How to demarcate intelligent from unintelligent software is even less clear, but nothing about the underlying state of matter the computations run on is any different, so I don't see how it involves anything we can call a phase transition without severely straining the term.
We are not anywhere close to creating a conscious AI. An AI without humanity is completely pointless, unless it is capable of building a conscious AI, which I doubt it is.
In the 1980s Gregory Benford explored this with his Galactic Center Saga books. I really enjoyed the series, especially the middle one, Great Sky River.
That seems pretty likely - with some chance of hybrid still possible. That is, does AI take off and leave the goop in the dust? Or does it become an augmentation of the current life forms - in an integrated form which perhaps can be admitted as a continuation? The current AI products require quite a bit of compute power - but then augmentation doesn't need to be "on-board" the organic life form.
Except AI isn't life yet. I'll grant it intelligence already, but it isn't yet growing, reproducing, producing, interacting, or doing much of anything else you might require for "life". Which is cool: a (to-be) life form which starts with intelligence before life!
Makes you wonder what comes after AI. What's the "higher order" after AI that exists today or that will exist in 10 years? I'd guess we will never understand that level of intelligence, unless AI augments our brains somehow.
Even back in 2017, transistors were already smaller[0] and faster[1] than biological synapses by roughly the same ratio by which wolves are smaller and faster than the hills they roam around on (a quick arithmetic check follows the footnotes below).
[0] Human synapse size / 2017-transistor size ≅ 1µm / 11nm ≅ a specific hill I was thinking of / wolf body length ≅ 145m / 1.6m [3]
[1] synaptic pulse rate / transistor transition rate ≅ 200 Hz / 30 GHz [2] ≅ 2 cm/year / 25 km/day ≅ speed of continental drift / average daily range of a wolf as a speed [3]
[2] transistors flip significantly faster than overall clock speed
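For anyone who wants to sanity-check the footnotes, here's a quick back-of-the-envelope calculation (a sketch in Python; the figures are just the ones quoted above, and the wolf/hill numbers are the footnotes' own illustrative values, not measurements):

    # Quick check of the ratios in footnotes [0] and [1].
    synapse_size = 1e-6       # human synapse, ~1 micrometre
    transistor_size = 11e-9   # 2017-era transistor, ~11 nm
    hill = 145.0              # the specific hill, metres
    wolf = 1.6                # wolf body length, metres
    print(synapse_size / transistor_size)  # ~91
    print(hill / wolf)                     # ~91, same ratio

    synapse_rate = 200.0      # Hz, synaptic pulse rate
    transistor_rate = 30e9    # Hz, transistor transition rate
    drift = 0.02 / (365 * 24 * 3600)   # 2 cm/year in m/s
    wolf_speed = 25_000 / (24 * 3600)  # 25 km/day in m/s
    print(transistor_rate / synapse_rate)  # ~1.5e8
    print(wolf_speed / drift)              # ~4.6e8, same order of magnitude

The size ratios line up almost exactly (~91x on both sides); the speed ratios differ by about 3x, which is within the loose "≅" the footnotes are using.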
Frankly, the more I think about AI, the less sense it makes to me that biological, single-body humans have any place in the future. As soon as we can digitize our minds, why wouldn't people begin to do so? Bodies could be inhabited at will, and death will be a thing of the past as we're able to store backups. I'm sure some will refuse and be left behind, just as we have Amish communities today, with a similar level of influence on civilization. And in the case of digital people, I think it's likely they'll share in the intellectual advancements of AI, if such a distinction even exists.
Digitization is one direction but I think augmentation is perhaps a more likely one. Or a first one. Digitization can follow in a "Ship of Theseus" fashion.
And augmentation branches then in AI as symbiont versus AI as desktop assistant.
> Digitization is one direction but I think augmentation is perhaps a more likely one. Or a first one.
First seems likely, but as a permanent alternative I don't know why a species would eschew lightspeed transportation and effortless immortality for the fragility and slowness of an organic body. It's possible there are good reasons, but I don't know of any.
Digitization, upload, has always seemed to me an iffy goal. The brain's packaging doesn't seem very amenable to any of our current technologies, as far as being able to "read" it. And then, once read, emulating it seems just as difficult.
Once uploaded comes the issue of getting computing time to run it (the economics and politics of preferring run time of X over run time of Y). And maintaining the computing platform. Certainly there are immense advantages to the digitized form - of course.
By contrast, augmentation (which is what we already do) seems straightforward. And seems to fit current society "easily" (haha - or let's say it will be tough enough as a first stage.)
So that from the point of view of a next epoch in life forms, AI fits more immediately in the struggle of AI as symbiont, AI as independent, or AI as desktop assistant.
Could be that what we experience as the universe is only a minor fraction of what exists. I define life as "processing information", so given that definition, AI is a life form.
No, because AI is (for now at least) shaped and constrained by us humans rather than developing free based on the laws of the universe. Is it really "evolution" if a judge can rule that it violates copyright and stop all progress overnight? Or a random developer can add a bit of code to make sure the answers appease the right set of people?
What we have today is a crude software approximation mimicking what we think AI should be, but that AI itself is nowhere in sight.
> shaped and constrained by us humans rather than developing free based on the laws of the universe
This logic doesn't hold. Humans are part of the universe and obey all its laws. It's arbitrary to say bacteria and bonobos and stone tools are naturally occurring but AI aren't. We distinguish them because we're conscious and we have the experience of choice, but to say our creations aren't natural to the universe implies that our consciousness is not a natural phenomenon.
It feels like you’re simply stating the predators and outside influences that are affecting AI’s evolution. Humans killed the dodo, maybe we kill the AI next
What makes you confident our evolution didn’t occur the same way? The “fossil records” of the two are similar in many ways: lots of baby steps, giant leap with no known intermediary states, lots of baby steps, …
And if we're building off a bad initial premise it weakens the whole argument. "AI could be evolving just like us!" doesn't make sense when we don't know how we evolved.
Yeah, a lot of people get hung up on the term AI as it exists today, and protest that it doesn't deserve such consideration. I should have been more explicit that I was speaking in a much more general sense, and on an evolutionary timescale, not about technology we'd recognize today.
Could you elaborate? Do you mean the current state of AI?
I would argue that current models have some behaviors that one could liken to intelligence, even if it's all just operations on 1s and 0s. Of course, this depends on your definition of intelligence. Mine is along the lines of "can develop a representation of a problem space and use that to predict optimal actions given current input" - which current AI, most animals, fungi, and humans can do. Sentience is a different question; I'd argue that only humans and a few species of animal (dolphins, elephants, apes) are sentient as of now, though it seems highly possible that machines will join that group by the end of the century, if not sooner.
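To make that definition concrete, here's a minimal toy sketch (the problem, the model, and all the names are made up for illustration, not a claim about how any real system works): an agent that holds a representation of how actions change state and uses it to predict the optimal action given the current input.

    # Toy agent matching the definition above: it keeps a model of the
    # problem space and picks the action whose predicted outcome scores best.
    def choose_action(state, actions, model, value):
        # Evaluate each action by simulating it through the internal model.
        return max(actions, key=lambda a: value(model(state, a)))

    # Hypothetical problem space: states are numbers, the goal is to reach 10.
    model = lambda s, a: s + a      # internal representation of the dynamics
    value = lambda s: -abs(10 - s)  # closer to the goal is better

    print(choose_action(3, [-1, 0, 1], model, value))  # -> 1, steps toward the goal

The definitional arguments then mostly come down to how rich the representation and how general the problem space have to be before we're willing to call it intelligence.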
>Well it's true, or not, regardless of how any of us feel about it
Yes, but not with the same probabilities of being true in both cases (the cases being whether we feel good or bad about it).
Something makes it to HN because HNers like it. And feel-good articles and popular sentiments are more likely to be liked despite not being true, compared to stories that are true but not likable.