I am legitimately confused that people around me think this exponentially accelerating technological explosion will end at a steady state that will be at all familiar to us.
It's a worn-out metaphor, but I can't help but think of horses marvelling at this mechanical carriage thing, wondering what's the endgame of that.
But the biggest thing is that I just don't think there is anything resembling a steady-state outcome. There's a really interesting book called "Singularities: Landmarks on the Pathways of Life" from (oh, wow, time flies) two decades ago, which describes in each of its chapters a major biological transition whereby a new "technology" was adopted by a particular organism and then rapidly took over the whole of Earth's biosphere, by virtue of enabling that organism to utilize its environment better, evolve more quickly, or both. Some example topics covered are: ATP, RNA, proteins, membranes, oxygen utilization, multicellular organisms, etc. Each change was so dramatic that whatever came before almost lost relevance.
I'm concerned that the rise of AI could be the next such singularity, after which we humans would become pretty much irrelevant to the state of the world. Not just out-of-a-job, but more like horses (or better yet, coral reefs): living in some specific ecological niches, away from where the "interesting" stuff happens, with our existence in peril due to forces out of our control.
> Not just out-of-the-job, but similar to horses (or better yet coral reefs), living in some specific ecological niches, away from where the "interesting" stuff happens.
interesting, i could see that happening... one does wonder to what end ai is being developed so aggressively.
> our existence in peril due to forces out of our control
i think i understand what you are saying
while it's a different thing, i think there are already a lot of systems in the world that are collectively slipping out of our control, even if nominally we have agency (see climate change, or social media, market bubbles, etc)
just my personal bias, but i feel that if ai "takes over" so-to-speak, it's the latest in a long line of our technologies controlling us instead of the other way around... idk, does that make sense? probably i'm not contributing much to this conversation; just my thoughts...