Wow. Spoken like someone who hopes (believes they deserve to) profit from the evolving technology, who views anyone not like them with bemusement and detached curiosity or, more likely, derision. Why do we - humans - feel it is so damn well appropriate to outsource our responsibilities and accountabilities as humans and integral members of an ecology to technology and, in doing so, forego a necessary immersion into and deep reverence for the world, substituting instead a tech-derived and mediated superficiality, detaching ourselves from our biology mostly for the sake of self-gratification and self-grandeur? The bigger question is: what values do we - the collective we - attribute to a world and a life worth saving, and will our AI adhere to such values?
> Why do we - humans - feel it is so damn well appropriate to outsource our responsibilities and accountabilities as humans and integral members of an ecology to technology and, in doing so, forego a necessary immersion into and deep reverence for the world, substituting instead a tech-derived and mediated superficiality, detaching ourselves from our biology mostly for the sake of self-gratification and self-grandeur?
I mean, the simple and inelegant answer is evolution: maximize mating opportunity and minimize energy expenditure. Grandeur means mating opportunity. Passing off responsibility means minimizing energy expenditure.
Humans aren't transcendent beings. We're just good at math.
> The bigger question is, what values do we - the collective we - attribute to a world and a life to be saved and will our AI adhere to such values?
Ask different groups of people and you'll get different answers. I don't know that there are "human" values.
> The bigger question is, what values do we - the collective we - attribute to a world and a life to be saved and will our AI adhere to such values?
I think to believe that we even know the answer to that question is high arrogance. To believe that we know all and should make the world conform to it is the height of hubris. The same sort seen in repeated failures of megalomaniacal central planning. We don't even know what we want ahead of time.
Not to mention that our solutions have all proven emergent, based upon other properties rather than upon some lofty principles. "Making wheat shorter" as a goal to prevent famines sounds like something straight out of a Jonathan Swift-style satire, but it more or less sums up Norman Borlaug's high-yield, disease-resistant dwarf wheat. And that is what goddamned worked to save billions from starvation. Not going on assumptions of "oneness and harmony with the world" or whatever nonsense we are accused of lacking by do-nothings.
Almost as arrogant as believing in a "necessary immersion into and deep reverence for the world", or that said lofty sentiments actually amount to or mean anything. You get derision because you appeal to lofty sentiment which conveniently amounts to "do nothing but feel superior and look down upon others".
I’m honestly hoping AI lets us transcend, or at least fully control, biology. It sucks being trapped in a bag of meat you have no control over, whose only optimized function is to reproduce and then die. No thanks, I’ve got more important shit to do.