Humans are not generally intelligent. The adjective "general" in "AGI" does not mean it is equivalent to human intelligence, it means it's above and beyond human intelligence.
I think “general” should be taken to mean “has an average human child’s common sense and causal reasoning,” since common sense and causal reasoning are at some level shared by all vertebrates. It seems like the focus on “above and beyond human intelligence” is how you get AIs that appear to understand algebraic topology yet utterly fail at counting problems designed for pigeons. It should be scientific malpractice to compare an AI to human intelligence without making any effort to compare it to rat/etc intelligence. (I guess investors wouldn’t be happy if Sam Altman said “in 20 years I believe we’ll reach ARI.”)
In general, tech folks are far too beholden to an instinctual and unscientific idea of intelligence as compared between humans, one that mostly uses linguistic ability and surface knowledge as a proxy. This proxy might sometimes be useful in human group decision-making, but it is also how dumb, confident people manage to fail upwards, and it works about as well for a computer as it does for a rat (though it mismeasures in the opposite direction).
Not at all. Humans are fundamentally limited by our finite state space and bandwidth. Classifying systems that generalize at least as well as a human, but that can exceed those limits, as superintelligent is a meaningful distinction.
I agree that "equivalent to human intelligence" is not a robust way to define general intelligence, but humans are a general intelligence.