
Everyone has a very different idea of what the word "intelligence" means; this definition has the advantage that, unlike when various AIs became superhuman at arithmetic, symbolic logic, chess, Jeopardy, Go, poker, the number of languages they could communicate in fluently, etc., it's tied to tasks people will continually pay literally tens of trillions of dollars each year for, because they want those tasks done.



This definition alone might be fine enough if the word "intelligence" wasn't already widely used outside of AI research. It is though, and the idea that intelligence is measured solely through economic value is a very, very strange approach.

Try applying that definition to humans and you pretty quickly run into issues, both moral and practical. It also invalidates basically anything we've done over centuries considering what intelligence is and how to measure it.

I don't see any problem at all with using economic value as a metric for LLMs or possible AIs; it just needs a different term than "intelligence." It pretty clearly feels like for-profit businesses shoehorning potentially valuable ML tools into science-fiction AI.


> This definition alone might be fine enough if the word "intelligence" wasn't already widely used outside of AI research. It is though, and the idea that intelligence is measured solely through economic value is a very, very strange approach.

The response from @s1mplicissimus to my previous comment asks about "common usage" definitions of intelligence, and this is (IMO unfortunately) one of the many "common usage" definitions: smart people generally earn more.

I don't like "common sense" anything (or even similar phrases), because I keep seeing the phrase used as a thought-terminating cliché; but one thing it does do is make this not "a very, very strange approach."

It may be wrong, which happens a lot with common language, but it can't really be called strange.

> Try applying that definition to humans and you pretty quickly run into issues, both moral and practical.

Yes. But one also runs into issues with all definitions of it that I've encountered.

> It also invalidates basically anything we've done over centuries considering what intelligence is and how to measure it.

Sadly, not so. Even before we had IQ tests (for all their flaws), there was a widespread belief that being wealthy is proof of superiority. In theory, in a meritocracy, it might have been; but in practice, not only do we not live in a meritocracy (to claim we do would deny both inheritance and luck), but the measures of intelligence that society has are… well, I was thinking about Paul Merton and Boris Johnson the other day, so I'll link to the blog post: https://benwheatley.github.io/blog/2024/04/07-12.47.14.html


> smart people generally earn more.

> there's been a widespread belief that being wealthy is the proof of superiority.

Both of these are assumptions, though, and they work in the reverse order. It's one thing to expect that intelligence will lead to higher-value outcomes, and entirely another to expect that higher-value outcomes prove intelligence.

It seems reasonable that higher intelligence, combined with the incentives of a capitalist system, will lead to more intelligent people getting wealthier. They learn to play the game and find ways to "win."

It seems unreasonable to assume that anyone or anything that "wins" in that system must be more intelligent. Said differently, intelligence may lead to wealth, but wealth doesn't imply intelligence.


I think we're in agreement? I'm saying their measure in this case is no worse than any other, but not that it's a fundamental truth.

All the other things — chess, Jeopardy, composing music, painting, maths, languages, passing medical or law degrees — they're also all things which were considered signs of intelligence until AI got good at them.

Goodhart's law keeps tripping us up on the concept of intelligence.


> I think we're in agreement? I'm saying their measure in this case is no worse than any other, but not that it's a fundamental truth.

Maybe we are? I think I lost the thread a bit here.

> chess, Jeopardy, composing music, painting, maths, languages, passing medical or law degrees

That's interesting. I would still have chalked skill in those areas up as a sign of intelligence, and didn't realize most people wouldn't once AI (or ML) could do it. To me, an AI/LLM/ML being good at those is at least a sign that it has gotten good at mimicking intelligence, if nothing else, and a sign that we really are getting out over our skis by deploying these tools without knowing how they really work.


Maybe by the time it’s doing a trillion dollars a year of useful work (less than 10 years out) people will call it intelligent… but still probably not.



