
This whole post can be (unintentionally) TL;DRed by a single Wikipedia article: https://en.m.wikipedia.org/wiki/AI_effect

The phenomenon of people discounting significant advances as "not AI" is not new. I'm surprised it hasn't gotten old yet.



> The phenomenon of people discounting significant advances as "not AI" is not new. I'm surprised it hasn't gotten old yet.

It hasn't gotten old because people keep calling non-AI "AI". I have so far refrained from correcting people because I see it like the word "hacker", which on Hacker News is widely understood in its correct meaning but is commonly used elsewhere to refer to computer criminals. I gave up long ago on correcting people to say "crackers" rather than "hackers". But since it has been brought up...

Artificial Intelligence refers to something that behaves intelligently and can adapt to new situations. I don't think human brains are magic or divine (I was surprised to learn this is not a universal opinion), so in that sense we're all "just a computation". By that standard, though, the simplest pocket calculator is a thinking machine, since it can take an almost infinite number of inputs and respond to them logically. But everyone with (artificial) intelligence understands that this is not what is meant by artificial intelligence.
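
To make that concrete, here's a minimal sketch (my own illustration in Python, not anything from the article) of everything a four-function calculator "knows". It responds logically to an effectively unbounded input space, yet nobody would call it intelligent:

    import operator

    # The calculator's entire "mind": a fixed mapping from symbols
    # to arithmetic. No learning, no adapting to new situations.
    OPS = {"+": operator.add, "-": operator.sub,
           "*": operator.mul, "/": operator.truediv}

    def calculate(a: float, op: str, b: float) -> float:
        # Mechanically apply the rule the symbol names.
        return OPS[op](a, b)

    print(calculate(6, "*", 7))  # 42 -- rule-following, not thinking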

The article you link claims that every major advance gets dismissed as "just a computation", but I doubt people will say that once a computer learns to speak and can hold a real conversation at the level of a 3-year-old or something.


The term AI researchers have used for what you are talking about is Artificial General Intelligence. They've made that distinction because there is no reason an advance beyond human capability shouldn't be called AI. Sure, learning to play Go better than any human alive doesn't mean that AlphaGo can learn and interpret languages, but that doesn't mean it's not intelligent. Even human intelligence is not homogeneous in its capabilities.

It's obviously speculative, but I'm of the belief that AGI will never exist the way fanciful minds imagine it. Rather AI will evolve slowly, and systems will incorporate small advances one by one until one day we don't really care to distinguish between AI and AGI. It will literally just be a collection of capabilities ("just a computation") that are well understood and have lost all semblance of magic.


> like the word "hacker", which on Hacker News is widely understood in its correct meaning

You'd be surprised. Here's part of just such a conversation from a couple of years ago: "I've always thought 'hacker' without qualifiers meant 'person who can code'"

https://news.ycombinator.com/item?id=9790316

Other terms commonly used here with wildly different meanings include "liberal", "feminism" and "capitalism". Screaming matches abound... :)


I was thinking the same while typing that (that, being somewhat mainstream, "Hacker" News might not be understood for what it is). I personally use the catb.org definition[1], which is quite broad and lists 8 definitions -- if you count the 'deprecated' sense of 'computer criminal', which I do, since it's still one of the popular uses.

[1] http://catb.org/jargon/html/H/hacker.html


If there's an "AI effect", surely its inverse also exists: it seems to me that every time there's a [relatively] major advance, a similar chorus of supporters pops up to tell us how close we are to solving every AI problem imaginable. In recent memory there's Google's NMT, which had people effectively saying it had "solved" translation, but when I actually got to feed something into it, as a competent speaker of Japanese and a former translator, I was wholly unimpressed, relative to the hype at least.

Discounting AI advances is nothing new, but overselling them is at least as old.


I completely agree. Deep learning is the most recent example, and I'd argue that AI winters are really just the deflation of expectations that were overly ambitious to begin with.

My own way of looking at AI winters is as a triumph of Scruffy pragmatism over Neat delusions. Intelligence is unimaginably complicated; there is no reason to believe a silver-bullet algorithm even exists, let alone that the latest fad is it.




