
The term is actually fine. The problem is when it's divorced from the reality of:

> in some sense, hallucination is all LLMs do. They are dream machines.

If you understand that, then the term "hallucination" makes perfect sense.

Note that this in no way invalidates your point, because the term is constantly used and understood without this context. We would have avoided a lot of confusion if we had based it on the phrase "make shit up" and called it "shit" from the start. Marketing trumps accuracy again...

(Also note that I am not using shit in a pejorative sense here. Making shit up is exactly what they're for, and what we want them to do. They come up with a lot of really good shit.)




I agree with your point, but I don't think anthropomorphizing LLMs is helpful. They're statistical estimators trained by curve fitting. All generations are equally valid given the training data, objective, and architecture. To me it's much clearer to think about it that way than through crude analogies to human brains.
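A minimal sketch of that view (toy vocabulary and made-up logits, not any real model or library): generation is just sampling the next token from a learned distribution, and a fabricated continuation comes out of exactly the same operation as a true one.

  import numpy as np

  def sample_next_token(logits, temperature=1.0, rng=np.random.default_rng()):
      # Softmax over the model's logits, then draw one token id.
      # The model only ranks continuations by learned plausibility;
      # there is no separate code path for "factual" vs "hallucinated".
      z = (logits - logits.max()) / temperature   # shift for numerical stability
      probs = np.exp(z) / np.exp(z).sum()
      return rng.choice(len(probs), p=probs)

  # Hypothetical continuation of the prompt "Lions have ..."
  vocab = ["manes", "stripes", "fins"]
  logits = np.array([3.0, 1.5, -2.0])             # toy numbers for illustration
  print(vocab[sample_next_token(logits)])         # usually "manes", occasionally "stripes"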


We can't expect end users to understand what "statistical estimators trained by curve fitting" means.

That's why we use high-level terms like "hallucination": it's something everyone can understand, even if it's not completely accurate.


> Because it's something everyone can understand

But they will understand the wrong thing. Someone unfamiliar with LLMs but familiar with humans will assume, when told that LLMs 'hallucinate', that it's analogous to a human hallucinating, which is dangerously incorrect.


That's a good point. But re: not anthropomorphizing, what's wrong with errors, mistakes or inaccuracies? Those are terms everybody is familiar with, and they're more accurate. I'd guess most people have never actually experienced a hallucination anyway, so we're appealing to some vague notion of what that is.


> what's wrong with errors, mistakes or inaccuracies?

They're not specific enough terms for what we're talking about. Saying a lion has stripes is an error, mistake, or inaccuracy. Describing a species of striped lions in detail is probably all those things, but it's a distinctive kind of error/mistake/inaccuracy that's worth having a term for.


> I'd guess most people have never actually experienced a hallucination anyway

I actually think most people have.

Every time you look at a hot road and see water, that mirage is a form of hallucination.


Except mirages are real optical phenomena that can be captured by a camera. Hallucinations are generated entirely by your brain and cannot be captured by an external observer.


> what's wrong with 'errors', 'mistakes' or 'inaccuracies'?

"To sort the files by beauty, use the `beautysort` command."



