> If you can put in ALL of the information relevant to a problem and it can algorithmically do reasoning
Well but it can't, can it? That's the very point: it is an algorithm with no abstraction or reasoning capabilities. Mind, even without that it is still very useful for certain circumscribed purposes, but it's incorrect to say it can "reason".
Q: List possible abstractions encompassing all these specific concepts: singing, whistling, yodeling.
A: One possible abstraction that encompasses all these specific concepts is "vocalization." Vocalization refers to the act of producing sound using the vocal cords and the mouth. Singing, whistling, and yodeling are all different types of vocalization, but they share the common feature of using the human voice to create sound. Other possible abstractions that might encompass these concepts could include "musical expression," "auditory performance," or "sound production."
That's literally a text search task. That's not what I mean; I mean things like understanding the rules of addition from examples, the rules of logic from examples, or the rules of chess.
According to [1], they trained an LLM on legal Othello moves, and 1) it generated moves with an error rate of 0.01%; 2) when they analyzed its network, they found a model of an Othello board; 3) when they modified the in-network model, it started generating moves that were legal in the modified board position.
In other words, the LLM did build an internal model that captured the rules of Othello merely from seeing legal moves. It's reasonable to assume that the same thing is happening (at least to some degree) with LLMs trained on human language.
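The probing technique described in [1] can be sketched in a few lines. This is a toy illustration, not their actual setup: the "activations" here are synthetic (board states pushed through a fixed random linear map plus noise, standing in for a real model's hidden states), and the probe is plain least squares rather than their trained classifier. The point it shows is just the mechanism: if board state is encoded linearly in activations, a linear probe can read it back out.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a model's hidden states: each board is a vector of
# 64 cells in {-1, 0, 1} (white/empty/black), mapped into a 128-dim
# "activation" space by a fixed random linear map, plus a little noise.
n_games, n_cells, d_hidden = 500, 64, 128
boards = rng.choice([-1.0, 0.0, 1.0], size=(n_games, n_cells))
W_true = rng.normal(size=(n_cells, d_hidden))
acts = boards @ W_true + 0.01 * rng.normal(size=(n_games, d_hidden))

# Linear probe: a least-squares map from activations back to board cells.
# If this recovers the board, the board state was linearly decodable.
probe, *_ = np.linalg.lstsq(acts, boards, rcond=None)
pred = np.round(acts @ probe)
accuracy = (pred == boards).mean()
```

On this synthetic data the probe recovers the board almost perfectly, because the encoding is linear by construction; the surprising result in [1] is that probes on a real transformer's activations also recover the board, even though nothing in training asked for one.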
It can't search text: it doesn't have access to any text at inference time. Whatever it is doing works in a different way than that.
It is sometimes able to do other tasks, but unlike humans (or "AGI") it has a completely fixed compute budget and can't pause to think between outputting one token and the next.
(Btw, I tried to get it to derive addition from two 1-digit examples but couldn't.)