
> You would need an infinite context or compression

Only if AGI required infinite knowledge, which it doesn't.



You're right, but the accumulation gets out of hand pretty quickly. There's a point where finite is not meaningfully different from infinite, and that threshold is a lot lower than you're accounting for. There's only so much compression you can do, so even if each day's new information is small, the total is huge in no time. Try something modest, say 10GB of new information a day, and see how quickly that grows: you're in the TB range before you're halfway into the year, and that's just a fixed linear rate. Compounding functions are a whole lot of fun... anything that actually compounds blows up far faster.
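
A quick back-of-the-envelope in plain Python, using the illustrative 10GB/day figure from above (the rate itself is made up; the point is just how fast the totals stack up):

    # cumulative storage from a fixed 10 GB/day of new information
    GB_PER_DAY = 10
    for day in (30, 100, 183, 365):
        total_tb = GB_PER_DAY * day / 1000
        print(f"day {day}: {total_tb:.2f} TB")
    # day 100: 1.00 TB -- TB range well before halfway through the year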


This seems kind of irrelevant? Humans have General Intelligence while having a context window of, what, 5MB, to be generous. Model weights only need to contain the capacity for abstract reasoning and querying relevant information. That they currently hold real-world information at all is kind of an artifact of how models are trained.


  > Humans have General Intelligence while having a context window
Yes, but humans have more than a context window, and more than memory (weights). For example, the human brain is not a static architecture: new neurons and new pathways (including between existing neurons) are formed and destroyed all the time, and this doesn't stop, it continues throughout life.

I think your argument makes sense but oversimplifies the human brain; once we start accounting for that complexity, it no longer holds. That complexity is also why a lot of AGI research is focused on things like "test time learning" and "active learning", not to mention many other areas, including dynamic architectures.
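
For concreteness, here's a toy sketch of the "test time learning" idea. Everything in it is made up for illustration (plain Python, a 1-D linear model, not any particular paper's method); the point is just that the parameters keep updating as new examples arrive at inference time, rather than being frozen after training:

    # toy "test time learning": parameters are updated from the
    # examples seen at inference, instead of being frozen
    w = 0.0      # single model parameter, y_hat = w * x
    lr = 0.05    # step size for the test-time updates

    def predict(x):
        return w * x

    def test_time_update(x, y):
        global w
        grad = 2 * (predict(x) - y) * x   # d/dw of (w*x - y)^2
        w -= lr * grad                    # one gradient step

    # a stream of "test" examples drawn from y = 3x
    for x, y in [(1, 3), (2, 6), (3, 9), (4, 12)]:
        print(f"predict({x}) = {predict(x):.2f}, true = {y}")
        test_time_update(x, y)            # adapt after each example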



