
The comparison between the context length and what humans can hold in their heads just seems faulty.

I'm not sure I agree that humans cannot hold 25,000 words' worth of information in their heads. If an average person reads 25,000 words, which can be done in a single sitting, they certainly won't remember all of it, but they will retain a great deal that they can effectively reason with and manipulate.

Not to mention that humans don't need to hold the entire report in their head because they can hold it in their hand and look at it.

And if anything, I think a larger working memory matters more for GPT's own outputs than for its inputs. Humans often take time to reflect on issues, and we like to jot down our thoughts, particularly when complex reasoning is involved. Giving something long, careful thought allows us to reason much better.



