
There are people who thought we could just wire up ChatGPT to a bunch of API calls and have AGI by now. Or some similar version of bootstrapping an LLM.


That could very well be the case.


No, because there are still other issues like context length and performance. AGI isn't very useful if you only get one token per hour, or if it forgets what you talked about 10 minutes ago because it ran out of context.



