
You can directly ask it whether it is capable of reasoning, and it tells you it's not: that it's just a language model, not capable of reasoning or self-improvement, or something along those lines.

Another example: ask it for a list of the programming languages it has been trained on. If it were capable of reasoning it could answer this trivially, but since it's a language model that just predicts the most likely response based on the prompt, it has no concept of this at all, and tells you exactly that when asked.
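To make the "predicts the most likely response" point concrete, here is a minimal toy sketch of greedy next-token generation. None of this is any real model's code; the vocabulary, the `toy_logits` scoring stand-in, and the greedy loop are all made up for illustration, but the shape is the point: the output is just whatever token scores highest given the context, with no lookup of facts about training data.

    # Toy illustration (not any real model): greedy next-token prediction.
    # A language model scores each candidate next token given the prompt so far;
    # generation just repeatedly appends the highest-scoring one.
    import math
    import random

    VOCAB = ["I", "am", "a", "language", "model", "not", "reasoning", "."]

    def toy_logits(context):
        # Stand-in for a trained network: deterministic pseudo-random scores
        # seeded by the context, so the example is reproducible.
        rng = random.Random(" ".join(context))
        return [rng.uniform(-2.0, 2.0) for _ in VOCAB]

    def softmax(xs):
        m = max(xs)
        exps = [math.exp(x - m) for x in xs]
        total = sum(exps)
        return [e / total for e in exps]

    def generate(prompt, steps=5):
        tokens = list(prompt)
        for _ in range(steps):
            probs = softmax(toy_logits(tokens))
            # Greedy decoding: take the single most probable next token.
            best = max(range(len(VOCAB)), key=lambda i: probs[i])
            tokens.append(VOCAB[best])
        return tokens

    print(" ".join(generate(["I", "am"])))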



Well, this is exactly it. We are being told that the end of knowledge work is nigh, yet this thesis is built on a bed of nonsense. It is hand-wavy science fiction that even some academics with impressive pedigrees are promulgating. I think it is irresponsible: there is no sensible dialogue going on, so you have students freaking out and deciding what not to study based on this, and people making career decisions based on this misinformation.



