Why is giving precise instructions bad? I would expect LLMs to be pretty good at following instructions after three years of being trained that way. Plus, if the instructions are precise enough, and each step therefore simple enough, I would expect everything the model needs to do to be 'in-distribution'.
