Hacker News

But then it makes an obvious mistake, and when you correct it, it says "you are absolutely right". Which is fine for that round, but you start doubting whether it's just sycophancy.


You're absolutely right! It's just sycophancy.


Yeah, I've learned not to trust it with anything opinionated, like "what's the best way to write this function" or "is A or B better". Even when asked for pros and cons, it's often wrong. You really need to ask LLMs only for verifiable facts, and then verify them.


If you ask for sources, the output will typically be more correct, or at least you'll be better able to assess where it came from.




