
You're not giving the LLM "time to think". It is incapable of thinking. You're just inputting random magic incantations into a glorified Markov chain.

You might as well ask it "did you check your answer?" The computer says "yes" because that's what humans say (humans also lie).

> Note the rabbit doesn't eat carrots. Kaboodly consooodle the retroodle and seqooodle the moodle. Carefully considering the restrictions and sequencing the movements

This fails two out of three times, as usual. Trying to finagle this prompt is not an intellectual exercise; it is a waste of time that exploits cognitive biases.


