
If I ask my LLM how to plan and commit a crime, it should do that. It should not say “sorry, that is outside my current scope”, because that’s not what I asked it to do.

At that point the LLM is simply incorrect, because it is no longer predicting the next token accurately.

Politics is not nonsense. You are the one speaking nonsense by suggesting that someone else should have the right to control what you can say to a machine.


