
LLMs were never designed to be fact checkers; they're text generators, not fact generators.

Besides, you don't need ML to get a computer to confidently show you something that's completely wrong; repeatedly multiplying certain floating-point numbers is enough, really.
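A quick sketch of what I mean: repeatedly applying a multiplication that is mathematically a no-op still drifts, because values like 0.1 have no exact binary representation and each rounding step compounds.

```python
# Each round trip (x * 0.1 * 10.0) should leave x unchanged,
# but rounding error accumulates with every multiply.
x = 0.1
for _ in range(50):
    x = x * 0.1 * 10.0  # mathematically a no-op

print(x == 0.1)       # False: the value has drifted
print(0.1 * 0.1)      # 0.010000000000000002, not 0.01
```

The computer prints the drifted value with the same confidence as a correct one; nothing flags it as wrong.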


