
> A search result might take me to the wrong answer but an LLM might just invent nonsense answers

> This is a fundamentally different thing and is more difficult to detect imo

99% of the time it isn't. You validate it, then correct or accept it, as you would any other suggestion.


