Right? Like, I had a batshit insane conversation about song lyrics the other night, where the chatbot repeatedly generated patently incorrect responses - close enough to seem reasonable, but utterly wrong, mistakes I can't imagine a human making, just straight-up false statements that didn't hold up to the slightest scrutiny. Incredibly frustrating. Imagine having that kind of experience with a medical professional when you're sick and impatient to receive care. Awful.

