
Perhaps many people here live in tech bubbles, or mostly interact with other tech folks — online, in person, whatever. People in tech are relatively grounded about LLMs. Relatively being key here.

On the ground in normal-people society, I've seen that people just treat AI as the new fountain of answers and aren't even aware of LLMs' tendency to confidently state whatever they conjure up. In my non-tech day-to-day life, I have yet to see someone not immediately reference the AI Overview when searching for something. It gets a lot of hostility in tech circles, but in real life? People seem to love it.


They do love it. I have been, nicely and as helpfully as I can, educating people on the nature of LLM tools.

I personally have little hostility toward the AI search results. Most of the time, the feature nails my quick search queries. Those are usually cases where I've forgotten a detail and just need it filled in, or a slightly different use case where I'm already familiar enough to catch gaffes.

For anything else, I typically ignore it and do my usual search elsewhere, or scroll quickly down to the worthwhile site links.


And this is why we can't just rely on awareness of these issues — we also need to hold companies accountable for false information.

I mentioned hallucinations last week on a call with two seasoned marketers, and both thought I had invented the term on the spot.


