
Two questions:

1) Which search engine comes with infallible information?

2) Where are LLMs being sold as something different?






1) Current (traditional) search engines are indexes. They point to sources, which the human can read, analyze, and summarize into information. LLMs do the reading, analysis, and summarization for the human.

2) Chatbots, the Perplexity search engine, summarization Chrome extensions, RAG tools. Those are all built on the idea that hallucination is a quirk, a little cog in the machine, a minor inconvenience to be dutifully noted (for legal reasons) but conveniently underestimated.

Most things in life don't have a compiler that will error on a nonexistent Python package.
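To illustrate the point: code is one of the few domains where hallucinations fail loudly. Python isn't compiled in the traditional sense, but its import system plays the same role, raising `ModuleNotFoundError` the moment a fabricated package name is used. A minimal sketch (`totally_made_up_pkg` is an invented name for illustration):

```python
import importlib


def module_exists(name):
    """Return True if the module can be imported, False otherwise."""
    try:
        importlib.import_module(name)
        return True
    except ModuleNotFoundError:
        return False


print(module_exists("json"))                 # stdlib module: True
print(module_exists("totally_made_up_pkg"))  # hallucinated name: False
```

A hallucinated fact in a summarized news article gets no such error; the reader has to catch it themselves.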


> LLM do the read, analysis and summarization part for the human

No, they don't. The human is meant to read, analyze, and summarize the output, same as they would for search results.



