
LLMs are a great new interaction paradigm for interrogating a corpus of documents, but not such a good way of finding which documents to interrogate in the first place (and unless you can do that, they're not all that useful for information retrieval).

The real magic happens when you stick the two together: let traditional search find the relevant documents, then interact with them through an LLM. The retrieval gap isn't a shortcoming that better model tuning or a bigger context window will fix.
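
To make the split concrete, here's a toy sketch of that two-step flow in Python: a plain keyword retriever decides which documents are worth reading, and only then does an LLM see them. The corpus, the scoring, and the ask_llm stub are all illustrative stand-ins I made up for the example, not any particular product's API.

    # Toy "retrieve, then interrogate" pipeline.
    from collections import Counter

    # Illustrative in-memory corpus; a real system would use an inverted index.
    CORPUS = {
        "doc1": "Traditional inverted indices rank documents by keyword relevance.",
        "doc2": "Large language models answer questions about text placed in their context.",
        "doc3": "Gold rush economics: sell the maps and pickaxes, not the gold.",
    }

    def tokenize(text: str) -> list[str]:
        return text.lower().split()

    def retrieve(query: str, k: int = 2) -> list[str]:
        """Step 1: classic search picks the documents, here via a naive
        keyword-overlap score instead of a real ranking function."""
        q = Counter(tokenize(query))
        scores = {
            doc_id: sum(q[t] for t in tokenize(text) if t in q)
            for doc_id, text in CORPUS.items()
        }
        return sorted(scores, key=scores.get, reverse=True)[:k]

    def ask_llm(prompt: str) -> str:
        """Step 2: hand the retrieved text to an LLM. Stubbed out here;
        wire this to whatever model or API you actually use."""
        return f"[LLM answer based on a prompt of {len(prompt)} characters]"

    def answer(question: str) -> str:
        docs = retrieve(question)
        context = "\n\n".join(CORPUS[d] for d in docs)
        prompt = (
            "Answer using only these documents:\n\n"
            f"{context}\n\nQuestion: {question}"
        )
        return ask_llm(prompt)

    print(answer("How do language models use documents in their context?"))

The point of the split is that the expensive, clever part (the LLM) only ever sees the handful of documents the cheap, boring part (the index) already decided were relevant.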

The way I see it, recent AI improvements make the future brighter for new search engines. In a gold rush there are two types of winners: those who beat everyone else to the gold, and those who sell the maps and pickaxes. Traditional search indices fall squarely into the second camp.

AI actually opens new avenues of profit for traditional search: it's notoriously difficult to monetize an internet search engine directly, but suddenly you can make ends meet selling API access to AI start-ups.


