Hacker News

I've intuitively felt that this general class of task is what these LLMs are absolutely best at. I'm not an expert on these things, but isn't this thanks to word embeddings and how words are mapped into high-dimensional vector space within the model? I would imagine that because every word is mapped this way, finding a word that lives in the same region of that vector space as "mail", "lawyer", "log", and "line" would be trivial for the model, right?
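The intuition above can be sketched with a toy nearest-neighbor search: average the embeddings of the clue words and find the vocabulary word closest to that centroid. The vectors below are made up for illustration (real models learn vectors with hundreds of dimensions), and "chain" winning is just how this toy data is constructed, not a claim about any real puzzle answer.

```python
import numpy as np

# Toy word embeddings. These 3-d values are invented for illustration;
# a real model's embeddings are learned and much higher-dimensional.
emb = {
    "mail":   np.array([0.9, 0.1, 0.0]),
    "lawyer": np.array([0.8, 0.3, 0.1]),
    "log":    np.array([0.7, 0.2, 0.2]),
    "line":   np.array([0.8, 0.1, 0.1]),
    "chain":  np.array([0.85, 0.2, 0.1]),
    "banana": np.array([0.0, 0.1, 0.9]),
}

def cosine(a, b):
    # Cosine similarity: 1.0 means same direction in embedding space.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

clues = ["mail", "lawyer", "log", "line"]
centroid = np.mean([emb[w] for w in clues], axis=0)

# Pick the non-clue word whose vector is nearest the clue centroid.
candidates = [w for w in emb if w not in clues]
best = max(candidates, key=lambda w: cosine(emb[w], centroid))
print(best)  # "chain" in this toy space; "banana" points elsewhere
```

Of course, an LLM isn't literally running this search at inference time, but the geometry is the same reason related words cluster together.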


More than just words. I've found LLMs immensely helpful for searching through the latent space, or essence, of quotes/books/movies/memes. I can ask things like "what's that book/movie set in X where Y happens" or "what's that quote by person P which goes something like Q" in my own paraphrased way and, with a little prodding, expect the answer. You'd have no luck with traditional search engines unless someone has previously asked a similar question.
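The contrast with traditional search can be sketched as follows: a paraphrased query shares almost no keywords with the target blurb, so word-overlap matching finds nothing, while nearest-neighbor search over embeddings still lands on it. The document vectors and the query vector below are invented for illustration; a real system would compute them with a learned sentence-embedding model.

```python
import numpy as np

# Hypothetical embeddings for two book blurbs (values are made up;
# real systems derive these from a sentence-embedding model).
docs = {
    "Moby-Dick: a captain obsessively hunts a white whale":
        np.array([0.9, 0.1, 0.1]),
    "Dracula: a count preys on victims in Victorian London":
        np.array([0.1, 0.9, 0.2]),
}

# A paraphrased query that shares no content words with the target.
query_text = "that book where a guy chases a giant sea creature"
query_vec = np.array([0.85, 0.15, 0.1])  # assumed embedding of the query

# Keyword search: intersect content words with the target blurb -> empty.
target = "Moby-Dick: a captain obsessively hunts a white whale"
overlap = (set(query_text.split()) & set(target.lower().split())) - {"a", "that"}
print(overlap)  # no shared content words, so keyword search misses

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Embedding search: nearest neighbour by cosine similarity finds it anyway.
best = max(docs, key=lambda d: cosine(docs[d], query_vec))
print(best.split(":")[0])  # Moby-Dick
```

The point is that the match happens in meaning-space rather than token-space, which is why a half-remembered paraphrase is enough.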




