
You can use it to feed extra context in, similar to RAG, but letting the LLM "decide" what information it needs. I think it's mostly useful in situations where you want to add content that isn't semantically related to the query, and so wouldn't retrieve well with RAG.

E.g. if I were making an AI that could suggest restaurants, I could just say "find a Mexican restaurant that makes Horchata", have it translate that to a tool call to get a list of restaurants and their menus, and then run inference on that list.
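Roughly what that loop looks like in Python with the OpenAI tool-calling API; the get_restaurants tool and the restaurant data here are made up, just to show the shape of it:

```python
from openai import OpenAI
import json

client = OpenAI()

# Hypothetical tool the model can call to fetch restaurant data.
tools = [{
    "type": "function",
    "function": {
        "name": "get_restaurants",
        "description": "Return nearby restaurants with their menus.",
        "parameters": {
            "type": "object",
            "properties": {
                "cuisine": {"type": "string", "description": "e.g. 'Mexican'"},
            },
            "required": ["cuisine"],
        },
    },
}]

def get_restaurants(cuisine: str) -> list[dict]:
    # Stand-in for a real database or API lookup.
    return [
        {"name": "La Cocina", "menu": ["tacos al pastor", "horchata"]},
        {"name": "El Jardin", "menu": ["enchiladas", "agua de jamaica"]},
    ]

messages = [{"role": "user",
             "content": "Find a Mexican restaurant that makes horchata."}]

# First pass: the model decides it needs the restaurant list and emits a tool call.
response = client.chat.completions.create(
    model="gpt-4o-mini", messages=messages, tools=tools)
call = response.choices[0].message.tool_calls[0]

# Run the tool it asked for and feed the result back in for the final answer.
result = get_restaurants(**json.loads(call.function.arguments))
messages.append(response.choices[0].message)
messages.append({"role": "tool", "tool_call_id": call.id,
                 "content": json.dumps(result)})

final = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
print(final.choices[0].message.content)
```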

I also tinkered with a Magic: The Gathering AI that used tool calling to get the text and rulings for cards so that I could ask it rules questions (it worked poorly). It saved the user from having to remember some kind of markup for card names just so I could pre-process the query.
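The card lookup was just a tool handler that hit Scryfall's public API; something along these lines (the exact response fields are from memory, so treat them as approximate):

```python
import requests

def get_card_text(card_name: str) -> dict:
    """Tool handler: look up a card's oracle text and rulings via Scryfall."""
    # Fuzzy name lookup, so the user doesn't have to type exact card names.
    card = requests.get("https://api.scryfall.com/cards/named",
                        params={"fuzzy": card_name}).json()
    rulings = requests.get(card["rulings_uri"]).json()
    return {
        "name": card["name"],
        "oracle_text": card.get("oracle_text", ""),
        "rulings": [r["comment"] for r in rulings.get("data", [])],
    }
```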


