
Not on the Ollama side.

This sample code shows how an implementation of a tool like `get_current_weather` might look in Python:

https://github.com/ollama/ollama-python/blob/main/examples/t...



Ah, I see. The model returns the name of an appropriate tool, then the client takes the corresponding action and appends the `tool` message to the chat context, and finally a second call to the model merges these together.

Part of me was hoping for some magic plugin space where I could drop named functions, but I couldn't imagine how.


You specify the API of your functions (inputs/description).

The LLM will decide which functions to call and with what values.

You perform the actual function execution.
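The three steps above might be sketched as follows, assuming the ollama-python client conventions; the tool name, schema, and weather logic are illustrative, and the `tool_calls` here are simulated rather than produced by a real `ollama.chat` call:

```python
# Sketch of the client side of tool calling (illustrative names/schema;
# in a real session tool_calls would come back from ollama.chat).

def get_current_weather(city: str) -> str:
    """Illustrative tool: a real version would query a weather API."""
    return f"22C and sunny in {city}"

# 1. The API of the function (inputs/description), as advertised to the model.
weather_tool = {
    "type": "function",
    "function": {
        "name": "get_current_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}

available = {"get_current_weather": get_current_weather}

def run_tool_calls(tool_calls, messages):
    """2./3. The model decided which functions to call and with what
    values; the client executes them and appends a `tool` message for
    each result, so a second chat call can merge them together."""
    for call in tool_calls:
        fn = available[call["function"]["name"]]
        result = fn(**call["function"]["arguments"])
        messages.append({"role": "tool", "content": result})
    return messages

# Simulated model decision (a real one would arrive in
# response['message']['tool_calls'] after calling with tools=[weather_tool]):
tool_calls = [{"function": {"name": "get_current_weather",
                            "arguments": {"city": "Toronto"}}}]
messages = run_tool_calls(tool_calls, [])
```

After `run_tool_calls`, the client would send `messages` back to the model for the final, natural-language answer.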


Thanks.

If their announcement had included the schema of the return value of `response['message']['tool_calls']` it might've been more transparent to me.
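For what it's worth, judging from the ollama-python examples, each entry of `tool_calls` is roughly the following shape (field names taken from those examples; the values here are illustrative, and `arguments` arrives already parsed as a dict rather than a JSON string):

```python
# Approximate shape of one entry in response['message']['tool_calls'],
# based on the ollama-python examples (illustrative values):
tool_call = {
    "function": {
        "name": "get_current_weather",     # which tool the model chose
        "arguments": {"city": "Toronto"},  # already a dict, not a JSON string
    },
}

# The client dispatches on the name and splats the arguments:
name = tool_call["function"]["name"]
args = tool_call["function"]["arguments"]
```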



