Great to see more local AI tools supporting MCP! Recently I've also added MCP support to recurse.chat. When running locally (LLaMA.cpp and Ollama), tool calling still needs to catch up with the well-known providers (for example in tool call accuracy and parallel tool calls), but it's starting to get pretty usable.
It's a protocol that doesn't dictate how you call the tool. You can use an in-memory transport without needing to spin up a server. Your tool can just be a function, but with the flexibility of serving it to other clients.
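To make that concrete, here's a minimal sketch using the TypeScript MCP SDK (@modelcontextprotocol/sdk); the tool, server, and client names are just placeholders for illustration, and it assumes an ESM module with top-level await:

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { InMemoryTransport } from "@modelcontextprotocol/sdk/inMemory.js";
import { z } from "zod";

// The "server" is just an object in this process; the tool is a plain function.
const server = new McpServer({ name: "local-tools", version: "0.1.0" });
server.tool("add", { a: z.number(), b: z.number() }, async ({ a, b }) => ({
  content: [{ type: "text", text: String(a + b) }],
}));

// Wire a client and the server together over an in-memory transport:
// no separate process, no socket, no HTTP.
const client = new Client({ name: "host-app", version: "0.1.0" });
const [clientTransport, serverTransport] = InMemoryTransport.createLinkedPair();
await Promise.all([
  client.connect(clientTransport),
  server.connect(serverTransport),
]);

// The host (e.g. your chat loop around a local model) calls tools through MCP.
const result = await client.callTool({ name: "add", arguments: { a: 2, b: 3 } });
console.log(result.content); // [{ type: "text", text: "5" }]
```

The same McpServer instance could later be attached to a stdio or HTTP transport without touching the tool itself, which is the flexibility I mean.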
Are there any examples of that? All the documentation I saw seemed to be about building an MCP server, with very little about connecting an existing inference infrastructure to local functions.
If you are on a Mac, give https://recurse.chat/ a try. It's as simple as downloading a model and starting to chat. Just added the new multimodal support from LLaMA.cpp.
Actually this is a good way to find product ideas. I ran a query in Grok to find posts about what people want, similar to this one. It performed multiple searches on X, including embedding search, and suggested people want things like Tamagotchi, ICQ, etc. back.
I feel like these are all great examples of things people think they want. Making a post about it is one thing; when it comes to actually buying or using a product, I think the majority of nostalgic people will quickly remember why they don't actually want it in their adult lives.
I see this a lot in vintage computing. What we want is the feelings we had back then, the context, the growing possibilities, youth, the 90s, whatever. What we get is a semi-working physical object that we can never quite fix enough to relive those experiences. But we keep acquiring and fixing and tinkering anyway hoping this time will be different while our hearts become torn between past and present.
"The specification reached an impasse: all interested implementors have used the same SQL backend (Sqlite), but we need multiple independent implementations to proceed along a standardisation path."