Hacker News

https://ollama.com/library/devstral

https://ollama.com/

I believe it's just an HTTP and terminal wrapper around llama.cpp, with some modifications (a fork).
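Right — ollama runs a local daemon that serves an HTTP API in front of the llama.cpp engine. A minimal sketch of hitting it, assuming the daemon is running on its default port 11434 and the devstral model has already been pulled with `ollama pull devstral`:

```shell
# Ask the local ollama daemon for a one-shot (non-streaming) completion.
# POST /api/generate is ollama's basic text-generation endpoint.
curl -s http://localhost:11434/api/generate -d '{
  "model": "devstral",
  "prompt": "Write a hello-world program in C.",
  "stream": false
}'
```

The response is a single JSON object containing the generated text in its `response` field (with `"stream": true`, the default, you get a stream of JSON lines instead).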



Does ollama have support for CPU offloading?
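If you mean running some or all layers on the CPU instead of the GPU: ollama exposes llama.cpp's GPU-layer knob (the equivalent of `-ngl`) as the `num_gpu` option. Setting it to 0 keeps the whole model on CPU. A minimal sketch in Modelfile syntax, with devstral used only as an example base model:

```
# Modelfile: offload 0 layers to the GPU, i.e. run entirely on CPU
FROM devstral
PARAMETER num_gpu 0
```

The same option can also be passed per-request in the `options` object of the HTTP API, so you don't need a custom Modelfile just to test it.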



A perfect blend of LMGTFY and helpfulness. :)


lol. I try not to be a total asshole; it sometimes even works! :)

Good luck to you mate with your life :)



