
Yeah, for any hobbyist, indie developer, etc., I think it'd be ridiculous not to first try running one of these smaller (but decently powerful) open source models on your own hardware at home.

Ollama makes it dead simple to try out. I was pleasantly surprised by the tokens/sec I could get with Llama 3 8B on a 2021 M1 MBP. Now I need to try it on my gaming PC I never use. Would be super cool to have an LLM server on my local network for me and the fam. Exciting times.
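
For the LAN server idea, something like this should work (untested sketch: start Ollama on the gaming PC with OLLAMA_HOST=0.0.0.0 so it listens on all interfaces, then hit its HTTP API from any machine on the network; 192.168.1.50 is just a placeholder for that PC's address):

    # Query a remote Ollama instance over the local network.
    # Assumes the server was started with OLLAMA_HOST=0.0.0.0
    # and has already pulled the model (ollama pull llama3:8b).
    import requests

    resp = requests.post(
        "http://192.168.1.50:11434/api/generate",  # 11434 is Ollama's default port
        json={
            "model": "llama3:8b",
            "prompt": "Why is the sky blue?",
            "stream": False,  # get one JSON object instead of a stream
        },
        timeout=120,
    )
    print(resp.json()["response"])

Same API the local CLI uses under the hood, so anything that speaks HTTP on your network can share the one GPU.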


