Hacker News — thethindev's comments

I set up this web server on Android with just Termux!
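Once Python is installed in Termux (`pkg install python`), the standard library alone is enough to get a basic web server running on the phone. A minimal sketch — the port number and bind address here are illustrative choices, not details from the original post:

```python
# Minimal sketch: serve the current directory over HTTP using only
# Python's standard library. Runs under Termux on Android after
# `pkg install python`, but works the same on any OS.
from http.server import HTTPServer, SimpleHTTPRequestHandler

def make_server(port: int = 8080) -> HTTPServer:
    # Bind to all interfaces so other devices on the LAN can reach the phone.
    return HTTPServer(("0.0.0.0", port), SimpleHTTPRequestHandler)

if __name__ == "__main__":
    server = make_server()
    print(f"Serving on port {server.server_port} ...")
    server.serve_forever()
```

For anything beyond static files you'd reach for a proper framework, but this is enough to check that the phone is reachable from the rest of the network.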


> This is exactly the type of setup I'm interested in replicating on my end. Thanks a lot for the post!

My pleasure!

> Are you planning on fine-tuning the LLM further for your own needs? Have you thought about that?

Initially, I wasn't thinking about it, but it looks easy enough [1][2]!

[1] https://iwasnothing.medium.com/llm-fine-tuning-with-macbook-...

[2] https://github.com/ml-explore


I set up a simple homelab on a MacBook Air and added Ollama + Open WebUI. Now I have an LLM I can talk to whenever I'm out.

I'm using llama2 and llama2-uncensored, but I'm going to download llava and mixtral later. Either way, it's been a great experience.
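Open WebUI is the usual front end for this, but you can also talk to the Ollama server directly over its local HTTP API. A hedged sketch using only the standard library — it assumes Ollama's default port 11434 and uses the llama2 model mentioned above:

```python
# Sketch: query a locally running Ollama server from Python, stdlib only.
# Assumes Ollama is listening on its default port, 11434.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    # "stream": False asks Ollama for one complete JSON response
    # instead of a stream of partial chunks.
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload.encode(),
        headers={"Content-Type": "application/json"},
    )

def ask(model: str, prompt: str) -> str:
    # Send the request and pull the generated text out of the JSON reply.
    with urllib.request.urlopen(build_request(model, prompt)) as resp:
        return json.loads(resp.read())["response"]
```

With the server running, `ask("llama2", "Why is the sky blue?")` returns the model's reply as a string; swapping in a different pulled model is just a change of the `model` argument.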


There's an online meetup for people in PST [1] scheduled for tonight. I didn't RSVP, but I hope to be able to attend so I can show my home lab.

[1] https://events.indieweb.org/2024/03/homebrew-website-club-pa...


I actually did just that and set up my homelab with my "old" laptop [1].

[1] https://thin.computer/index.php/2024/03/08/self-hosting-my-s...

