
> An 8GB Macbook could use Gemma 3 4B at 2.5GB+1GB, but this is probably not worth doing.

I am currently using this model on a MacBook with 16 GB RAM. It is hooked up to a Chrome extension that extracts text from webpages and logs it to a file, then summarizes each page. I want to develop an episodic memory system like MS Recall, but local: it doesn't leak my data to anyone else, and it costs me nothing.

Gemma 3 4B runs under Ollama and is light enough that I don't feel it while browsing. Summarization happens in the background. The page I am on right now is already logged and summarized.
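A minimal sketch of what the background summarization step could look like, assuming Ollama's default local HTTP endpoint and its `gemma3:4b` model tag; the prompt wording and function names are my own placeholders, not the commenter's actual setup.

```python
# Hypothetical sketch: send logged page text to a local Gemma 3 4B
# instance via Ollama's generate API and return the summary.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint
MODEL = "gemma3:4b"  # 4B model tag in Ollama's library

def build_payload(page_text: str) -> dict:
    """Construct a non-streaming generate request for Ollama."""
    return {
        "model": MODEL,
        "prompt": "Summarize this webpage in a few sentences:\n\n" + page_text,
        "stream": False,
    }

def summarize(page_text: str) -> str:
    """POST the page text to the local Ollama server; return the summary text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(page_text)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Because the request is a plain local HTTP call, it is easy to run from a small daemon that watches the extension's log file, so browsing stays responsive while summaries trickle in.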


