keheliya | 6 months ago | on: DeepSeek releases Janus Pro, a text-to-image gener...
Running it on a MacBook Pro entirely locally is possible via Ollama. Even running the full model (680B) is apparently possible when distributed across multiple M2 Ultras:
https://x.com/awnihannun/status/1881412271236346233
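A minimal sketch of what the Ollama route looks like in practice, assuming Ollama is installed and serving on its default port; the deepseek-r1:70b tag is used purely as an example (it's the distilled model mentioned downthread), so swap in whatever you actually pulled:

    import json
    import urllib.request

    # Sketch: query a locally served model through Ollama's HTTP API
    # (it listens on localhost:11434 by default).
    # "deepseek-r1:70b" is only an example tag -- substitute whatever
    # model you actually pulled with `ollama pull`.
    OLLAMA_URL = "http://localhost:11434/api/generate"

    def ask(prompt: str, model: str = "deepseek-r1:70b") -> str:
        payload = json.dumps({
            "model": model,
            "prompt": prompt,
            "stream": False,  # one JSON object instead of a token stream
        }).encode("utf-8")
        req = urllib.request.Request(
            OLLAMA_URL, data=payload,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())["response"]

    if __name__ == "__main__":
        print(ask("In one sentence, what does 3-bit quantization do to model weights?"))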
vessenes | 6 months ago
That's a 3-bit quant. I don't think there's a theoretical reason you couldn't run it at fp16, but it would take more than two M2 Ultras. 10 or 11, maybe!
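Rough napkin math behind that guess, taking the ~680B figure from upthread and assuming 192 GB of unified memory per M2 Ultra, with only about 75% of it usable by the GPU by default:

    # Napkin math for the "10 or 11" guess above. Assumptions (all rough):
    #  - ~680B parameters, the number quoted upthread
    #  - 192 GB of unified memory per M2 Ultra, of which only ~75% is
    #    usable by the GPU by default on macOS
    PARAMS = 680e9
    GB = 1e9
    USABLE_GB = 192 * 0.75  # ~144 GB of GPU-addressable memory per machine

    for name, bytes_per_param in [("fp16", 2.0), ("fp8", 1.0), ("3-bit quant", 3 / 8)]:
        weights_gb = PARAMS * bytes_per_param / GB
        machines = weights_gb / USABLE_GB
        print(f"{name:12s} ~{weights_gb:5.0f} GB of weights -> ~{machines:.1f} M2 Ultras")

That puts fp16 at roughly 9-10 machines before KV cache and runtime overhead, which is where a 10-or-11 estimate lands, while the 3-bit quant fits in two, matching the linked demo.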
bildung | 6 months ago
Well, there's the practical reason of the model natively being fp8 ;) That's one of the innovative ideas making it so much less compute-intensive, apparently.
rsanek | 6 months ago
The 70B distilled version that you can run locally is pretty underwhelming, though.