tosh on April 24, 2024 | on: Snowflake Arctic Instruct (128x3B MoE), largest op...
> It is cost efficient in both training (+ future fine-tuning) and inference compared to most other current models.
Can you elaborate?
rajhans on April 24, 2024
We have published some insights here.
https://medium.com/snowflake/snowflake-arctic-cookbook-serie...
ru552 on April 24, 2024
The unquantized model is almost 1 TB in size, and the benchmarks provided by Snowflake show performance in the middle of the pack compared to other recent releases.
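For context, here is a rough back-of-the-envelope sketch of where a figure near 1 TB could come from, assuming a total parameter count of roughly 480B (128 experts at ~3.66B each plus a dense part) stored unquantized at 2 bytes per parameter (bf16/fp16); the parameter count and byte width are assumptions, not numbers from this thread:

    # Back-of-the-envelope checkpoint size estimate.
    # Assumed figures (not from this thread): ~480B total parameters,
    # stored unquantized in bf16/fp16 at 2 bytes per parameter.
    TOTAL_PARAMS = 480e9
    BYTES_PER_PARAM = 2

    size_bytes = TOTAL_PARAMS * BYTES_PER_PARAM
    size_tb = size_bytes / 1e12
    print(f"~{size_tb:.2f} TB unquantized")  # ~0.96 TB, i.e. almost 1 TB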