First open LLM from @SnowflakeDB! Arctic is a 480B Dense-MoE model that combines a 10B dense transformer with a 128x3.66B MoE MLP, designed specifically for enterprise AI.
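As a quick sanity check of those figures (a sketch; the per-component parameter counts are the approximate values quoted above):

```python
# Approximate parameter counts from the announcement, in billions.
dense_b = 10.0      # dense transformer backbone
num_experts = 128   # MoE MLP experts
expert_b = 3.66     # parameters per expert

total_b = dense_b + num_experts * expert_b  # ~478.5B, rounded to "480B"
active_b = dense_b + 2 * expert_b           # 2 active experts -> ~17.3B active

print(f"total = {total_b:.1f}B, active = {active_b:.1f}B")
```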
TL;DR:
- 480B parameters, with 17B active during generation
- 128 experts, with 2 active during generation
- Instruct & Base versions released
- Focused on enterprise tasks (Code, SQL, Reasoning, Instruction Following)
- Released under Apache 2.0
- ~900 GB of memory in fp16, ~240 GB in int4
- Available on @huggingface
- Trained with DeepSpeed-MoE
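The memory figures line up with a simple bytes-per-parameter estimate (a sketch assuming 2 bytes/param for fp16 and 0.5 bytes/param for int4, ignoring activation and KV-cache overhead):

```python
params = 480e9  # total parameter count

fp16_gb = params * 2 / 1e9    # 2 bytes per param  -> 960 GB (quoted as ~900 GB)
int4_gb = params * 0.5 / 1e9  # 0.5 bytes per param -> 240 GB

print(f"fp16: ~{fp16_gb:.0f} GB, int4: ~{int4_gb:.0f} GB")
```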
Models: https://huggingface.co/Snowflake/snowflake-arctic-instruct
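A minimal loading sketch with the Hugging Face `transformers` library, assuming the instruct checkpoint above and hardware with enough memory to hold the weights (this is illustrative, not something to run casually):

```python
# Sketch: load the Arctic instruct checkpoint and generate a completion.
# Assumes transformers is installed; trust_remote_code is needed because
# the repo ships custom modeling code.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Snowflake/snowflake-arctic-instruct"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",       # shard across available GPUs
    trust_remote_code=True,  # custom Arctic architecture
)

inputs = tokenizer("Write a SQL query that counts rows per day.", return_tensors="pt")
outputs = model.generate(**inputs.to(model.device), max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```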