- Llama 4 Scout: 109B total params, 17B active × 16 experts (MoE)
- Llama 4 Maverick: 400B total params, 17B active × 128 experts (MoE)
- Llama 4 Behemoth: ~2T total params, 288B active × 16 experts (MoE); see the routing sketch below
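
The active/total split comes from sparse expert routing: each token is sent through only a small subset of experts, so per-token compute tracks the active parameter count while total capacity scales with the number of experts. A minimal toy sketch of this idea in PyTorch, assuming a simple top-1 linear router and feed-forward experts; all dimensions and names are illustrative, not Llama 4's actual implementation:

```python
# Toy mixture-of-experts (MoE) layer: total params grow with n_experts,
# but only the router + top-k selected experts run per token.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyMoE(nn.Module):
    def __init__(self, d_model: int, d_ff: int, n_experts: int, top_k: int = 1):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, n_experts)  # scores experts per token
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, d_model). Route each token to its top-k experts.
        logits = self.router(x)                        # (tokens, n_experts)
        weights, idx = logits.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)           # renormalize over chosen experts
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e                  # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[mask, k:k+1] * expert(x[mask])
        return out

moe = ToyMoE(d_model=64, d_ff=256, n_experts=16, top_k=1)
total = sum(p.numel() for p in moe.parameters())
active = sum(p.numel() for p in moe.router.parameters()) + \
         sum(p.numel() for p in moe.experts[0].parameters())
print(f"total params: {total:,}, active per token: {active:,}")
# With 16 experts and top-1 routing, roughly 1/16 of expert weights run per
# token -- the same idea behind "17B active / 109B total" in Llama 4 Scout.
```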
The Llama 4 herd: The beginning of a new era of natively multimodal AI innovation
We’re introducing Llama 4 Scout and Llama 4 Maverick, the first open-weight natively multimodal models with unprecedented context length support and our first built using a mixture-of-experts (MoE) architecture.
https://ai.meta.com/blog/llama-4-multimodal-intelligence/


Seonglae Cho