LocalOps

Llama 4 Maverick

Meta's flagship open-weight MoE with 128 experts and 1M context, distilled from Llama 4 Behemoth

Model Specifications

Architecture: Vision (multimodal)
Parameters: 400B total
Family: Llama
VRAM (Q4): 200.0 GB
Mixture of Experts: 17B active inference parameters
Weights: available in BF16 and FP8
#meta #moe #multimodal #flagship #trending
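The Q4 VRAM figure above follows directly from the parameter count: at 4-bit quantization each weight takes half a byte, so 400B parameters need roughly 200 GB just for the weights (before KV cache and activation overhead). A minimal sketch of that estimate, with a hypothetical helper name:

```python
def estimate_weight_vram_gb(total_params_billion: float, bits_per_weight: int) -> float:
    """Rough VRAM needed to hold the weights alone, in GB.

    Ignores KV cache, activations, and runtime overhead, which can add
    substantially more depending on context length and batch size.
    """
    bytes_per_weight = bits_per_weight / 8
    return total_params_billion * bytes_per_weight


# Llama 4 Maverick: 400B total parameters at Q4 (4 bits per weight)
print(estimate_weight_vram_gb(400, 4))  # → 200.0
```

Note that for an MoE model like this one, all 400B parameters must still reside in memory even though only 17B are active per token; the sparse activation reduces compute per token, not the weight footprint.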
