LocalOps

Gemma 4 26B A4B


Google's Gemma 4 MoE model: 26B total parameters with only 4B active, so inference runs nearly as fast as a 4B dense model. Ranked #6 among open models on LMArena.

Model Specifications

Architecture: Vision
Parameters: 26B
Family: Gemma
VRAM (Q4): 13.0 GB
Mixture of Experts: 4B active inference parameters
MoE architecture. 256K context. LMArena score 1441. Apache 2.0
#google #gemma #moe #multimodal #reasoning #flagship #trending
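The 13.0 GB Q4 figure follows from simple arithmetic: weight-only memory is roughly total parameters times bits per weight. A minimal sketch, assuming ~4 bits per weight at Q4 and decimal gigabytes, ignoring KV cache and activation overhead:

```python
def vram_estimate_gb(total_params: float, bits_per_weight: float = 4.0) -> float:
    """Rough weight-only VRAM estimate: parameters x bits per weight,
    converted to decimal gigabytes. Excludes KV cache and activations."""
    return total_params * bits_per_weight / 8 / 1e9

# 26B total parameters at Q4 (~4 bits per weight)
print(vram_estimate_gb(26e9))  # -> 13.0
```

Note that because all 26B weights must reside in memory, VRAM is set by the total parameter count; the 4B active parameters only determine compute per token, which is why the model loads like a 26B model but runs like a 4B one.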
