
MiMo V2 Pro


Xiaomi's MiMo V2 Pro is a 1-trillion-parameter MoE reasoning model with 384 experts, first revealed anonymously as "Hunter Alpha" on OpenRouter in March 2026.

Model Specifications

Architecture: TEXT
Parameters: 1000B
Family: mimo
VRAM (Q4): 500.0 GB
Mixture of Experts: 32B active inference parameters
Notes: 1T total parameters with 32B active; initially revealed anonymously on OpenRouter as "Hunter Alpha"
Tags: xiaomi, moe, reasoning, trending
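The MoE arithmetic implied by the specs above can be sketched as follows. The per-expert figure is a rough assumption for illustration, since shared layers and attention weights also count toward the 1T total:

```python
# MoE parameter arithmetic, using the figures from the spec list above.
TOTAL_PARAMS_B = 1000   # 1T total parameters
ACTIVE_PARAMS_B = 32    # parameters active per token
NUM_EXPERTS = 384

# Fraction of all weights used for any single token.
active_fraction = ACTIVE_PARAMS_B / TOTAL_PARAMS_B
print(f"Active per token: {active_fraction:.1%} of all weights")

# If the 1T were split evenly across experts (an assumption -- it ignores
# shared/dense weights), each expert would hold roughly:
per_expert_b = TOTAL_PARAMS_B / NUM_EXPERTS
print(f"~{per_expert_b:.1f}B parameters per expert")
```

This is why MoE models of this size are attractive for inference: only about 3% of the weights participate in each forward pass, even though all of them must fit in memory.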

Estimated Quantization Sizes

Format        Precision   Est. VRAM    Recommendation
FP16 / BF16   16-bit      2000.0 GB    Uncompressed Base
Q8_0          8-bit       1000.0 GB    Near Lossless
Q6_K          6-bit       750.0 GB     Excellent Balance
Q4_K_M        4-bit       500.0 GB     Standard Use
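The estimates in the table follow directly from weight count times bit-width: VRAM ≈ parameters × bits per weight / 8. A minimal sketch, using the nominal bit-widths from the table (actual K-quant GGUF files run slightly larger because of quantization scales, embeddings, and KV cache):

```python
# Reproduce the quantization-size estimates from the table above.
# Nominal bits per weight; real files carry some overhead beyond this.
PARAMS = 1000e9  # 1T parameters

bits_per_weight = {"FP16": 16, "Q8_0": 8, "Q6_K": 6, "Q4_K_M": 4}

for fmt, bits in bits_per_weight.items():
    gb = PARAMS * bits / 8 / 1e9  # decimal gigabytes, matching the table
    print(f"{fmt:>7}: {gb:,.1f} GB")
```

Running this reproduces the table's 2000.0 / 1000.0 / 750.0 / 500.0 GB figures, which is a useful sanity check before committing to hardware for a 1T-parameter model.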
