LocalOps

Codestral 2 (22B)

Released by Mistral in April 2026. Next-generation code specialist with fill-in-the-middle (FIM) support; beats GPT-4o on HumanEval and MBPP. First Codestral released under Apache 2.0; previous versions were under the non-production MNPL.
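FIM lets the model complete code given both the text before and after the cursor, which is what IDE integrations rely on. A minimal sketch of assembling a fill-in-the-middle prompt, assuming the [SUFFIX]/[PREFIX] control-token layout documented for earlier Codestral releases (Codestral 2's exact template may differ):

```python
def build_fim_prompt(prefix: str, suffix: str) -> str:
    """Assemble a fill-in-the-middle prompt string.

    Assumed layout from earlier Codestral releases: the suffix is
    presented first, then the prefix, and the model generates the
    missing middle span between them.
    """
    return f"[SUFFIX]{suffix}[PREFIX]{prefix}"

# Example: ask the model to fill in the body of `add`
prompt = build_fim_prompt(
    prefix="def add(a, b):\n    return ",
    suffix="\n\nprint(add(2, 3))\n",
)
```

In practice the serving layer (e.g. an inference server's completion endpoint) usually applies this template for you; the sketch only shows the ordering of the two context halves.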

Model Specifications

Architecture: TEXT
Parameters: 22B
Family: codestral
VRAM (Q4): 11.0 GB
Apache 2.0 licensed. 256K context. 380K Hugging Face downloads in its first week. Runs on a single RTX 4090 at Q4 (~13 GB VRAM).
Tags: #mistral #coding #fim #apache2 #open-source #ide

Estimated Quantization Sizes

Format       Precision  Est. VRAM  Recommendation
FP16 / BF16  16-bit     44.0 GB    Uncompressed Base
Q8_0         8-bit      22.0 GB    Near Lossless
Q6_K         6-bit      16.5 GB    Excellent Balance
Q4_K_M       4-bit      11.0 GB    Standard Use
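The table above follows directly from parameter count times bits per weight. A quick sketch of that weight-only arithmetic (it ignores KV cache and activation overhead, which is why real-world usage runs a couple of GB above the weight size):

```python
def estimate_vram_gb(params_billion: float, bits: int) -> float:
    """Weight-only VRAM estimate: parameters x bytes per weight.

    1B params at 8 bits = 1 GB, so the result is simply
    params_billion * (bits / 8). Runtime overhead (KV cache,
    activations) is not included.
    """
    return params_billion * bits / 8

# Reproduce the table rows for a 22B model
for name, bits in [("FP16", 16), ("Q8_0", 8), ("Q6_K", 6), ("Q4_K_M", 4)]:
    print(f"{name:8s} {estimate_vram_gb(22, bits):5.1f} GB")
```

For Q4_K_M this gives 11.0 GB of weights, consistent with the ~13 GB observed on an RTX 4090 once context overhead is added.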
