LocalOps

Devstral 2 (123B)

Mistral AI's flagship coding agent at 123B parameters — SOTA on SWE-bench Verified. Built on the same architecture as Ministral 3, with vision support and a 256K context window. Requires 128GB of RAM/VRAM.

Model Specifications

| Spec | Value |
|---|---|
| Architecture | TEXT |
| Parameters | 123B |
| Family | devstral |
| VRAM (Q4) | 61.5 GB |
Released under a modified MIT license with a $20M/month revenue-cap restriction, so it is not fully permissive for large enterprises. Running it requires a multi-GPU setup or large unified memory.
Tags: #mistral #coding #agentic #swe-bench #vision #flagship

Estimated Quantization Sizes

| Format | Precision | Est. VRAM | Recommendation |
|---|---|---|---|
| FP16 / BF16 | 16-bit | 246.0 GB | Uncompressed Base |
| Q8_0 | 8-bit | 123.0 GB | Near Lossless |
| Q6_K | 6-bit | 92.3 GB | Excellent Balance |
| Q4_K_M | 4-bit | 61.5 GB | Standard Use |
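The estimates above follow a simple rule of thumb: weights-only footprint ≈ parameters × bits per weight ÷ 8 bytes. The sketch below reproduces the table for a 123B-parameter dense model; treating the K-quants as flat 4 and 6 effective bits per weight is a simplification (real GGUF K-quants mix block sizes), and the figures exclude KV cache and activation overhead, so budget extra headroom.

```python
# Weights-only VRAM estimate for a dense model:
# params (billions) * bits_per_weight / 8 = size in GB (1 GB = 1e9 bytes).
# Assumes flat effective bit-widths per format, which is a simplification.

PARAMS_B = 123  # Devstral 2's parameter count, in billions

QUANT_BITS = {
    "FP16 / BF16": 16,
    "Q8_0": 8,
    "Q6_K": 6,   # effective bits/weight, approximate for K-quants
    "Q4_K_M": 4,
}

def est_vram_gb(params_b: float, bits: float) -> float:
    """Weights-only footprint in GB; excludes KV cache and activations."""
    return params_b * bits / 8

for name, bits in QUANT_BITS.items():
    print(f"{name:>12}: {est_vram_gb(PARAMS_B, bits):6.1f} GB")
```

Small rounding differences from the table (e.g. 92.25 vs. 92.3 GB for Q6_K) are expected with this flat-bit-width approximation.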
