
Devstral Small 2

Mistral AI's updated agentic coding model: 24B parameters, Apache 2.0 license, 256K context window with vision support. The Mistral Vibe CLI is included. Built on the Ministral 3 architecture with Scalable-Softmax.

Model Specifications

Architecture: TEXT
Parameters:   24B
Family:       devstral
VRAM (Q4):    12.0 GB
Best used with OpenHands or the Mistral Vibe CLI. Current llama.cpp/Ollama implementations may have reduced accuracy; vLLM is strongly recommended.
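Since vLLM is the recommended backend, a minimal serving sketch is shown below. The Hugging Face model ID is an assumption (verify the exact name on Mistral's actual release page); the Mistral-format flags follow vLLM's usual recommendation for Mistral models.

```shell
# Sketch: serve via vLLM's OpenAI-compatible server.
# Model ID "mistralai/Devstral-Small-2" is assumed; check the real repo name.
vllm serve mistralai/Devstral-Small-2 \
  --tokenizer-mode mistral \
  --config-format mistral \
  --load-format mistral \
  --max-model-len 262144  # 256K context window
```

Once running, any OpenAI-compatible client (including agent frameworks such as OpenHands) can point at `http://localhost:8000/v1`.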
#mistral #coding #agentic #apache2 #swe-bench #vision

Estimated Quantization Sizes

Format                Precision  Est. VRAM  Recommendation
FP16 / BF16           16-bit     48.0 GB    Uncompressed Base
Q8_0 (High)           8-bit      24.0 GB    Near Lossless
Q6_K                  6-bit      18.0 GB    Excellent Balance
Q4_K_M (Popular)      4-bit      12.0 GB    Standard Use
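The estimates above follow directly from parameter count times bits per weight. A minimal sketch of that arithmetic (weights only; real GGUF files mix tensor precisions and a running model also needs KV-cache and activation memory, so treat these as lower bounds):

```python
PARAMS_B = 24  # Devstral Small 2: 24B parameters

def est_vram_gb(params_billions: float, bits_per_weight: float) -> float:
    """Weights-only VRAM estimate in GB: params * bits / 8 bits-per-byte."""
    return params_billions * bits_per_weight / 8

# Reproduces the table: FP16 -> 48.0, Q8_0 -> 24.0, Q6_K -> 18.0, Q4_K_M -> 12.0
for fmt, bits in [("FP16 / BF16", 16), ("Q8_0", 8), ("Q6_K", 6), ("Q4_K_M", 4)]:
    print(f"{fmt:12s} ~{est_vram_gb(PARAMS_B, bits):.1f} GB")
```

The same formula explains why the Q4 figure (12.0 GB) appears as the headline "VRAM (Q4)" spec above.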
