GLM-4.6V Flash
Z.ai's free-tier compact vision-language model (VLM): 9B parameters, a 128K-token multimodal context window, and native function calling. Fully open-weight for commercial use.
Model Specifications
Architecture: VISION
Parameters: 9B
Family: glm
VRAM (Q4): 4.5 GB
Free for commercial use. Integrated with GLM Coding Plan and dedicated MCP tools.
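The 4.5 GB figure above follows directly from the parameter count and quantization level: at Q4, each weight takes roughly 4 bits, so 9B parameters occupy about 9 × 4 / 8 = 4.5 GB. A minimal sketch of that arithmetic (the helper name and the optional overhead factor are illustrative, not from this page; real runtime usage also adds KV cache and buffer overhead that grows with context length):

```python
def estimate_weights_gb(params_billion: float, bits_per_weight: int) -> float:
    """Raw weight footprint in GB: parameters scaled by quantized bits per weight.

    Rule of thumb: 1B parameters at 8 bits is about 1 GB.
    """
    return params_billion * bits_per_weight / 8


def estimate_vram_gb(params_billion: float, bits_per_weight: int,
                     overhead: float = 0.1) -> float:
    """Add a rough fractional overhead for KV cache and runtime buffers.

    The 10% default is an illustrative assumption; actual overhead
    depends on context length, batch size, and the inference engine.
    """
    return estimate_weights_gb(params_billion, bits_per_weight) * (1 + overhead)


print(estimate_weights_gb(9, 4))  # 9B at Q4 -> 4.5 (GB), matching the spec above
```

The same formula explains why dropping from Q8 to Q4 halves the weight footprint, at some cost in output quality.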
Related Guides
How much VRAM do you really need?
A complete breakdown of quantization levels and VRAM overhead for running local models.
Best GPUs for Machine Learning in 2026
Comparing NVIDIA and AMD options for the best speed-to-dollar ratio.
GGUF vs EXL2 vs AWQ
Understanding local AI formats and which one to pick for your specific hardware.