LocalOps

GLM Z1 9B

Z.ai's compact open-source reasoning model: 9B parameters, released under the Apache 2.0 license. Part of Z.ai's open-source 9B/32B series covering base, reasoning, and rumination variants.

Model Specifications

Architecture: Text
Parameters: 9B
Family: GLM
VRAM (Q4): 4.5 GB
Open-sourced alongside GLM-Z1-32B. Good for edge/local reasoning deployment.
#zhipu #reasoning #efficient #apache2 #open-source

Estimated Quantization Sizes

| Format      | Precision | Est. VRAM | Recommendation    |
|-------------|-----------|-----------|-------------------|
| FP16 / BF16 | 16-bit    | 18.0 GB   | Uncompressed Base |
| Q8_0        | 8-bit     | 9.0 GB    | Near Lossless     |
| Q6_K        | 6-bit     | 6.8 GB    | Excellent Balance |
| Q4_K_M      | 4-bit     | 4.5 GB    | Standard Use      |
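The estimates above follow the usual rule of thumb for weight memory: parameter count times bits per weight, divided by 8 to get bytes. A minimal sketch (the function name `est_vram_gb` is ours, and the figure covers weights only, not KV cache or activation overhead):

```python
def est_vram_gb(params_billions: float, bits_per_weight: float) -> float:
    """Rough weight-only VRAM estimate in GB.

    params_billions: model size in billions of parameters (e.g. 9 for GLM Z1 9B)
    bits_per_weight: effective precision (16 for FP16, 8 for Q8_0, ~4 for Q4_K_M)
    """
    # bytes per parameter = bits / 8; billions of bytes ~ GB
    return params_billions * bits_per_weight / 8


# 9B parameters at various precisions:
print(est_vram_gb(9, 16))  # 18.0 GB (FP16 / BF16)
print(est_vram_gb(9, 8))   # 9.0 GB (Q8_0)
print(est_vram_gb(9, 4))   # 4.5 GB (Q4_K_M)
```

Real quantized files run slightly larger than this, since K-quants mix precisions and keep some tensors at higher bit widths, and inference needs extra room for the KV cache.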
