
Trinity Mini

Arcee AI's 26B sparse MoE with 3B active params per token. Fully post-trained reasoning model for web apps and agentic tasks. Apache 2.0.

Specifications

Source
Architecture: TEXT
Parameters: 26B total (sparse MoE, 3B active per token)
Family: trinity
VRAM (Q4): 13.0 GB
Tags: arcee, moe, agent, reasoning, apache2


Quantization Estimates

Format   | VRAM Need | Tier
FP16     | 52.0 GB   | Full Precision
Q8_0     | 26.0 GB   | High
Q6_K     | 22.1 GB   | Excellent
Q5_K_M   | 18.2 GB   | Great
Q4_K_M   | 13.0 GB   | Sweet Spot
Q2_K     | 7.8 GB    | Emergency
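The figures above are consistent with a simple weights-only formula: total parameters times bits per weight, divided by 8. A minimal sketch, assuming that formula; the bits-per-weight values are back-calculated from this table rather than taken from llama.cpp's official averages, and real usage adds KV cache and runtime overhead on top:

```python
# Weights-only VRAM estimate: params (billions) * bits-per-weight / 8 -> GB.
# bpw values below are back-calculated from the table, not official
# llama.cpp figures; KV cache and runtime overhead are not included.
PARAMS_B = 26.0  # Trinity Mini: 26B total parameters (all must fit, even with 3B active)

BPW = {
    "FP16":   16.0,
    "Q8_0":    8.0,
    "Q6_K":    6.8,
    "Q5_K_M":  5.6,
    "Q4_K_M":  4.0,
    "Q2_K":    2.4,
}

def vram_gb(params_b: float, bpw: float) -> float:
    """Approximate VRAM for the weights alone, in GB."""
    return round(params_b * bpw / 8, 1)

for fmt, bits in BPW.items():
    print(f"{fmt}: {vram_gb(PARAMS_B, bits)} GB")
```

Note that because this is a sparse MoE, only 3B parameters are active per token, but all 26B weights still have to be resident, so the quantization tier, not the active-parameter count, determines the VRAM requirement.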
