
Trinity Large

Arcee AI's frontier 400B-parameter sparse MoE model with 13B active parameters per token. One of the largest Apache 2.0-licensed models ever released by a US lab. Excels at creative writing, chat, tool use, and long-horizon agentic workflows.

Specifications

Architecture: TEXT
Parameters: 400B
Family: trinity
VRAM (Q4): 200.0 GB
MoE: 13B active
Tags: arcee, moe, agent, frontier, apache2, popular

Run in the Cloud

This model requires enterprise-grade VRAM. You can rent GPUs on RunPod and start generating.


Quantization Estimates

Format    VRAM Need   Tier
FP16      800.0 GB    Full Precision
Q8_0      400.0 GB    High
Q6_K      340.0 GB    Excellent
Q5_K_M    280.0 GB    Great
Q4_K_M    200.0 GB    Sweet Spot
Q2_K      120.0 GB    Emergency
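The figures above follow a simple rule of thumb: weight memory ≈ parameters × bits per weight ÷ 8. A minimal sketch of that calculation, assuming the effective bits-per-weight values back-derived from this table (they are approximations for quantized formats, not official numbers, and the estimate ignores KV cache and runtime overhead, so real usage is higher):

```python
# Rough VRAM estimate for holding a model's weights at a given quantization.
# Rule of thumb: bytes = params * bits_per_weight / 8. Ignores KV cache,
# activations, and runtime overhead, so actual usage will be higher.

# Effective bits per weight, back-derived from the table above (approximate).
BITS_PER_WEIGHT = {
    "FP16": 16.0,
    "Q8_0": 8.0,
    "Q6_K": 6.8,
    "Q5_K_M": 5.6,
    "Q4_K_M": 4.0,
    "Q2_K": 2.4,
}

def weight_vram_gb(params_billions: float, fmt: str) -> float:
    """GB needed just for the weights of a model at the given format."""
    bits = BITS_PER_WEIGHT[fmt]
    return params_billions * 1e9 * bits / 8 / 1e9  # decimal GB

print(round(weight_vram_gb(400, "FP16")))    # 800
print(round(weight_vram_gb(400, "Q4_K_M")))  # 200
```

Note that for a sparse MoE like this one, all 400B parameters must still fit in memory; the 13B active figure only reduces compute per token, not weight storage.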
