LocalOps

Mistral Large 3 768B

Mistral's flagship mixture-of-experts (MoE) model: 768B total parameters, aimed at enterprise-grade tasks.


Model Specifications

Architecture: TEXT
Parameters: 768B
Family: mistral
VRAM (Q4): API Only
Mixture of Experts: 64B active inference parameters
Tags: #mistral #flagship #moe #enterprise
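The "API Only" VRAM entry follows from simple arithmetic: even at Q4 quantization (roughly 0.5 bytes per parameter), all 768B weights must be resident in memory, since an MoE model's 64B active parameters reduce per-token compute but not the memory footprint. A rough sketch of that estimate (the 1.2x overhead factor for KV cache and runtime buffers is an assumption, not a figure from this page):

```python
def q4_vram_gb(params_billion: float, overhead: float = 1.2) -> float:
    """Rough VRAM estimate for Q4-quantized weights.

    Q4 stores about 0.5 bytes per parameter; `overhead` is an assumed
    multiplier covering KV cache, activations, and runtime buffers.
    """
    bytes_total = params_billion * 1e9 * 0.5 * overhead
    return bytes_total / 1e9  # convert bytes to GB

# All 768B parameters must be loaded, even though only 64B are
# active per token -- hence no consumer-hardware Q4 figure.
print(f"{q4_vram_gb(768):.1f} GB")  # ~460 GB, far beyond a single GPU
```

This is why MoE models like this one are cheap to serve per token relative to their size, yet still impractical to self-host.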
