LocalOps

Vision Model

Can I Run CogVLM2 19B Locally?

Powerful vision-language model

System Configuration

Configure your hardware to check compatibility

VRAM: 12 GB
Bandwidth: 504 GB/s
TDP: 285 W
System RAM: 32 GB
Type: Dedicated

Compatibility Result

Based on your selected hardware

Runs with Offload
VRAM Usage: 24 GB / 12 GB
Est. Speed: N/A (non-text)
Context (KV): N/A
Disk Space: 24 GB
50% of layers will be offloaded to system RAM. This will significantly reduce generation speed.
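The offload estimate above follows directly from the numbers: the model's weights (~24 GB) are double the available VRAM (12 GB), so roughly half the layers must sit in system RAM. A minimal sketch of that calculation, assuming equal-sized layers (the function name is illustrative, not part of any tool):

```python
def offload_fraction(model_size_gb: float, vram_gb: float) -> float:
    """Estimate the fraction of layers that must be offloaded to system RAM,
    assuming layers are roughly equal in size and VRAM holds the remainder."""
    if model_size_gb <= vram_gb:
        return 0.0  # everything fits on the GPU
    return (model_size_gb - vram_gb) / model_size_gb

# CogVLM2 19B: ~24 GB of weights vs. a 12 GB GPU
print(f"{offload_fraction(24, 12):.0%} of layers offloaded")
```

Because offloaded layers are read over the PCIe bus rather than from VRAM at 504 GB/s, each forward pass stalls on that transfer, which is why the tool warns of a significant speed reduction.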

Similar Models

Llama 4 Behemoth

2000B

Flagship 2T foundation model, 16 experts

flagship · meta

Mistral Large 3

675B

Granular MoE flagship, 256K context

flagship · mistral

Mistral Large 3 NVFP4

675B

FP4 quantized version for NVIDIA NIM

flagship · mistral