The Local LLM Hardware Calculator estimates how much GPU VRAM and system RAM you need to run any open-source language model on your own hardware. Select a model size and quantization level to see which consumer and professional GPUs can handle it.

Data as of April 2026
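The estimate behind a calculator like this is simple arithmetic: model weights occupy roughly (parameter count × bytes per parameter) of memory, plus overhead for the KV cache, activations, and runtime buffers. A minimal sketch in Python, assuming a flat ~20% overhead factor; this is a common approximation, not necessarily the tool's exact formula:

```python
# Rough VRAM estimate for running an LLM locally.
# Assumption (not the tool's exact formula): weights dominate memory,
# and a flat ~20% overhead covers KV cache, activations, and buffers.

BYTES_PER_PARAM = {
    "fp16": 2.0,    # 16-bit floats
    "q8":   1.0,    # 8-bit quantization
    "q5":   0.625,  # ~5 bits per weight
    "q4":   0.5,    # 4-bit quantization
}

def estimate_vram_gb(params_billions: float, quant: str, overhead: float = 1.2) -> float:
    """Estimate GPU VRAM (in GB) needed to load a model of the given size."""
    weight_gb = params_billions * BYTES_PER_PARAM[quant]  # 1B params ≈ 1 GB at 8-bit
    return weight_gb * overhead

# Example: a 70B-parameter model at 4-bit quantization.
print(f"{estimate_vram_gb(70, 'q4'):.1f} GB")  # ~42.0 GB
```

At 4-bit quantization a 70B model needs roughly 42 GB, which is why such models typically require two 24 GB consumer cards or a single high-VRAM professional GPU.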

Model Configuration

Hardware Requirements

Configure a model and click Calculate.

GPU Compatibility

Run a calculation to see GPU compatibility.
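Once the VRAM requirement is known, the compatibility check is a straightforward comparison against each card's memory. A hypothetical sketch follows; the GPU list is a small illustrative sample (the VRAM capacities shown are the real figures for these cards), whereas the actual calculator presumably uses a larger table:

```python
# Hypothetical compatibility check against a sample of GPUs.
# VRAM capacities below are the real figures for these cards;
# the full calculator would draw on a much larger table.

GPUS = {
    "RTX 3060 12GB": 12,
    "RTX 4080":      16,
    "RTX 3090":      24,
    "RTX 4090":      24,
    "RTX 5090":      32,
}

def compatible_gpus(required_vram_gb: float) -> list[str]:
    """Return the GPUs with enough VRAM to hold the model on one card."""
    return [name for name, vram in GPUS.items() if vram >= required_vram_gb]

# Example: a 13B model at q4 needs ~7.8 GB and fits on any of these cards.
print(compatible_gpus(7.8))
```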