Local AI · No cloud · Your hardware
Find the right local LLM
for your machine.
Tell us your RAM and GPU. We'll rank every major open-source model by what actually fits — no guessing, no cross-referencing docs.
Hardware-aware
Filter by your exact RAM. See which quantization level actually fits.
Ranked by quality
Models sorted by benchmark scores within each category. Best first.
Comparison mode
Pin up to 3 models and compare specs side-by-side.
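The hardware-aware fit check above comes down to a simple back-of-the-envelope memory estimate: parameter count times bits per weight, plus runtime overhead. A minimal sketch, where the function names and the ~20% overhead factor are illustrative assumptions, not the site's actual formula:

```python
def estimate_memory_gb(params_billions: float, bits_per_weight: float,
                       overhead: float = 1.2) -> float:
    """Rough memory footprint of a quantized model in GB.

    params_billions * bits / 8 gives the raw weight size;
    `overhead` (assumed ~20% here) accounts for KV cache and
    runtime buffers.
    """
    return params_billions * bits_per_weight / 8 * overhead

def fits(params_billions: float, bits_per_weight: float, ram_gb: float) -> bool:
    """True if the quantized model plausibly fits in the given RAM."""
    return estimate_memory_gb(params_billions, bits_per_weight) <= ram_gb

# Example: a 7B model at 4-bit quantization needs roughly 4.2 GB,
# so it fits in 8 GB of RAM but not in 4 GB.
```

For instance, `fits(7, 4, 8)` is true while `fits(7, 4, 4)` is false, which is the kind of cutoff the RAM filter applies per quantization level.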