Show HN: I built a tool to check if your computer can run LLMs locally

caniusellm.com

6 points by ershad65 6 hours ago

Built a simple web app that tells you which open-source LLMs will work on your hardware. It auto-detects your specs, shows compatible models from Hugging Face, gives realistic performance estimates (tokens/sec), and recommends quantization settings. You can also manually input specs to see "what if I upgraded my RAM?" Made this after wasting time downloading giant models only to find they crawled on my hardware. Hope it saves you some frustration!

yohskar 3 hours ago

In Firefox, it reports the wrong RAM, cores, and GPU. In Chrome, it gets the wrong RAM and cores.