Show HN: I built a tool to check if your computer can run LLMs locally
caniusellm.com

Built a simple web app that tells you which open-source LLMs will run on your hardware. It auto-detects your specs, lists compatible models from Hugging Face, gives realistic performance estimates (tokens/sec), and recommends quantization settings. You can also enter specs manually to answer "what if I upgraded my RAM?"

Made this after wasting time downloading giant models only to find they crawled on my hardware. Hope it saves you some frustration!
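For context on where tokens/sec estimates like these typically come from (the site's exact method isn't stated): single-stream LLM decoding is usually memory-bandwidth-bound, since every generated token has to stream all the model weights through memory once. A common back-of-envelope is therefore bandwidth divided by quantized model size. A sketch of that heuristic, with all numbers illustrative:

```javascript
// Rough upper-bound tokens/sec for memory-bound autoregressive decoding.
// Heuristic only (not necessarily the site's formula): each token reads
// every weight once, so throughput ~= memory bandwidth / model size.
function estimateTokensPerSec(paramsBillion, bitsPerWeight, bandwidthGBps) {
  const modelGB = (paramsBillion * 1e9 * bitsPerWeight) / 8 / 1e9;
  return bandwidthGBps / modelGB;
}

// Example: a 7B model at 4-bit quantization is ~3.5 GB; on dual-channel
// DDR4-3200 (~51 GB/s theoretical) that caps out around 14-15 tokens/sec.
console.log(estimateTokensPerSec(7, 4, 51).toFixed(1));
```

This also shows why quantization recommendations matter: halving bits per weight roughly doubles the decoding ceiling on the same hardware.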
In Firefox, it reports the wrong RAM, core count, and GPU. In Chrome, it reports the wrong RAM and core count.
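A likely explanation (my guess, not confirmed by the author): browsers deliberately limit what hardware they expose, to resist fingerprinting. `navigator.deviceMemory` is Chromium-only and clamped to at most 8 GB; `navigator.hardwareConcurrency` reports logical threads and may be clamped by privacy settings; GPU names come from the WebGL `WEBGL_debug_renderer_info` extension, which Firefox restricts. A sketch using a hypothetical `detectSpecs` helper that takes a navigator-like object, so the limits are explicit and testable outside a browser:

```javascript
// Sketch of browser-side spec detection and why it's unreliable
// (hypothetical helper, not the site's actual code).
function detectSpecs(nav) {
  return {
    // Chromium-only, and deliberately clamped: rounded and capped at 8,
    // so a 32 GB machine reports 8; Firefox and Safari omit the field.
    memoryGB: nav.deviceMemory ?? null,
    // Widely supported, but it counts logical (SMT) threads, not cores,
    // and some browsers clamp it for fingerprinting resistance.
    logicalCores: nav.hardwareConcurrency ?? null,
  };
}

// Chrome on a 32 GB / 16-thread machine still reports 8 GB:
console.log(detectSpecs({ deviceMemory: 8, hardwareConcurrency: 16 }));
// Firefox exposes no deviceMemory, and may clamp the core count:
console.log(detectSpecs({ hardwareConcurrency: 2 }));
```

Given these caps, treating the auto-detected values as a lower bound and nudging users toward the manual-input mode might be the more robust design.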