Link parkin’: 50+ Open-Source Options for Running LLMs Locally
Vince Lam put together a comprehensive resource for running LLMs on your own hardware:
There are many open-source tools for hosting open-weight LLMs locally for inference, from command-line (CLI) tools to full GUI desktop applications. Here, I’ll outline some popular options and provide my own recommendations. I have split this post into the following sections:
- All-in-one desktop solutions for accessibility
- LLM inference via the CLI and backend API servers
- Front-end UIs for connecting to LLM backends
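To make the second and third categories concrete, here is a minimal sketch of a client talking to a local backend over an OpenAI-compatible HTTP API. It assumes a server such as llama.cpp’s llama-server or Ollama is already running on localhost and exposing `/v1/chat/completions`; the base URL, port, and model name below are placeholders you would adjust for your own setup.

```python
# Minimal sketch: query a local, OpenAI-compatible LLM backend over HTTP.
# Assumes a server (e.g. llama.cpp's llama-server or Ollama) is already
# running; BASE_URL and MODEL are placeholders, not defaults from any tool.
import requests

BASE_URL = "http://localhost:8080/v1"  # adjust to your backend's address/port
MODEL = "local-model"                  # whatever model name the server exposes

def chat(prompt: str) -> str:
    """Send a single-turn chat request and return the model's reply text."""
    resp = requests.post(
        f"{BASE_URL}/chat/completions",
        json={
            "model": MODEL,
            "messages": [{"role": "user", "content": prompt}],
            "temperature": 0.7,
        },
        timeout=120,
    )
    resp.raise_for_status()
    # OpenAI-compatible servers return choices -> message -> content
    return resp.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(chat("Summarise the trade-offs of running LLMs locally."))
```

The same request shape works against most of the backends covered in the post, which is exactly why the front-end UIs in the third section can swap backends so freely.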