Tag: local-llm
All the articles with the tag "local-llm".
50+ Open-Source Options for Running LLMs Locally
Published: There are many open-source tools for hosting open-weights LLMs locally for inference, from command-line (CLI) tools to full GUI desktop applications. Here, I'll outline some popular options and provide my own recommendations. I have split this post into the following sections.
Why I Use Open Weights LLMs Locally
Published: As someone who regularly uses Large Language Models (LLMs) personally and builds apps with them, the choice between self-hosted open-weights LLMs and proprietary LLMs is a recurring theme. Here, I share my personal insights on why I use locally hosted LLMs.