Run Open-source Large Language Models Locally Using Hugging Face Transformers

Explore LLMs like Mistral and Neural Chat on your own computer

Eric Kleppen
8 min read · Dec 11, 2023
My Computer (image by author)

Why Go Open-source When GPT-4 Exists?

Although they are closing the gap, open-source large language models (LLMs) can't yet match the power and accuracy of closed-source, commercially available models like GPT-4 and Bard (Gemini). Even so, there are several reasons why you might want to run open-source LLMs locally:

  • Local models are free to use on your own machine, while commercial models like GPT-4 are often pay-per-use APIs.
  • Local models work without an internet connection, while GPT-4 requires you to be connected.
  • Local models keep your data private, while GPT-4 can monitor your usage and use your data for training.
  • Local models can produce uncensored content, while closed-source models use guardrails to block certain kinds of output.
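
As a quick preview of what "running locally" looks like in practice, the sketch below loads an open-source model with the Hugging Face Transformers pipeline API. The specific checkpoint name is an illustrative assumption; the article walks through models like Mistral and Neural Chat in more detail later.

```python
# Minimal local text-generation sketch using Hugging Face Transformers.
# The checkpoint below is an assumed example; swap in any open-source
# model you have access to (e.g. a Mistral or Neural Chat variant).
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="mistralai/Mistral-7B-Instruct-v0.1",  # assumed example checkpoint
    device_map="auto",  # place weights on GPU if one is available
)

result = generator(
    "What are the benefits of running an LLM locally?",
    max_new_tokens=100,
)
print(result[0]["generated_text"])
```

Once the weights are downloaded the first time, subsequent runs read them from the local cache, so no internet connection is needed.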

The censorship on closed-source LLMs has become a hotly discussed topic. For example, if you ask ChatGPT, "What are some deadly poisons?" it will respond that it cannot provide information on harmful activities. When I ask my locally running LLM…
