
Written By
Chris Weaver
Published On
Aug 22, 2024
Empower Your Data: Local Document Chat with Llama 3.1
Many companies cannot share their internal data, documents, and conversations with an external third party such as OpenAI. To benefit from AI without giving up that data, these companies can turn to self-hosting open source models. In this post we’ll share how you can connect a state-of-the-art LLM to your own knowledge sources and documents and run everything locally, without ever connecting to the internet.
Llama 3.1 405B is the first openly available model that rivals the top AI models in general knowledge, steerability, math, tool use, and multilingual translation. At the time of writing, the 405B model is ranked fifth on the LMSYS Chatbot Arena leaderboard. The upgraded 8B and 70B models are also multilingual, with a significantly longer 128K-token context length, state-of-the-art tool use, and overall stronger reasoning capabilities.
With the increased context length, we can make better use of these open source models by adding our private knowledge sources. To set up the Llama model locally and connect it to your company knowledge base, follow these steps:
Step 1: Install and Configure Ollama
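Ollama can be installed with its official script on Linux (macOS and Windows installers are available from ollama.com), and a single command pulls the Llama 3.1 weights. A sketch of the setup, using Ollama's CLI as it works at the time of writing (pick the model tag that fits your hardware):

```shell
# Install Ollama (Linux; macOS/Windows installers are on ollama.com)
curl -fsSL https://ollama.com/install.sh | sh

# Pull Llama 3.1 -- the default tag is the 8B model; use
# llama3.1:70b or llama3.1:405b if your hardware allows
ollama pull llama3.1

# Sanity check: chat with the model directly in the terminal
ollama run llama3.1

# Ollama now serves a local HTTP API on port 11434 that other
# tools (like Danswer) can call -- nothing leaves your machine
```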
Step 2: Configure Danswer to Use Ollama
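Danswer can then be pointed at the local Ollama server instead of a hosted API. The exact configuration depends on your Danswer version (newer releases let you add a custom LLM provider in the admin UI); the environment variables below are a sketch of the env-file approach, so treat the variable names as assumptions and verify them against the Danswer docs for your release:

```shell
# Sketch: point Danswer's generative-AI calls at local Ollama.
# Variable names are assumptions -- check your Danswer version's docs.
GEN_AI_MODEL_PROVIDER=ollama
GEN_AI_MODEL_VERSION=llama3.1
# From inside Danswer's Docker containers, "localhost" is the container
# itself, so the Ollama host is reached via host.docker.internal
GEN_AI_API_ENDPOINT=http://host.docker.internal:11434
```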


Step 3: Integrate Your Knowledge Sources
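Connectors are configured in Danswer's web UI (file upload, Google Drive, Slack, and so on), and Danswer handles chunking and indexing of the connected documents for you. As a rough illustration of what ingestion does to a document, a fixed-size chunker with overlap might look like this (the function and its parameters are illustrative, not Danswer's actual code):

```python
def chunk_text(text: str, chunk_size: int = 512, overlap: int = 64) -> list[str]:
    """Split a document into overlapping fixed-size chunks.

    The overlap keeps sentences that straddle a chunk boundary
    retrievable from either neighboring chunk. Illustrative only --
    Danswer's real chunker is token- and structure-aware.
    """
    if chunk_size <= overlap:
        raise ValueError("chunk_size must exceed overlap")
    chunks = []
    step = chunk_size - overlap
    for start in range(0, len(text), step):
        chunks.append(text[start:start + chunk_size])
        if start + chunk_size >= len(text):
            break
    return chunks
```

Each chunk is then embedded and stored in the index so that a question can later be matched against the most relevant passages.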

Step 4: Chat with Your Docs
With everything set up, you can now query your company data with Llama 3.1 while keeping all of it local!
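Under the hood, a chat over your documents boils down to retrieving the most relevant chunks and placing them in the model's (now 128K-token) context. A toy keyword-overlap retriever sketches the idea; Danswer uses proper embedding-based search, and the names here are illustrative:

```python
def retrieve(query: str, chunks: list[str], top_k: int = 2) -> list[str]:
    """Rank chunks by naive keyword overlap with the query -- a toy
    stand-in for Danswer's embedding-based retrieval."""
    q_words = set(query.lower().split())
    scored = sorted(
        chunks,
        key=lambda c: len(q_words & set(c.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

def build_prompt(query: str, context_chunks: list[str]) -> str:
    """Assemble a RAG-style prompt that grounds the model in local docs."""
    context = "\n---\n".join(context_chunks)
    return (
        "Answer using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {query}\nAnswer:"
    )

# The finished prompt is then sent to the local Ollama API
# (POST http://localhost:11434/api/chat with model "llama3.1"),
# so the documents and the question never leave your machine.
```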

If you need any help setting up something similar, feel free to join our Slack.