This project shows how to use a ChromaDB vector store in a LangChain pipeline to build a chat-with-a-PDF application. You can load a PDF document and ask questions about it with an LLM, without any fine-tuning.
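Conceptually, the app boils down to loading a PDF, embedding its pages into ChromaDB, and letting an LLM answer over that index. The snippet below is a minimal sketch of that flow, not the repository's exact code; it assumes the classic (pre-0.1) LangChain import paths, and `example.pdf` and the collection name are placeholders.

```python
# Minimal sketch: index a PDF into a Chroma vector store so an LLM can answer
# questions about it without fine-tuning. Assumes classic LangChain import paths;
# newer releases move these modules into langchain_community / langchain_openai.
import os

from langchain.embeddings import OpenAIEmbeddings
from langchain.document_loaders import PyPDFLoader
from langchain.vectorstores import Chroma

os.environ["OPENAI_API_KEY"] = "sk-..."  # placeholder; use your own key

# Split the PDF into page-level documents and embed them into a ChromaDB store.
pages = PyPDFLoader("example.pdf").load_and_split()  # "example.pdf" is a stand-in
store = Chroma.from_documents(pages, OpenAIEmbeddings(), collection_name="pdf_chat")
```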
- Create a virtual environment: `python -m venv langchainenv`
- Activate it:
  - Windows: `.\langchainenv\Scripts\activate`
  - Mac: `source langchainenv/bin/activate`
- Clone this repo: `git clone https://github.com/nicknochnack/LangchainDocuments`
- Go into the directory: `cd LangchainDocuments`
- Install the required dependencies: `pip install -r requirements.txt`
- Add your OpenAI API key to line 52 of `app.py`
- Start the app: `streamlit run app.py`
- Load the PDF you would like to ask questions about
- Ask questions and get the answers
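Once the steps above are done, answering a question amounts to retrieving the most relevant indexed pages and handing them to the LLM. Continuing the sketch from the introduction (again an illustration under the same assumptions, not the exact code in `app.py`):

```python
from langchain.llms import OpenAI
from langchain.chains import RetrievalQA

# Wrap the Chroma store from the earlier sketch as a retriever, so the most
# relevant pages are pulled out and "stuffed" into the prompt for the LLM.
qa = RetrievalQA.from_chain_type(
    llm=OpenAI(temperature=0),
    chain_type="stuff",
    retriever=store.as_retriever(),
)
print(qa.run("What is this document about?"))  # example question
```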
