# Study Buddy

Study Buddy is an interactive AI-powered study assistant that helps users engage with their study materials through a chat interface. The application uses LangChain and Ollama to provide intelligent responses based on uploaded PDF documents.
## Features

- PDF document processing and analysis
- Interactive chat interface
- Support for multiple LLM models (llama2, mistral)
- Adjustable AI temperature settings
- Conversation memory
- Document embedding and semantic search
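Document embedding and semantic search work by mapping text chunks to vectors and retrieving the chunks closest to the question. A minimal pure-Python sketch of the idea behind the vector-store lookup (toy bag-of-words vectors stand in for real model embeddings; this is an illustration, not the app's actual code):

```python
import math
import re
from collections import Counter

def embed(text: str) -> Counter:
    # Toy "embedding": bag-of-words term counts.
    # Real apps use dense vectors from an embedding model.
    return Counter(re.findall(r"[a-z']+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def semantic_search(query: str, chunks: list[str]) -> str:
    # Return the chunk most similar to the query, like a top-1 vector-store hit.
    q = embed(query)
    return max(chunks, key=lambda c: cosine(q, embed(c)))

chunks = [
    "Photosynthesis converts light energy into chemical energy.",
    "The French Revolution began in 1789.",
    "Newton's second law relates force, mass, and acceleration.",
]
print(semantic_search("what is photosynthesis", chunks))
```

FAISS does the same nearest-neighbor lookup, but over dense embedding vectors and with indexes built for large document sets.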
## Prerequisites

- Python 3.x
- Ollama installed and running on your system
## Installation

Clone the repository and install the required dependencies:

```bash
pip install streamlit langchain faiss-cpu pypdf python-magic-bin llama-index
pip install langchain-community
pip install "unstructured[pdf]" python-magic
```

Note: `python-magic-bin` is only needed on Windows; on Linux and macOS, `python-magic` is sufficient.
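If you prefer pinning dependencies in `requirements.txt` (listed in the project structure below), the same packages from the commands above would go there; exact version pins are up to you:

```
streamlit
langchain
langchain-community
faiss-cpu
pypdf
python-magic-bin
llama-index
unstructured[pdf]
python-magic
```

Then install everything in one step with `pip install -r requirements.txt`.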
## Usage

1. Start the application:

   ```bash
   streamlit run app.py
   ```

2. Configure the application:
   - Enter the directory path containing your PDF study materials
   - Select your preferred LLM model (llama2 or mistral)
   - Adjust the temperature setting for AI responses
   - Click "Initialize Study Buddy" to process your documents

3. Start chatting with Study Buddy about your study materials.
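Conversation memory means prior turns are carried into each new exchange, so follow-up questions make sense. A minimal sketch of that loop (the `answer` function is a hypothetical stand-in for the real LLM call, not the app's actual code):

```python
def answer(question: str, history: list[tuple[str, str]]) -> str:
    # Hypothetical stand-in for the LLM call: a real app would prepend
    # the remembered turns to the prompt before querying the model.
    return f"(answer using {len(history)} prior turns) {question}"

history: list[tuple[str, str]] = []  # the conversation memory

for question in ["What is osmosis?", "Give me an example."]:
    reply = answer(question, history)
    history.append((question, reply))  # remember this turn for next time
    print(reply)
```

In the real application, this bookkeeping is handled by LangChain's conversation memory rather than a hand-rolled list.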
## Configuration

The application allows you to configure:

- Document directory path
- LLM model selection (llama2 or mistral)
- AI temperature (0.0 to 1.0)
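Temperature rescales the model's token probabilities before sampling: low values concentrate probability on the most likely token (more deterministic answers), while high values flatten the distribution (more varied answers). A small illustration using a softmax over toy scores (conceptual only, not the app's code):

```python
import math

def softmax_with_temperature(scores: list[float], temperature: float) -> list[float]:
    # Divide each score by the temperature, then normalize with softmax.
    exp = [math.exp(s / temperature) for s in scores]
    total = sum(exp)
    return [e / total for e in exp]

scores = [2.0, 1.0, 0.5]  # toy "logits" for three candidate tokens

cold = softmax_with_temperature(scores, 0.2)  # near-deterministic
hot = softmax_with_temperature(scores, 1.0)   # more varied

print([round(p, 3) for p in cold])
print([round(p, 3) for p in hot])
```

A temperature of exactly 0 is conventionally treated as greedy decoding (always pick the top token) rather than a literal division by zero.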
## Technologies Used

- Streamlit: web interface
- LangChain: LLM application framework
- FAISS: vector store for document embeddings
- Ollama: local LLM provider
- DirectoryLoader: PDF document processing
## Project Structure

```
study-buddy/
├── app.py            # Main application file
├── requirements.txt  # Project dependencies
└── README.md         # Project documentation
```
## Contributing

Contributions are welcome! Please feel free to submit a Pull Request.