A comprehensive guide and collection of examples for building production-ready Retrieval-Augmented Generation (RAG) systems using various open-source tools. This repository demonstrates different approaches to implementing RAG pipelines, from local LLM deployment to vector stores and evaluation frameworks.
- Multiple local LLM deployment options
- Vector store implementations and examples
- RAG evaluation frameworks and metrics
- Production-ready examples
- Comprehensive documentation for each component
- Ollama - Easy-to-use tool for running LLMs locally
- LocalAI - OpenAI-compatible API for local model deployment
- LM Studio - Desktop application with a user-friendly interface for running local models
- vLLM - High-performance inference engine with PagedAttention
- Milvus Demo - E-commerce semantic search implementation
- OpenLit - OpenTelemetry-native observability and monitoring for LLM applications
- Basic RAG - Simple RAG implementation example
- Resume Screener - Practical RAG application for resume analysis
- RAG Evaluator - Tools and metrics for RAG evaluation
- DeepEval Demo - Comprehensive RAG evaluation using DeepEval
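The RAG examples above all share the same core loop: score documents against a query, retrieve the best matches, and hand them to an LLM as context. A minimal, dependency-free sketch of that loop (word-overlap scoring here stands in for the vector-embedding similarity the actual examples use; all function names are illustrative):

```python
import re

def tokenize(text):
    # Lowercase word tokens; punctuation is stripped
    return set(re.findall(r"\w+", text.lower()))

def score(query, doc):
    # Jaccard word overlap -- a stand-in for embedding cosine similarity
    q, d = tokenize(query), tokenize(doc)
    return len(q & d) / len(q | d) if q | d else 0.0

def retrieve(query, docs, k=2):
    # Return the k highest-scoring documents for the query
    return sorted(docs, key=lambda d: score(query, d), reverse=True)[:k]

def build_prompt(query, docs, k=2):
    # Assemble retrieved context plus the question into a single LLM prompt
    context = "\n".join(f"- {d}" for d in retrieve(query, docs, k))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

docs = [
    "Milvus is a vector database built for semantic search.",
    "Ollama runs large language models on your own machine.",
    "RAG combines retrieval with text generation.",
]
print(build_prompt("How do I run models on my machine?", docs))
```

In a real pipeline, `retrieve` would query a vector store such as Milvus and the prompt would be sent to a local model via Ollama, LocalAI, or vLLM.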
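On the evaluation side, most retrieval metrics reduce to comparing retrieved chunks against a known-relevant set. A small sketch of precision@k and recall@k (chunk IDs and function names are illustrative; DeepEval and the RAG Evaluator add richer, LLM-judged metrics on top of basics like these):

```python
def precision_at_k(retrieved, relevant, k):
    # Fraction of the top-k retrieved chunks that are actually relevant
    top_k = retrieved[:k]
    return sum(1 for doc in top_k if doc in relevant) / k

def recall_at_k(retrieved, relevant, k):
    # Fraction of all relevant chunks that appear in the top-k
    if not relevant:
        return 0.0
    top_k = retrieved[:k]
    return sum(1 for doc in top_k if doc in relevant) / len(relevant)

# Toy example: ranked retrieval output vs. a labeled ground-truth set
retrieved = ["chunk_a", "chunk_b", "chunk_c", "chunk_d"]
relevant = {"chunk_a", "chunk_c", "chunk_e"}
print(precision_at_k(retrieved, relevant, 3))  # 2 of the top 3 are relevant
print(recall_at_k(retrieved, relevant, 3))     # 2 of the 3 relevant chunks found
```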
Each component has its own setup instructions in its respective directory. Generally, you'll need:
- Python 3.8+
- Conda (recommended) or pip
- GPU (optional, but recommended for better performance)
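With those prerequisites in place, a typical setup uses an isolated Conda environment; a sketch (the environment name and Python version are illustrative, and each component's own README lists its exact dependencies):

```shell
# Create and activate an isolated environment (name is illustrative)
conda create -n rag-guide python=3.10 -y
conda activate rag-guide

# Then install a component's dependencies from its directory, e.g.:
# pip install -r requirements.txt
```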
- Clone the repository:

  ```bash
  git clone https://github.com/yourusername/RAG-Agnostic-Guide.git
  cd RAG-Agnostic-Guide
  ```
- Choose a component and follow its specific setup instructions in the respective README.
Each component includes detailed documentation covering:
- Setup instructions
- Usage examples
- API references
- Performance considerations
- Best practices
Contributions are welcome! Please:
- Fork the repository
- Create a feature branch
- Commit your changes
- Push to the branch
- Create a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.
Special thanks to all the open-source projects and their maintainers that make this guide possible:
- Ollama team
- LocalAI community
- LMStudio developers
- vLLM contributors
- Milvus community
- And many others!
For detailed information about specific components, please refer to their respective directories.