JiraGen is a CLI tool that automates the creation of JIRA issues using Large Language Models (LLMs). It integrates with the JIRA API and your local codebase, accelerating issue creation and letting developers focus on other aspects of their projects. Full documentation is available here.
- Key Features
- Quick Start
- Usage Examples
- Configuration Options
- Template Customization
- Contributing
- License
## Key Features

- 🧠 Local LLM Integration: Leverages Ollama (via LiteLLM) for local text generation
- 🔍 Context-Aware Issues: Smart codebase analysis with vector store integration
- 🎯 Gitignore Support: Respects .gitignore patterns when indexing codebase
- ✨ Customizable Templates: Flexible issue templates for different needs
- 🔧 Smart Metadata Extraction: Automatic extraction of issue type, priority, and labels
- ⚙️ Interactive Workflow: Review and modify content before uploading
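The metadata-extraction feature can be illustrated with a minimal sketch. The line-oriented `Type:`/`Priority:`/`Labels:` output format and the `extract_metadata` helper below are assumptions for illustration, not jiragen's actual parsing logic:

```python
import re

def extract_metadata(issue_text: str) -> dict:
    """Pull issue type, priority, and labels from generated issue text.

    Assumes the LLM emits lines like 'Type: Bug', 'Priority: High', and
    'Labels: auth, backend' -- a hypothetical output format, not
    necessarily what jiragen's generator produces.
    """
    patterns = {
        "type": r"^Type:\s*(.+)$",
        "priority": r"^Priority:\s*(.+)$",
        "labels": r"^Labels:\s*(.+)$",
    }
    metadata = {}
    for key, pattern in patterns.items():
        match = re.search(pattern, issue_text, re.MULTILINE | re.IGNORECASE)
        if match:
            value = match.group(1).strip()
            # Labels are comma-separated; split them into a list
            metadata[key] = ([v.strip() for v in value.split(",")]
                             if key == "labels" else value)
    return metadata

issue = """Type: Bug
Priority: High
Labels: auth, backend

Users are logged out after 5 minutes."""
print(extract_metadata(issue))
# {'type': 'Bug', 'priority': 'High', 'labels': ['auth', 'backend']}
```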
## Quick Start

Install JiraGen and its dependencies:

```bash
pip install jiragen
```

Install and run Ollama to use a local LLM:

```bash
curl https://ollama.ai/install.sh | sh
ollama pull phi4  # Replace with your preferred model
```

Alternatively, export your OpenAI API key to use a hosted model:

```bash
export OPENAI_API_KEY="YOUR_API_KEY"
```
```bash
# Initialize configuration
jiragen init

# Index your codebase (respects .gitignore)
jiragen add .

# Generate your first issue
jiragen generate "Implement user authentication"
```
## Usage Examples

```bash
# Add all files (respects .gitignore)
jiragen add .

# Add specific files or directories
jiragen add src/main.py

# Remove files
jiragen rm src/deprecated/
```
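Gitignore-aware indexing can be sketched with a simplified matcher. This sketch uses only the stdlib `fnmatch` module and ignores full gitignore semantics (negation with `!`, anchoring, `**`); it is an illustration of the idea, not jiragen's implementation:

```python
import fnmatch

def is_ignored(path: str, ignore_patterns: list[str]) -> bool:
    """Simplified .gitignore-style check: a path is skipped if it or any
    of its components matches a pattern. Real gitignore semantics are
    richer; this is only a sketch."""
    parts = path.split("/")
    for pattern in ignore_patterns:
        pattern = pattern.rstrip("/")  # treat 'dir/' like 'dir'
        # Match against the full path and against each path component
        if fnmatch.fnmatch(path, pattern):
            return True
        if any(fnmatch.fnmatch(part, pattern) for part in parts):
            return True
    return False

patterns = ["*.pyc", "node_modules/", ".env"]
print(is_ignored("src/app.pyc", patterns))                   # True
print(is_ignored("node_modules/lodash/index.js", patterns))  # True
print(is_ignored("src/main.py", patterns))                   # False
```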
```bash
# Basic generation
jiragen generate "Add dark mode support"

# With custom template and model
jiragen generate "API rate limiting" \
  --template templates/feature.md \
  --model ollama/codellama

# Generate and upload to JIRA
jiragen generate "Fix memory leak" --upload --yes

# Interactive editing
jiragen generate "OAuth integration" --editor
```
```bash
# View indexed files
jiragen status
jiragen status --compact
jiragen status --depth 2

# Fetch JIRA data
jiragen fetch --types epics tickets

# Restart vector store
jiragen restart
```
## Configuration Options

JiraGen can be configured through:

- Command-line arguments
- A configuration file (`~/.jiragen/config.ini`)
- Environment variables
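For the configuration-file route, reading such a file with Python's standard `configparser` might look like the sketch below; the section and key names shown are assumptions, since the actual schema of `~/.jiragen/config.ini` is not documented here:

```python
import configparser
from pathlib import Path

# Hypothetical config layout; the real section and key names in
# ~/.jiragen/config.ini may differ.
config_text = """
[llm]
model = ollama/phi4
api_base = http://localhost:11434

[jira]
url = https://yourcompany.atlassian.net
"""

config = configparser.ConfigParser()
config.read_string(config_text)
# In practice you would read the file itself:
# config.read(Path.home() / ".jiragen" / "config.ini")

print(config["llm"]["model"])  # ollama/phi4
```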
```python
# Python API configuration
LLMConfig(
    model="llama2",                     # Ollama model to use
    api_base="http://localhost:11434",  # Ollama endpoint
    max_tokens=2000,
    temperature=0.7,
    top_p=0.95,
)
```
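The `LLMConfig(...)` call suggests a configuration object along these lines; the dataclass below is a hypothetical reconstruction for illustration, not jiragen's actual class definition:

```python
from dataclasses import dataclass

@dataclass
class LLMConfig:
    """Hypothetical definition matching the constructor call shown in the
    configuration example; jiragen's actual class may differ."""
    model: str = "llama2"
    api_base: str = "http://localhost:11434"
    max_tokens: int = 2000
    temperature: float = 0.7
    top_p: float = 0.95

cfg = LLMConfig(model="ollama/codellama", temperature=0.2)
print(cfg.model, cfg.max_tokens)  # ollama/codellama 2000
```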
## Template Customization

Create templates to match your organization's needs:

```markdown
# {title}

## Description
{description}

## Acceptance Criteria
{acceptance_criteria}

## Technical Implementation
{implementation_details}

## Testing Strategy
- Unit Tests
- Integration Tests
- E2E Tests
```
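The `{placeholder}` syntax resembles Python-style string substitution. A minimal sketch of filling such a template, assuming simple `str.format` substitution (jiragen's actual rendering may work differently):

```python
# A trimmed-down version of the template above
template = """# {title}

## Description
{description}

## Acceptance Criteria
{acceptance_criteria}
"""

rendered = template.format(
    title="Add dark mode support",
    description="Allow users to switch between light and dark themes.",
    acceptance_criteria="- Theme toggle in settings\n- Preference persists",
)
print(rendered)
```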
## Contributing

We ❤️ contributions! To contribute:

1. Fork the repository
2. Create a feature branch
3. Submit a pull request with a detailed description
For more details, refer to our CONTRIBUTING.md.
## License

JiraGen is released under the MIT License.