✨ Kubectl plugin to create manifests with LLMs (Go, updated Oct 14, 2024)
The easiest way to use the Ollama API in .NET
🏗️ Fine-tune, build, and deploy open-source LLMs easily!
[NeurIPS 2024] KVQuant: Towards 10 Million Context Length LLM Inference with KV Cache Quantization
Social and customizable AI writing assistant! ✍️
AubAI brings you on-device gen-AI capabilities, including offline text generation and more, directly within your app.
Like grep but for natural language questions. Based on Mistral 7B or Mixtral 8x7B.
Local AI search assistant (web or CLI) for ollama and llama.cpp. Lightweight and easy to run, providing a Perplexity-like experience.
Full featured demo application for OllamaSharp
Thunderbird mail client extension that summarizes received emails via a locally run LLM. Early development.
Copilot hack for running a local Copilot without auth, with proxying
Use your open-source local model from the terminal
Run GGUF LLM models in the latest version of TextGen-webui
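Several of the projects above (OllamaSharp, the local search assistants) wrap Ollama's HTTP API. As context, here is a minimal sketch of calling Ollama's documented `/api/generate` endpoint directly, assuming a server running on the default port 11434; the helper names are illustrative, not from any of these repos:

```python
import json
import urllib.request

# Ollama's default local endpoint (assumption: default install, port 11434)
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(model: str, prompt: str) -> bytes:
    """Serialize a non-streaming generate request body for /api/generate."""
    return json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode("utf-8")

def generate(model: str, prompt: str) -> str:
    """POST the request to a locally running Ollama server and return the text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_generate_request(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Requires a local Ollama server with the model pulled
    print(generate("llama3", "Why is the sky blue?"))
```

Setting `"stream": False` returns one JSON object instead of a stream of partial responses, which keeps the client code simple.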