Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
Updated Apr 26, 2025 - Python
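The common thread across these projects is a single OpenAI-format request shape fronting many backends. As a minimal sketch of that shape (the `build_chat_request` helper is hypothetical, not part of any listed SDK; real gateways accept this body on an OpenAI-compatible endpoint):

```python
def build_chat_request(model: str, user_prompt: str, temperature: float = 0.7) -> dict:
    """Build an OpenAI chat-completions style request body.

    The model string typically carries a provider prefix
    (e.g. "anthropic/claude-3-haiku") that a gateway uses to pick a backend.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_prompt}],
        "temperature": temperature,
    }

req = build_chat_request("anthropic/claude-3-haiku", "Hello!")
```

Because every provider is reached through this one shape, swapping Bedrock for Groq is a change to the `model` string rather than to application code.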
The AI-native proxy server for agents. Arch handles the pesky heavy lifting in building agentic apps - routing prompts to agents or specific tools, clarifying input, unifying access and observability to any LLM - so you can build smarter and ship faster.
🦄 Cloud-native, ultra-high-performance AI & API gateway: LLM API management, distribution system, and open platform. Supports all AI APIs, not limited to OpenAI, Azure, Anthropic Claude, Google Gemini, DeepSeek, ByteDance Doubao, ChatGLM, Baidu ERNIE Bot, iFlytek Spark, Alibaba Tongyi Qianwen, 360 Zhinao, Tencent Hunyuan, and other mainstream models. Unifies API requests and responses, with API application and approval workflows, usage statistics, load balancing, and multi-model failover. One-click deployment, works out of the box.
Govern, secure, and optimize your AI traffic. AI Gateway provides a unified interface to all LLMs using the OpenAI API format, with a focus on performance and reliability. Built in Rust.
Modular, open source LLMOps stack that separates concerns: LiteLLM unifies LLM APIs, manages routing and cost controls, and ensures high-availability, while Langfuse focuses on detailed observability, prompt versioning, and performance evaluations.
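The "high-availability" claim above usually means ordered fallback across backends. A toy illustration of that loop (the `complete_with_fallback` and `fake_call` names are hypothetical, not LiteLLM's actual API):

```python
def complete_with_fallback(prompt: str, backends: list[str], call) -> tuple[str, str]:
    """Try each backend in order, returning (backend, response) from the
    first one that succeeds. This is the core of high-availability routing."""
    last_err: Exception | None = None
    for name in backends:
        try:
            return name, call(name, prompt)
        except Exception as err:  # a real gateway would only retry on transient errors
            last_err = err
    raise RuntimeError("all backends failed") from last_err

# Toy backend function: the primary is "down", the secondary answers.
def fake_call(backend: str, prompt: str) -> str:
    if backend == "primary":
        raise ConnectionError("unreachable")
    return f"{backend} says: {prompt}"

used, reply = complete_with_fallback("ping", ["primary", "secondary"], fake_call)
```

An observability layer like Langfuse would record both the failed primary attempt and the successful secondary call, which is why the two concerns pair well.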
The reliability layer between your code and LLM providers.
A robust, configurable LLM proxy server built with Node.js, Express, and PostgreSQL. It acts as an intermediary between your applications and various Large Language Model (LLM) providers.
Connect, set up, secure, and seamlessly manage LLM models using a universal, OpenAI-compatible API
Bella OpenAPI is an API gateway offering a rich set of AI capabilities, comparable to OpenRouter. Unlike OpenRouter, beyond chat completion it also provides text embedding, speech recognition (ASR), speech synthesis (TTS), text-to-image, and image-to-image capabilities, with integrated billing, rate limiting, and resource management. All integrated capabilities have been validated in large-scale production environments.
Burgonet Gateway is an enterprise LLM gateway that provides secure access and compliance controls for AI systems
Lightweight AI inference gateway - local model registry & parameter transformer (Python SDK) - with optional Envoy proxy processor and FastAPI registry server deployment options.
A lightweight proxy for LLM API calls with guardrails, metrics, and monitoring. A vibe coding experiment.
A unified API gateway for multiple LLM providers with OpenAI-compatible endpoints
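Under the hood, "OpenAI-compatible endpoints for multiple providers" typically reduces to routing on a provider prefix in the model name. A sketch of that dispatch (the `PROVIDERS` table and `route_model` helper are illustrative, not taken from any listed project):

```python
# Hypothetical provider table; real gateways load this from configuration.
PROVIDERS = {
    "openai": "https://api.openai.com/v1",
    "anthropic": "https://api.anthropic.com/v1",
    "groq": "https://api.groq.com/openai/v1",
}

def route_model(model: str, default: str = "openai") -> tuple[str, str]:
    """Split a 'provider/model' string and return (base_url, bare model name).

    Unprefixed models fall through to the default provider.
    """
    provider, _, name = model.partition("/")
    if name and provider in PROVIDERS:
        return PROVIDERS[provider], name
    return PROVIDERS[default], model
```

The gateway then rewrites the request body for the chosen backend and forwards it, so clients only ever see the OpenAI-format surface.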
LLMSecOps focuses on integrating security practices within the lifecycle of machine learning models. It ensures that models are robust against threats while maintaining compliance and performance standards.