Open-WebUI-Functions

Open-WebUI-Functions is a collection of Python-based functions designed to extend the capabilities of Open WebUI with additional pipelines, filters, and integrations. These functions allow users to interact with various AI models, process data efficiently, and customize the Open WebUI experience.


Features

  • Custom Pipelines: Extend Open WebUI with AI processing pipelines, including model inference and data transformations (a minimal function skeleton is sketched below).
  • Filters for Data Processing: Apply custom filtering logic to refine, manipulate, or preprocess input and output data.
  • Azure AI Support: Seamlessly connect Open WebUI with Azure OpenAI and other Azure AI models.
  • N8N Workflow Integration: Enable interactions with N8N for automation.
  • Flexible Configuration: Use environment variables to adjust function settings dynamically.
  • Streaming and Non-Streaming Support: Handle both real-time and batch processing efficiently.
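To give a feel for what such a function looks like, here is a minimal sketch of the typical shape of an Open WebUI pipe: a Pipe class with configurable Valves (settings that can take their defaults from environment variables) and a pipe method that receives the chat request. It is an illustrative outline, not the code of any function in this repository; the EXAMPLE_* names are placeholders.

import os
from pydantic import BaseModel, Field

class Pipe:
    class Valves(BaseModel):
        # Settings exposed in the Open WebUI admin panel; defaults can come from environment variables.
        EXAMPLE_API_KEY: str = Field(default=os.getenv("EXAMPLE_API_KEY", ""))
        EXAMPLE_ENDPOINT: str = Field(default=os.getenv("EXAMPLE_ENDPOINT", ""))

    def __init__(self):
        self.valves = self.Valves()

    def pipe(self, body: dict) -> str:
        # "body" carries the chat request (messages, temperature, stream flag, ...).
        # A real pipeline would forward it to an external service and return the reply.
        last_message = body.get("messages", [{}])[-1].get("content", "")
        return f"Echo: {last_message}"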

Prerequisites

To use these functions, ensure the following:

  1. An Active Open WebUI Instance: You must have Open WebUI installed and running.
  2. Required AI Services (if applicable): Some pipelines require external AI services, such as Azure AI.
  3. Admin Access: To install functions in Open WebUI, you must have administrator privileges.

Installation

To install and configure functions in Open WebUI, follow these steps:

  1. Ensure Admin Access: You must be an admin in Open WebUI to install functions.
  2. Access Admin Settings: Navigate to the Admin Settings section in Open WebUI.
  3. Go to the Functions Tab: Open the Functions tab in the admin panel.
  4. Create a New Function: Click Add New Function, then copy the function code from this repository and paste it into the function editor.
  5. Set Environment Variables (if required): Some functions require API keys or specific configurations via environment variables.
  6. Save and Activate: Save the function; it will then be available for use within Open WebUI.

Pipelines

Pipelines are processing functions that extend Open WebUI with custom AI models, external integrations, and data manipulation logic.

Azure AI Foundry Pipeline

  • Enables interaction with Azure OpenAI and other Azure AI models.
  • Supports dynamic model selection via the x-ms-model-mesh-model-name header (illustrated in the sketch below).
  • Filters valid parameters to ensure clean requests.
  • Handles both streaming and non-streaming responses.
  • Provides configurable error handling and timeouts.

🔗 Azure AI Pipeline in Open WebUI
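To make the parameter filtering and dynamic model selection concrete, the following is a simplified sketch of how such a request could be assembled. The allowed-parameter set, header names, and function name are illustrative assumptions rather than the pipeline's exact implementation.

import requests

# Illustrative whitelist of request parameters; the real pipeline may accept more or fewer.
ALLOWED_PARAMS = {"messages", "temperature", "top_p", "max_tokens", "stream", "stop"}

def call_azure_ai(endpoint: str, api_key: str, body: dict, model: str | None = None) -> dict:
    # Drop anything the upstream API would reject.
    payload = {key: value for key, value in body.items() if key in ALLOWED_PARAMS}
    headers = {"api-key": api_key, "Content-Type": "application/json"}
    if model:
        # Dynamic model selection via the x-ms-model-mesh-model-name header.
        headers["x-ms-model-mesh-model-name"] = model
    response = requests.post(endpoint, json=payload, headers=headers, timeout=300)
    response.raise_for_status()
    return response.json()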

Azure AI Pipeline for DeepSeek-R1

  • A specialized version of the Azure AI Foundry Pipeline for DeepSeek-R1.
  • Uses Azure’s DeepSeek-R1 AI model for advanced text processing.
  • Includes the same error handling, parameter filtering, and request management as the standard Azure AI Foundry Pipeline.

🔗 Azure AI Pipeline for DeepSeek-R1 in Open WebUI

N8N Pipeline

  • Integrates Open WebUI with N8N, an automation and workflow platform (a simplified webhook sketch follows below).
  • Sends messages from Open WebUI to an N8N webhook.
  • Supports real-time message processing with dynamic field handling.
  • Enables automation of AI-generated responses within an N8N workflow.
  • Here is an example N8N workflow for the N8N Pipeline.

🔗 N8N Pipeline in Open WebUI

🔗 Learn More About N8N
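As a rough sketch of the webhook round trip (the field names "chatInput" and "output" and the bearer-token header are assumptions; the actual pipeline makes these configurable):

import requests

def send_to_n8n(webhook_url: str, message: str, token: str | None = None) -> str:
    # Post the latest user message to the N8N webhook.
    headers = {"Authorization": f"Bearer {token}"} if token else {}
    response = requests.post(webhook_url, json={"chatInput": message}, headers=headers, timeout=120)
    response.raise_for_status()
    # Assume the workflow answers with a JSON object containing an "output" field.
    return response.json().get("output", "")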


Filters

Filters allow for preprocessing and postprocessing of data within Open WebUI.

Time Token Tracker

  • Measures response time and token usage for AI interactions.
  • Supports tracking of total token usage and per-message token counts.
  • Can calculate token usage for all messages or only a subset.
  • Uses OpenAI's tiktoken library for accurate token counting (see the sketch below).

🔗 Time Token Tracker in Open WebUI
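Counting tokens with tiktoken works roughly as sketched below; the model name and the choice to count only message content are illustrative, not the filter's exact logic.

import tiktoken

def count_message_tokens(messages: list[dict], model: str = "gpt-4o") -> int:
    # Pick the encoding for the given model, falling back to a generic encoding if unknown.
    try:
        encoding = tiktoken.encoding_for_model(model)
    except KeyError:
        encoding = tiktoken.get_encoding("cl100k_base")
    return sum(len(encoding.encode(message.get("content") or "")) for message in messages)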


Azure AI Integration

The repository includes functions specifically designed for Azure AI, supporting both Azure OpenAI models and general Azure AI services.

Features:

  • Azure OpenAI API Support: Access models like GPT-4, GPT-3.5, and other fine-tuned AI models via Azure.
  • Azure AI Model Deployment: Connect to custom models hosted on Azure AI.
  • Dynamic Model Selection: Choose models via the x-ms-model-mesh-model-name header or environment variables.
  • Secure API Requests: Supports API key authentication and environment variable configurations.

Environment Variables:

For Azure AI-based functions, set the following:

AZURE_AI_API_KEY="your-api-key"
AZURE_AI_ENDPOINT="https://your-service.openai.azure.com/chat/completions?api-version=2024-05-01-preview"
AZURE_AI_MODEL="gpt-4o"  # Optional model name; only needed if the service is not Azure OpenAI or the model name is not part of the URL (e.g. "https://<your-endpoint>/openai/deployments/<model-name>/chat/completions").
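As a rough illustration of how a function might resolve these variables (the exact logic lives in the function code, so treat this as an assumption):

import os

endpoint = os.getenv("AZURE_AI_ENDPOINT", "")
api_key = os.getenv("AZURE_AI_API_KEY", "")
model = os.getenv("AZURE_AI_MODEL")  # optional

payload = {"messages": [{"role": "user", "content": "Hello"}]}
if model and "/deployments/" not in endpoint:
    # Only pass the model explicitly when it is not already encoded in the endpoint URL.
    payload["model"] = model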

For further details, check the Azure AI Function in Open WebUI.
