🎨 Chat Interface Integration

LLM Proxy Open WebUI Integration

Connect Open WebUI to your LLM proxy for a beautiful, feature-rich chat interface. Get a ChatGPT-like experience with full control over your AI providers, conversation history, and user management.

Integration Architecture
💬 Open WebUI → 🔀 LLM Proxy → 🤖 AI Providers

Open WebUI Features

Powerful chat interface capabilities powered by your proxy backend.

💬 Chat Interface

Beautiful, responsive chat UI with markdown support, code highlighting, and conversation management.

👤 User Management

Built-in authentication, role-based access control, and user-specific conversation histories.

📚 Document Support

Upload and chat with documents. PDF, Word, and text file support with RAG capabilities.

🔧 Model Selection

Switch between models on the fly and access every configured provider through a unified model-selection dropdown.

📝 Prompt Templates

Create and share custom prompt templates. Build reusable prompts for common tasks.

🔌 Function Calling

Enable tools and function calling through your proxy. Let AI execute actions in your systems.

Setup Guide

Get Open WebUI connected to your proxy in minutes.

1. Deploy Open WebUI with Docker

Run the Open WebUI container, pointing it at your proxy endpoint.

Docker Compose
version: '3.8'
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"
    environment:
      - OPENAI_API_BASE_URL=http://your-proxy:8080/v1
      - OPENAI_API_KEY=your-proxy-key
      - DATA_DIR=/app/backend/data
    volumes:
      - open-webui-data:/app/backend/data
    restart: unless-stopped

volumes:
  open-webui-data:
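
Before starting the container, it can help to confirm that the proxy's OpenAI-compatible endpoint is reachable with the key you plan to use. A minimal pre-flight sketch in Python; the host and key below are the placeholders from the compose file, not real values:

```python
# Pre-flight check for the endpoint Open WebUI will use. Host and key are
# placeholders matching the compose file above; substitute your real values.
import json
import urllib.request


def build_models_request(base_url: str, api_key: str) -> urllib.request.Request:
    """Build an authenticated request for the proxy's /v1/models endpoint."""
    return urllib.request.Request(
        f"{base_url}/v1/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )


def list_proxy_models(base_url: str, api_key: str) -> list[str]:
    """Return the model IDs an OpenAI-compatible proxy advertises."""
    with urllib.request.urlopen(build_models_request(base_url, api_key)) as resp:
        payload = json.load(resp)
    # OpenAI-style responses wrap the available models in a "data" list.
    return [model["id"] for model in payload.get("data", [])]
```

If `list_proxy_models("http://your-proxy:8080", "your-proxy-key")` returns the model IDs you expect, the same models should appear in the WebUI dropdown once the container is up.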
2. Configure Connection

Access the WebUI settings to configure your proxy connection.

Setting         Value                   Description
API Base URL    http://proxy:8080/v1    Your proxy endpoint
API Key         sk-proxy-xxx            Proxy authentication key
Model           openai/gpt-4            Model identifier format
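
The Model field uses a provider/model naming scheme, as in openai/gpt-4. As a rough sketch of how such identifiers decompose (the helper name is ours for illustration, not part of Open WebUI or any proxy API):

```python
def split_model_id(model_id: str) -> tuple[str, str]:
    """Split a proxy-style identifier like 'openai/gpt-4' into (provider, model)."""
    provider, sep, model = model_id.partition("/")
    if not sep:
        # No provider prefix: leave routing to the proxy's default provider.
        return ("", model_id)
    return (provider, model)
```

So openai/gpt-4 routes to the openai provider with model gpt-4, while a bare gpt-4 falls through to whatever default the proxy is configured with.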
3. Add Multiple Providers

Configure multiple API connections for different providers through your proxy.

Environment Variables
# Open WebUI supports multiple connections
environment:
  # Primary connection through proxy
  - OPENAI_API_BASE_URL=http://proxy:8080/v1
  - OPENAI_API_KEY=sk-proxy-key
  
  # Enable all features
  - ENABLE_RAG_WEB_SEARCH=true
  - ENABLE_IMAGE_GENERATION=true
  - ENABLE_OLLAMA_API=false
  
  # Security
  - WEBUI_SECRET_KEY=your-secret-key
  - ENABLE_SIGNUP=true
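
A quick sanity check before deploying is to verify the critical variables are actually set. A minimal sketch, assuming the variable names from the example above (names beyond the OPENAI_* pair may differ across Open WebUI versions):

```python
import os

# Variables the example environment above sets; adjust to your deployment.
REQUIRED_VARS = ["OPENAI_API_BASE_URL", "OPENAI_API_KEY", "WEBUI_SECRET_KEY"]


def missing_vars(env: dict[str, str]) -> list[str]:
    """Return the required variables that are unset or empty in the mapping."""
    return [name for name in REQUIRED_VARS if not env.get(name)]
```

Calling `missing_vars(dict(os.environ))` in your deploy script and failing fast on a non-empty result avoids starting a WebUI that silently cannot reach the proxy.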
50+ UI Features · 100% Open Source · 5min Setup Time · Docker Deploy Method

Advanced Configuration

Customize Open WebUI for your specific requirements.

🔐 Authentication

Configure OAuth, LDAP, or built-in authentication. Integrate with your existing identity provider.

🎨 Custom Branding

White-label the interface with your logo, colors, and custom domain. Create a seamless experience.

📊 Analytics

Track usage, costs, and user activity. Export conversation logs for compliance and training.

Deploy Open WebUI Today

Get a complete ChatGPT-like interface connected to your LLM proxy in under 5 minutes. Full control, open source, and feature-rich.