# Open WebUI

Web-based chat interface for interacting with local LLMs via Ollama.

## Overview

| Attribute | Value |
|-----------|-------|
| Image | `ghcr.io/open-webui/open-webui:main` |
| Size | ~2 GB |
| GPU | No (UI only; uses Ollama for inference) |
| Port | 3000 (default) |
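
For reference, the `ujust` recipe manages a container from this image for you. A minimal manual sketch, assuming Podman, Open WebUI's default internal port of 8080, and illustrative container/volume names:

```bash
# Illustrative sketch only; the ujust recipes handle this for you.
# Maps host port 3000 to the container's internal port 8080 and
# joins the shared bazzite-ai network so Ollama is reachable.
podman run -d \
  --name open-webui \
  --network bazzite-ai \
  -p 3000:8080 \
  -v open-webui-data:/app/backend/data \
  ghcr.io/open-webui/open-webui:main
```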

## Quick Start

| Step | Command | Description |
|------|---------|-------------|
| 1 | `ujust openwebui config` | Configure the UI |
| 2 | `ujust openwebui start` | Start the UI |
| 3 | `ujust openwebui status` | Check status |

Access Open WebUI at http://localhost:3000 after starting.
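
If you want to script against it, you can wait for the port to answer before opening the browser. A hedged sketch; the `/health` path is an assumption based on upstream Open WebUI:

```bash
# Poll until the UI responds (the /health path is an assumption;
# a plain "curl -sf http://localhost:3000" works as a fallback)
until curl -sf http://localhost:3000/health > /dev/null; do
  sleep 2
done
echo "Open WebUI is ready at http://localhost:3000"
```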

## Prerequisites

Open WebUI requires Ollama to be running:

```bash
# Start Ollama first
ujust ollama start

# Then start Open WebUI
ujust openwebui start
```

Open WebUI automatically connects to Ollama via the `bazzite-ai` network.
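
If the UI starts but cannot reach any models, a quick way to confirm the wiring, assuming Podman as the container runtime:

```bash
# Confirm the shared network exists and see which containers are attached
podman network inspect bazzite-ai
podman ps --filter network=bazzite-ai
```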

## Lifecycle Commands

| Command | Description |
|---------|-------------|
| `ujust openwebui config` | Configure settings |
| `ujust openwebui start` | Start the UI |
| `ujust openwebui status` | Check status |
| `ujust openwebui logs` | View logs |
| `ujust openwebui stop` | Stop the UI |
| `ujust openwebui delete` | Remove the configuration |
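
A common sequence when changing settings, using only the commands above: stop the UI, reconfigure, then start it again and confirm it came back up.

```bash
# Apply a configuration change and verify the result
ujust openwebui stop
ujust openwebui config
ujust openwebui start
ujust openwebui status
```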

## Multiple Instances

To run multiple Open WebUI instances, give each one its own instance number and port:

```bash
# First instance (port 3000)
ujust openwebui config
ujust openwebui start

# Second instance (port 3001)
ujust openwebui config -n 2 --port=3001
ujust openwebui start -n 2
```
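
To confirm both instances are serving on their respective ports, a quick check (assuming `curl` is available):

```bash
# Each instance should answer on its own host port
curl -sI http://localhost:3000 | head -n 1
curl -sI http://localhost:3001 | head -n 1
```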

## Features

- Chat interface with conversation history
- Model selection from available Ollama models (see the check below)
- Document upload for RAG workflows
- User authentication and management
- Dark/light theme support
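
The model selector is populated from whatever Ollama reports as installed. As a quick check, assuming Ollama is listening on its default port 11434, you can list those models directly:

```bash
# Models returned here are what appear in Open WebUI's selector
curl -s http://localhost:11434/api/tags
```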

## See Also