# AI & Machine Learning
The AI package provides a self-hosted AI platform with a chat interface and unified model gateway. It supports both local models (via Ollama) and cloud providers (OpenAI, Anthropic, Google).
## Services
| Service | Description | Deploy |
|---|---|---|
| Open WebUI | ChatGPT-like chat interface for AI models | `./uis deploy openwebui` |
| LiteLLM | Unified API gateway for multiple LLM providers | `./uis deploy litellm` |
## Quick Start
```shell
./uis stack install ai-local
```
Or deploy individually:
```shell
./uis deploy postgresql   # Required dependency
./uis deploy litellm
./uis deploy openwebui
```
## How It Works
```
Users → Open WebUI → LiteLLM → Ollama (local models)
                             → OpenAI API
                             → Anthropic API
                             → Google Gemini API
```
- Users interact with Open WebUI's chat interface
- Open WebUI sends requests to LiteLLM's OpenAI-compatible API
- LiteLLM routes to the appropriate model provider
- Conversations are stored in PostgreSQL
All services deploy to the `ai` namespace.
## Guides
- Model access configuration — control which models users can see
- LiteLLM client keys — generate API keys for external tools
- Environment management — manage the AI infrastructure