OpenWebUI Model Access Setup Guide
This guide covers how to configure model access for users in an OpenWebUI environment that integrates with the LiteLLM proxy and Authentik authentication.
LiteLLM Integration Model Access
Overview
OpenWebUI is configured to use LiteLLM as the primary model provider, which gives access to multiple model sources through a unified interface. Models are configured in the LiteLLM ConfigMap and automatically discovered by OpenWebUI.
Model Sources Available:
- Local Ollama models (in-cluster and external)
- Cloud providers (OpenAI, Anthropic, Azure, Google)
- Custom model configurations with different parameters
Current Model Configuration:
Available models from LiteLLM ConfigMap (topsecret/kubernetes/kubernetes-secrets.yml):
- `mac-gpt-oss-balanced` - Local Ollama model with balanced temperature
- `mac-gpt-oss-creative` - Local Ollama model with high temperature
- `mac-gpt-oss-precise` - Local Ollama model with low temperature
- `external-ollama-gemma3` - External Ollama Gemma model
- `gpt-4o` - OpenAI GPT-4 Omni
- `azure-gpt-4` - Azure OpenAI GPT-4
- `claude-3-opus` - Anthropic Claude 3 Opus
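The three `mac-gpt-oss-*` entries are aliases for the same local model that differ only in sampling temperature. A sketch of how such variants might look in the LiteLLM `model_list` (the base model id and temperature values here are illustrative assumptions, not copied from the real ConfigMap):

```yaml
model_list:
  # Three aliases for one local Ollama model, differing only in temperature.
  # "ollama/gpt-oss" and the temperature values are hypothetical.
  - model_name: mac-gpt-oss-balanced
    litellm_params:
      model: ollama/gpt-oss
      temperature: 0.7
  - model_name: mac-gpt-oss-creative
    litellm_params:
      model: ollama/gpt-oss
      temperature: 1.2
  - model_name: mac-gpt-oss-precise
    litellm_params:
      model: ollama/gpt-oss
      temperature: 0.1
```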
OAuth User Model Access Configuration
Default Behavior
When OAuth users log in via Authentik, newly discovered models from LiteLLM proxy default to "Private" visibility for security reasons.
Admin Configuration Steps:
- Log in as an admin user (local account, not OAuth)
- Navigate to Admin Panel → Settings → Models
- For each LiteLLM model you want OAuth users to access:
  - Find the model in the list (e.g., `mac-gpt-oss-balanced`)
  - Change "Visibility" from "Private" to "Public"
  - Configure "Whitelist" if specific user groups should have access
  - Click "Save & Update"
- Test with an OAuth user - they should now see the models in the dropdown
Security Recommendations:
- ✅ Local models: Safe to make public (free, no API costs)
- ⚠️ Cloud models: Carefully control access (paid API usage)
- 🔒 Premium models: Keep private or whitelist specific groups
- 📊 Cost tracking: Monitor usage through LiteLLM admin interface
Group-Based Access Control:
Configure model access by Authentik groups:
- Admin Panel → Settings → Models
- Select model → Advanced Settings
- Whitelist specific groups (matches Authentik group names)
- Apply group restrictions for cost-sensitive models
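The visibility and whitelist rules above reduce to a simple check: a user sees a model only if it is public and, when a whitelist exists, the user belongs to a whitelisted group. This sketch is not OpenWebUI's actual implementation; the names `Model` and `can_access` are hypothetical and only illustrate the access logic:

```python
from dataclasses import dataclass, field

@dataclass
class Model:
    name: str
    visibility: str = "private"  # new models from LiteLLM default to private
    whitelist: set[str] = field(default_factory=set)  # Authentik group names; empty = unrestricted

def can_access(model: Model, user_groups: set[str]) -> bool:
    """Public models with no whitelist are open to everyone; a whitelist
    restricts access to users in at least one listed group."""
    if model.visibility != "public":
        return False
    return not model.whitelist or bool(model.whitelist & user_groups)

local = Model("mac-gpt-oss-balanced", visibility="public")
paid = Model("claude-3-opus", visibility="public", whitelist={"ai-power-users"})
new = Model("gpt-4o")  # freshly discovered: private by default

groups = {"ai-users"}
print(can_access(local, groups))  # True
print(can_access(paid, groups))   # False: not in the whitelist
print(can_access(new, groups))    # False: still private
```

This is why an admin must explicitly flip each newly discovered model to "Public" before OAuth users see it at all.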
LiteLLM Model Management
Adding New Models
To add new models to the system:
- Edit the ConfigMap in `topsecret/kubernetes/kubernetes-secrets.yml`:

  ```yaml
  model_list:
    - model_name: new-model-name
      litellm_params:
        model: provider/model-id
        api_key: "os.environ/API_KEY_NAME"
  ```

- Apply the changes:

  ```shell
  ./copy2provisionhost.sh
  docker exec -it provision-host bash -c "cd /mnt/urbalurbadisk && kubectl apply -f topsecret/kubernetes/kubernetes-secrets.yml"
  ```

- Restart LiteLLM:

  ```shell
  kubectl rollout restart deployment/litellm -n ai
  ```

- Configure model visibility in the OpenWebUI admin panel
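A malformed `model_list` entry is an easy way to break the LiteLLM restart, so it can help to sanity-check the parsed structure before applying the ConfigMap. A minimal sketch (the required keys checked here are an assumption based on the snippet above; load the YAML first with a parser such as PyYAML):

```python
def validate_model_list(config: dict) -> list[str]:
    """Return a list of problems found; an empty list means the structure looks sane."""
    problems = []
    entries = config.get("model_list")
    if not isinstance(entries, list):
        return ["missing or non-list 'model_list'"]
    for i, entry in enumerate(entries):
        if "model_name" not in entry:
            problems.append(f"entry {i}: missing 'model_name'")
        if "model" not in entry.get("litellm_params", {}):
            problems.append(f"entry {i}: missing 'litellm_params.model'")
    return problems

# Example: the parsed form of the snippet above passes the check.
ok = {"model_list": [{"model_name": "new-model-name",
                      "litellm_params": {"model": "provider/model-id",
                                         "api_key": "os.environ/API_KEY_NAME"}}]}
print(validate_model_list(ok))  # []
```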
Cost Management
- Free models: Local Ollama models have no ongoing costs
- Paid models: Cloud provider models charge per token/request
- Monitoring: Check LiteLLM logs for usage and costs
- Budget control: Use model whitelisting for expensive models
Troubleshooting:
- OAuth user still can't see models? Check that model visibility is set to "Public"
- New models not appearing? They default to "Private" - admin must make them "Public"
- Admin can see models but OAuth user cannot? This is expected behavior with "Private" models
- Models not loading from LiteLLM? Check ConfigMap format and restart LiteLLM deployment
- API errors for cloud models? Verify API keys are set correctly in secrets
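For the last two items, a few kubectl commands can narrow things down (assuming, as above, that LiteLLM runs as `deployment/litellm` in the `ai` namespace):

```shell
# Check LiteLLM logs for config parsing or API-key errors
kubectl logs deployment/litellm -n ai --tail=100

# Confirm the ConfigMap/secret changes were actually applied
kubectl get -n ai -f topsecret/kubernetes/kubernetes-secrets.yml

# Restart LiteLLM after fixing the config
kubectl rollout restart deployment/litellm -n ai
```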