
Self-Hosting Open WebUI Behind Nginx Proxy Manager with SSO Authentication

Self-Hosting · 6 min read · Published Mar 19, 2026

Why I Set This Up

I run Open WebUI on my Proxmox homelab to interact with local LLMs and OpenAI models. Initially, I accessed it directly via IP and port, which worked fine on my local network. But I wanted proper remote access with a clean domain name and real authentication — not just “hope nobody finds the URL.”

I already had Nginx Proxy Manager handling my other self-hosted services, so routing Open WebUI through it made sense. The SSO part came later when I realized I didn’t want multiple login systems across my homelab. I wanted one identity provider that could protect everything.

My Setup

Here’s what I’m actually running:

  • Proxmox VE hosting multiple LXC containers and VMs
  • Open WebUI in a Docker container (image: ghcr.io/open-webui/open-webui:main)
  • Nginx Proxy Manager in its own Docker container (image: jc21/nginx-proxy-manager:latest)
  • Authentik as my SSO provider, also containerized

Open WebUI runs on port 8080 internally. It’s not exposed to the internet directly — only NPM has public access. Everything sits behind my home ISP connection with a static IP and proper DNS records pointing to my domain.
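Before involving NPM at all, it's worth confirming the container answers on the LAN. A quick check (192.168.1.50 is a placeholder for the Docker host's address; the /health endpoint exists in current Open WebUI builds, but verify it against your version):

```shell
# Placeholder address: substitute the Docker host's LAN IP.
# Prints only the HTTP status code; 200 means the container is up.
curl -s -o /dev/null -w "%{http_code}\n" http://192.168.1.50:8080/health
```

If this fails, the problem is the container or the Proxmox networking, not the proxy.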

Docker Compose for Open WebUI

My Open WebUI container looks like this:

version: '3.8'  # optional on current Docker Compose, which ignores this field
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    container_name: open-webui
    ports:
      - "8080:8080"  # publishes to the host; can be dropped if NPM reaches the container over a shared Docker network
    volumes:
      - ./data:/app/backend/data  # persists chats, users, and settings
    environment:
      # Client credentials come from the Authentik provider (placeholders here)
      - OAUTH_CLIENT_ID=your-client-id
      - OAUTH_CLIENT_SECRET=your-client-secret
      # Authentik's OIDC discovery endpoint for this application
      - OPENID_PROVIDER_URL=https://auth.yourdomain.com/application/o/open-webui/.well-known/openid-configuration
      - ENABLE_OAUTH_SIGNUP=true  # auto-create accounts for users Authentik approves
      - OPENID_REDIRECT_URI=https://chat.yourdomain.com/oauth/oidc/callback
      - OAUTH_PROVIDER_NAME=Authentik  # label shown on the login button
      - OAUTH_SCOPES=openid email profile
      # Cookie hardening; lax + secure works behind an HTTPS reverse proxy
      - WEBUI_SESSION_COOKIE_SAME_SITE=lax
      - WEBUI_AUTH_COOKIE_SAME_SITE=lax
      - WEBUI_SESSION_COOKIE_SECURE=true
      - WEBUI_AUTH_COOKIE_SECURE=true
      - AIOHTTP_CLIENT_TIMEOUT=600  # seconds; raised for slow model responses
    restart: unless-stopped

The data volume persists conversation history and settings. The environment variables configure OAuth authentication through Authentik.
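A small sanity check I find useful before the first docker compose up: confirm the OAuth variables are actually present in the file. This is my own grep helper, not anything shipped with Open WebUI:

```shell
# Flags any OAuth-related variable missing from docker-compose.yml.
for var in OAUTH_CLIENT_ID OAUTH_CLIENT_SECRET OPENID_PROVIDER_URL \
           OPENID_REDIRECT_URI ENABLE_OAUTH_SIGNUP OAUTH_SCOPES; do
  grep -q "$var" docker-compose.yml || echo "missing: $var"
done
```

Silence means every variable was found; anything printed needs adding before the container starts.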

Nginx Proxy Manager Configuration

In NPM’s web interface, I created a proxy host with these settings:

  • Domain: chat.yourdomain.com
  • Scheme: http
  • Forward Hostname/IP: open-webui (Docker container name on the same network)
  • Forward Port: 8080
  • WebSocket Support: Enabled
  • SSL: Force SSL enabled, Let’s Encrypt certificate

WebSocket support is critical. Without it, Open WebUI’s real-time features break. I learned this the hard way when streaming responses would freeze mid-sentence.

Custom Nginx Configuration

The default NPM proxy config wasn’t enough. OAuth endpoints were getting cached, causing random login failures. I added this custom configuration in the Advanced tab:

location ~* ^/(api|oauth|callback|login|ws|websocket) {
    proxy_pass http://open-webui:8080;
    proxy_http_version 1.1;
    proxy_set_header Host $host;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header X-Forwarded-Proto $scheme;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";
    proxy_no_cache 1;
    proxy_cache_bypass 1;
    proxy_read_timeout 3600s;
    proxy_send_timeout 3600s;
}

This disables caching for authentication and API endpoints, sets proper headers for WebSocket upgrades, and increases timeouts for long-running LLM responses.
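To see which request paths actually fall into that location block, the same extended regex can be exercised locally; grep -E stands in for nginx's ~* matcher (nginx adds case-insensitivity, which changes nothing for these lowercase paths):

```shell
# Same ERE as the nginx location block above.
pattern='^/(api|oauth|callback|login|ws|websocket)'
for path in /api/chat /oauth/oidc/callback /ws/socket.io /static/app.js; do
  if printf '%s\n' "$path" | grep -qE "$pattern"; then
    echo "$path -> no-cache location"
  else
    echo "$path -> default proxy config"
  fi
done
```

Of those four, only /static/app.js falls through to the default proxy config, which is what you want: static assets stay cacheable while auth and API traffic bypass the cache.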

Authentik Setup

I chose Authentik over Keycloak because it felt lighter and more straightforward for my needs. I’m not running an enterprise — just a homelab with a handful of services.

Authentik runs at auth.yourdomain.com, also proxied through NPM. The key steps I followed:

  1. Downloaded the official docker-compose.yml from Authentik’s documentation
  2. Generated secure passwords for initial setup
  3. Created a proxy host in NPM pointing to the Authentik container
  4. Enabled WebSocket support for the Authentik proxy host (required for the admin panel)
  5. Forced SSL and obtained a Let’s Encrypt certificate
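Step 2 above looks roughly like this, following what Authentik's install docs describe at the time of writing (the variable names may change between releases, so check the current docs before relying on them):

```shell
# Generate the database password and application secret into .env,
# which Authentik's stock docker-compose.yml reads at startup.
echo "PG_PASS=$(openssl rand -base64 36 | tr -d '\n')" >> .env
echo "AUTHENTIK_SECRET_KEY=$(openssl rand -base64 60 | tr -d '\n')" >> .env
```

Keep the generated .env out of version control; losing AUTHENTIK_SECRET_KEY invalidates existing sessions.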

Creating the OAuth Provider

Inside Authentik’s admin panel:

  1. Went to Applications → Providers
  2. Created a new OAuth2/OpenID Provider
  3. Set the Authorization Flow to “Authorize Application (Implicit)”
  4. Set Redirect URIs to https://chat.yourdomain.com/oauth/oidc/callback
  5. Copied the Client ID and Client Secret
  6. Created an Application linking to this provider

The client credentials go into Open WebUI’s environment variables. The redirect URI must match exactly — I wasted an hour troubleshooting a typo here.
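Since the match has to be byte-for-byte, I'd compare the two values mechanically rather than by eye. Both strings below are placeholders, set up to show the kind of trailing-slash mismatch that cost me the hour:

```shell
# URI configured in Open WebUI's OPENID_REDIRECT_URI:
configured='https://chat.yourdomain.com/oauth/oidc/callback'
# URI registered in Authentik's provider (note the stray trailing slash):
registered='https://chat.yourdomain.com/oauth/oidc/callback/'
if [ "$configured" = "$registered" ]; then
  echo "redirect URIs match"
else
  echo "mismatch:"
  echo "  open-webui: $configured"
  echo "  authentik:  $registered"
fi
```

Printing both strings on separate lines makes a one-character difference visible immediately.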

What Worked

Once everything was configured correctly, the flow works like this:

  1. User visits https://chat.yourdomain.com
  2. Open WebUI redirects to Authentik for authentication
  3. User logs in through Authentik
  4. Authentik redirects back to Open WebUI with an auth token
  5. User is logged into Open WebUI

The WebSocket connection stays alive during long conversations. Streaming responses from local models work without interruption. Sessions persist across browser restarts.

Cookie settings with SameSite=lax prevent cross-site issues while still allowing the OAuth redirect flow to complete. Setting cookies to secure=true restricts them to HTTPS connections, which matches the TLS termination happening at the reverse proxy.

What Didn’t Work

My first attempt failed because I didn’t disable caching for OAuth endpoints. Users would log in successfully, then get kicked out randomly. The logs showed authentication failures, but the actual cause was Nginx serving cached 302 redirects instead of fresh OAuth responses.

I also initially set cookie SameSite to None, thinking it would be more permissive. That broke the login flow entirely. Modern browsers reject SameSite=None cookies unless they’re also Secure, which created a chicken-and-egg problem during OAuth redirects.

Timeouts were another issue. The default AIOHTTP_CLIENT_TIMEOUT in Open WebUI is too low. When querying slower models or waiting for Authentik to respond, requests would time out before completing. Bumping it to 600 seconds fixed this.

WebSocket support in NPM is a checkbox, but I initially missed it. Open WebUI would load, but real-time features like streaming text wouldn’t work. The browser console showed WebSocket connection failures.

Key Takeaways

This setup works reliably for my homelab. A few things I’d emphasize:

  • WebSocket support must be enabled in NPM for both Open WebUI and Authentik
  • OAuth endpoints cannot be cached — add explicit no-cache rules
  • Cookie settings matter more than I expected — lax is the right choice here
  • Redirect URIs must match exactly between Open WebUI and Authentik
  • High timeouts are necessary for LLM responses and OAuth provider latency

The main benefit is centralized authentication. I now have one login for Open WebUI, my file server, monitoring dashboards, and other services. When I add a new service, I just create another provider in Authentik and plug in the credentials.

The main drawback is complexity. If Authentik goes down, everything protected by it becomes inaccessible. I keep direct access methods for critical services just in case.

For anyone running a similar setup: test the OAuth flow thoroughly before relying on it. Enable debug logging in Open WebUI temporarily to see exactly where authentication requests succeed or fail. The logs show each step of the OAuth handshake, which makes troubleshooting much faster.
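For that temporary debug logging, the variable I'd reach for is GLOBAL_LOG_LEVEL (documented for current Open WebUI releases; confirm against your version), added to the compose environment and removed once the flow works:

```yaml
    environment:
      # Temporary: verbose logs showing each OAuth handshake step
      - GLOBAL_LOG_LEVEL=DEBUG
```

Then docker logs -f open-webui while attempting a login shows exactly where the handshake stops.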

About the Author

Vipin PG


Vipin PG is a software professional with 15+ years of hands-on experience in system infrastructure, browser performance, and AI-powered development. Holding an MCA from Kerala University, he has worked across enterprises in Dubai and Kochi before running his independent tech consultancy. He has written 180+ tutorials on Docker, networking, and system troubleshooting - and he actually runs the setups he writes about.
