Private AI Platform · v1.0

The AI stack that ships itself.

Passkey identity. Workflow automation. Multi-model chat. Full observability. One orchestrator clones it onto your boxes in under ten minutes — no SaaS, no vendor lock-in, your data never leaves your perimeter.

  • Ubuntu 24.04
  • Docker & Compose
  • Let's Encrypt via DNS-01
  • OIDC (passkeys)
What you get

Seven services. One orchestrator. Zero glue code.

Every panel below is a real container with a real health check, wired together by a single install.sh. Own the runtime; skip the integration labor.

01 Identity

Passkeys, not passwords.

PocketID issues OIDC to every service on the platform. Passkey-first. Admin-scoped API keys. SSO flows configured in one line of the wizard.

  • OAuth2 / OIDC provider
  • WebAuthn, FIDO2, platform authenticators
  • Forward-auth gateway for every subdomain
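
In Caddy terms, that gateway is the forward_auth directive in front of each site. A sketch, assuming the auth-gateway container verifies sessions at a /verify endpoint — the upstream name, port, and path here are illustrative, not what install.sh actually writes:

```
n8n.example.com {
	# Ask the auth gateway before serving anything on this host.
	forward_auth auth-gateway:4180 {
		uri /verify
		copy_headers Remote-User Remote-Email
	}
	reverse_proxy n8n:5678
}
```

Unauthenticated requests get redirected to PocketID; authenticated ones pass through with identity headers the upstream app can trust.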
02 Automation

N8N, pre-wired.

Webhooks, schedules, queue workers. Uses the platform Postgres and inherits SSO — no separate user store.

03 Multi-tenant Chat

OpenWebUI × LiteLLM.

Per-tenant LLM workspaces with OpenAI-compatible APIs. OpenRouter, Anthropic, OpenAI, or your own endpoints.
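
OpenAI-compatible means any stock client or plain curl works against a tenant workspace. A sketch of the request shape — host, API key, and model name are placeholders, not values the installer provisions:

```shell
#!/bin/sh
# Sketch of a tenant chat call; host, key, and model are placeholders.
BASE_URL="https://chat.example.com/v1"
API_KEY="sk-tenant-demo"
BODY='{"model":"gpt-4o","messages":[{"role":"user","content":"ping"}]}'

# Compose the request as a string so it can be inspected before running.
CMD="curl -s $BASE_URL/chat/completions \
  -H \"Authorization: Bearer $API_KEY\" \
  -H \"Content-Type: application/json\" \
  -d '$BODY'"

echo "$CMD"
```

Point BASE_URL at a tenant's LiteLLM endpoint and the same call shape works unchanged, whichever upstream provider the tenant is routed to.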

04 Observability

Prometheus + Kuma.

Every service scraped. Uptime, response time, node, cgroup, and container metrics. Alerts to Slack or email.
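
A Prometheus scrape block covering those targets might look like this — job names and exporter ports are the common defaults, assumed rather than taken from install.sh:

```yaml
scrape_configs:
  - job_name: node          # host-level CPU, memory, disk, network
    static_configs:
      - targets: ["node-exporter:9100"]
  - job_name: containers    # per-container cgroup metrics
    static_configs:
      - targets: ["cadvisor:8080"]
```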

05 Reverse Proxy

Caddy with wildcard TLS.

DNS-01 via Cloudflare. Auto-renewing wildcards across multiple zones. Single-host or fleet.
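
With the Cloudflare DNS module compiled into Caddy, the wildcard side of the Caddyfile reduces to a few lines — domain and token variable are placeholders:

```
*.example.com {
	tls {
		dns cloudflare {env.CF_API_TOKEN}
	}
}
```

Each subdomain then attaches via host matchers inside the wildcard block, so a single auto-renewing certificate covers every service on the zone.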

06 Shared Data

One Postgres, tenants isolated.

A single hardened Postgres 16 with per-app databases, per-app roles, nightly backups. PocketID, N8N, Kuma, and each AI-Chat tenant share the cluster but never each other's rows.

obs_platform / pocketid / n8n / uptimekuma / <tenant>_db / <tenant>_litellm
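
The isolation above is the standard role-per-database pattern. A sketch of the SQL an installer might emit per app — naming, password handling, and the REVOKE step are assumptions, not the platform's actual provisioning code:

```shell
#!/bin/sh
# Sketch: emit the SQL that carves out one app's database and role.
# Convention and password handling here are illustrative only.
app="n8n"
SQL=$(cat <<EOF
CREATE ROLE ${app} LOGIN PASSWORD 'changeme';
CREATE DATABASE ${app} OWNER ${app};
REVOKE CONNECT ON DATABASE ${app} FROM PUBLIC;
EOF
)
echo "$SQL"
```

Revoking PUBLIC connect rights means an app's role can reach only its own database; sharing the cluster never shares the rows.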
07 Post-install

Day-2 is already done.

DNS records, OIDC clients, Kuma monitors — all driven headlessly by install.sh configure-* once you've enrolled the first admin passkey.

How it deploys

One command. Two hosts. Your infrastructure.

Point the wizard at a controller and a target. Set a domain and a Cloudflare token. Wait eight minutes.

  • No SaaS; nothing calls home.
  • Data is yours — the Postgres volume is on your disk.
  • Secrets are chmod 600 in /root/.obs-ai-platform/.
  • Idempotent: re-running never corrupts live state.
  • Audit trail: every run logged to /var/log/deploy/.
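
The secrets bullet boils down to a write-then-lock-down pattern. A minimal sketch, using a temp directory as a stand-in for /root/.obs-ai-platform/:

```shell
#!/bin/sh
# Sketch: drop a secret on disk and lock it to the owner.
# mktemp stands in for the real /root/.obs-ai-platform/ directory.
dir=$(mktemp -d)
secret_file="$dir/cf_api_token"

umask 077                          # new files default to owner-only
printf '%s\n' "example-token" > "$secret_file"
chmod 600 "$secret_file"           # make the mode explicit regardless of umask

perms=$(stat -c '%a' "$secret_file")
echo "$perms"
```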
# On the controller: clone the orchestrator
$ git clone git@github.com:obscorp-tech/obs-ai-services-platform-install
$ cd obs-ai-services-platform-install

# Run the wizard, then install
$ sudo ./install.sh wizard
$ sudo ./install.sh install

[STEP 1/7] shared-database ✓
[STEP 2/7] identity       ✓
[STEP 3/7] auth-gateway   ✓
[STEP 4/7] reverse-proxy  ✓
[STEP 5/7] automation     ✓
[STEP 6/7] monitoring     ✓
[STEP 7/7] ai-chat        ✓
[OK] Install Complete — 8m 14s

  • 10 containers on one box
  • <10 min clean-slate to live
  • 7 services, one orchestrator
  • 0 SaaS subscriptions

From kickoff to live

A deployment looks like this.

  1. Scoping call

    We map your current identity story, data boundaries, and which workflows you want automated first.

  2. Infrastructure

    Two Ubuntu 24.04 hosts (controller + production) in your cloud, on bare metal, or a mix. We test reachability.

  3. Platform install

    Wizard once, install once. SSL via Cloudflare DNS-01, SSO across everything, backups on cron.

  4. First workflows

    We seed three N8N workflows tailored to the priorities from the scoping call, then train your team.

Run AI on your own rails.

Introductory deployments ship in under two weeks. Bring a domain and a couple of servers; we bring everything else.