Switch OpenClaw to OpenRouter

Three things. Give the gateway your OPENROUTER_API_KEY, run the onboarding command inside the container, then change your model refs from openai/... to openrouter/....

5 min read · February 2026
01

What you're doing

Switching from OpenAI to OpenRouter in OpenClaw requires three changes, in order. Skip any one and the gateway will still try to hit OpenAI.

01
Give the gateway OPENROUTER_API_KEY
Injected via the Docker Compose environment block, so it persists across restarts.
02
Onboard OpenRouter credentials in OpenClaw
A single CLI command that writes auth in the place the gateway expects.
03
Change model refs and restart
Swap openai/... for openrouter/... in your config, then restart the gateway.
02

Get an OpenRouter key

Create an API key at openrouter.ai. It looks like sk-or-.... Treat it like a password. Don't commit it to git.
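Before wiring the key in anywhere, a quick shape check catches copy-paste mistakes. A minimal sketch (the placeholder key is illustrative; the only thing asserted here is the sk-or- prefix):

```shell
# OpenRouter keys start with "sk-or-"; a plain OpenAI key starts with
# "sk-" but not "sk-or-", so this catches grabbing the wrong one.
key="sk-or-REPLACE_ME"   # illustrative placeholder
case "$key" in
  sk-or-*) echo "looks like an OpenRouter key" ;;
  *)       echo "unexpected prefix: double-check which key you copied" ;;
esac
```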

03

Inject the key into Docker Compose

Add OPENROUTER_API_KEY to the environment block of your OpenClaw gateway service in docker-compose.yml:

docker-compose.yml
services:
  openclaw:
    environment:
      - OPENROUTER_API_KEY=sk-or-REPLACE_ME
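If you'd rather not hard-code the key in a file you commit, Docker Compose can substitute it from a .env file sitting next to docker-compose.yml. A sketch of that wiring (the placeholder key is illustrative):

```shell
# Keep the secret in .env; Compose reads .env from the project
# directory automatically and substitutes ${VAR} references.
printf 'OPENROUTER_API_KEY=sk-or-REPLACE_ME\n' > .env   # placeholder key

# docker-compose.yml then references the variable instead of the literal:
#   services:
#     openclaw:
#       environment:
#         - OPENROUTER_API_KEY=${OPENROUTER_API_KEY}

grep '^OPENROUTER_API_KEY=' .env   # confirm the entry landed
```

Either form works; the .env route just keeps the literal key out of version-controlled files (remember to add .env to .gitignore).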

Then restart the container so it picks up the new variable:

docker compose up -d
# or
docker compose restart

Putting the key in the Compose file (rather than a shell export) means it survives reboots and container recreations automatically.
04

Run OpenClaw onboarding inside the container

With the container running and the key injected, exec the onboarding command directly from the host:

docker compose exec openclaw openclaw onboard --auth-choice apiKey --token-provider openrouter --token "$OPENROUTER_API_KEY"

If your service name isn't openclaw, replace it with whatever docker compose ps shows. The flag that matters is --token-provider openrouter: that's what tells OpenClaw to authenticate against OpenRouter instead of OpenAI.

05

Switch your model refs

OpenRouter model references follow this format:

openrouter/<provider>/<model>
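The three segments can be pulled apart with plain POSIX parameter expansion, which makes the anatomy concrete (illustrative values):

```shell
ref="openrouter/anthropic/claude-sonnet-4-5"

gateway=${ref%%/*}      # "openrouter" - the API the gateway actually calls
rest=${ref#*/}
provider=${rest%%/*}    # "anthropic"  - the upstream provider OpenRouter routes to
model=${rest#*/}        # "claude-sonnet-4-5"

echo "$gateway / $provider / $model"
# prints: openrouter / anthropic / claude-sonnet-4-5
```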

Common replacements:

openai/gpt-5.2 → openrouter/openai/gpt-4o-mini
anthropic/claude-opus → openrouter/anthropic/claude-sonnet-4-5
any openai/... ref → openrouter/deepseek/deepseek-r1
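If your config stores refs as quoted JSON strings, a one-pass sed sweep covers them all. A sketch under that assumption (the demo file and its path are hypothetical; back up your real config before editing it):

```shell
# Demo file stands in for your real config.
printf '{ "primary": "openai/gpt-4o", "fallback": "anthropic/claude-opus" }\n' > /tmp/refs-demo.json

# Prefix quoted openai/ and anthropic/ refs with openrouter/.
sed -E 's#"(openai|anthropic)/#"openrouter/\1/#g' /tmp/refs-demo.json
# -> { "primary": "openrouter/openai/gpt-4o", "fallback": "openrouter/anthropic/claude-opus" }
```

Because the pattern anchors on the opening quote, refs that already start with openrouter/ are left alone, so re-running the sweep is harmless.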

After updating your config, restart the gateway so the new model refs take effect:

docker compose restart
06

Keep OpenAI as a fallback (recommended)

You don't have to drop OpenAI entirely. A cross-provider fallback chain protects you if OpenRouter or any routed provider has an outage.

~/.openclaw/openclaw.json
{
  "agents": {
    "defaults": {
      "model": {
        "primary": "openrouter/anthropic/claude-sonnet-4-5",
        "fallbacks": [
          "openai/gpt-4o"
        ]
      }
    }
  }
}

A cross-provider fallback is more resilient than two models from the same provider: if Anthropic has an outage, every Anthropic-routed model degrades together, while a fallback that hits OpenAI directly sits in a different failure domain.
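That failure-domain argument can be checked mechanically by comparing the first path segment of each ref. A sketch using the values from the config above:

```shell
# The first path segment is the API the gateway actually calls,
# i.e. the failure domain.
primary="openrouter/anthropic/claude-sonnet-4-5"
fallback="openai/gpt-4o"

if [ "${primary%%/*}" != "${fallback%%/*}" ]; then
  echo "ok: fallback is in a different failure domain (${fallback%%/*})"
else
  echo "warning: primary and fallback both depend on ${primary%%/*}"
fi
# prints: ok: fallback is in a different failure domain (openai)
```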

After updating the config, restart once more:

docker compose restart

Done

Inject the key into Compose, run onboard inside the container, swap your model refs, restart. OpenClaw now routes through OpenRouter with every model on the platform available.