Use a Codex-compatible Responses API endpoint inside Claude Code via Claude Code Router on Windows.
This repo packages the working adapter setup I built locally:
- Claude Code Router config
- a custom Responses transformer for Codex-style endpoints
- a `claude-codex` launcher
- background startup scripts
- Windows logon auto-start registration
Claude Code speaks Anthropic-style APIs. This project inserts a local bridge:

Claude Code -> local Claude Code Router -> Codex-compatible `/responses` endpoint
Target endpoint shape:
- OpenAI Responses-compatible
- model: `gpt-5.4`
- reasoning effort requested as `xhigh`
- service tier requested as `fast`
- long-context model family (~1.05M context on GPT-5.4 per OpenAI docs)
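As a rough illustration, the request body the bridge aims to send upstream can be sketched like this. `buildResponsesBody` is a hypothetical helper, not the actual transformer code; field names follow the OpenAI Responses API:

```javascript
// Hypothetical sketch of the upstream Responses request this bridge targets.
// The real transformer lives in router/plugins/codex-cli.js and may differ.
function buildResponsesBody(userText) {
  return {
    model: "gpt-5.4",               // long-context model family
    reasoning: { effort: "xhigh" }, // requested reasoning effort
    service_tier: "fast",           // requested tier; the provider may override it
    stream: true,
    input: [
      { role: "user", content: [{ type: "input_text", text: userText }] },
    ],
  };
}

console.log(JSON.stringify(buildResponsesBody("ping"), null, 2));
```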
This repo requests `"service_tier": "fast"`. However, the actual upstream provider may still return or use another tier (for example `auto`).

So there are two different questions:
- Does this bridge send `fast`? Yes.
- Does the upstream honor it? Provider-dependent; verify from the upstream response.
Repo layout:
- `bin/claude-codex.js` - Claude Code launcher that ensures the router is running first
- `bin/claude-codex.cmd` - Windows wrapper
- `bin/ccr-service.ps1` - foreground router service runner
- `bin/start-ccr-background.ps1` - background launcher
- `router/plugins/codex-cli.js` - custom Responses transformer
- `scripts/install.ps1` - installer for local use
Prerequisites:
- Windows
- Node.js
- Claude Code CLI installed (`claude`)
- Claude Code Router installed globally (`ccr`)
- GitHub CLI (optional)
- environment variable `CODEX_API_KEY`
From PowerShell:

```powershell
powershell -ExecutionPolicy Bypass -File .\scripts\install.ps1
```

What the installer does:
- copies launch scripts into `~/.local/bin`
- copies the transformer into `~/.claude-code-router/plugins`
- generates a random local router API key
- picks a free high port
- writes `~/.claude-code-router/config.json`
- registers a `ClaudeCodexRouter` scheduled task for logon auto-start
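For reference, the generated `~/.claude-code-router/config.json` looks roughly like the sketch below. All values are placeholders, and the exact field names depend on your installed Claude Code Router version, so treat this as an assumption rather than a spec:

```json
{
  "APIKEY": "<random-local-key>",
  "PORT": 54321,
  "Providers": [
    {
      "name": "codex",
      "api_base_url": "<your Codex-compatible /responses endpoint>",
      "api_key": "<taken from CODEX_API_KEY>",
      "models": ["gpt-5.4"],
      "transformer": { "use": ["codex-cli"] }
    }
  ],
  "Router": {
    "default": "codex,gpt-5.4"
  }
}
```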
Run `claude-codex`. Quick test:

```powershell
claude-codex -p "Reply with exactly: pong"
```

Health check:

```powershell
$cfg = Get-Content -Raw "$HOME/.claude-code-router/config.json" | ConvertFrom-Json
Invoke-WebRequest -UseBasicParsing "http://127.0.0.1:$($cfg.PORT)/health"
```

To verify the service tier, inspect the upstream `response.created` payload. If it comes back with `"service_tier":"auto"`, the bridge requested `fast` but the provider did not honor it.
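One way to check this programmatically is to capture the raw SSE stream from the upstream and pull `service_tier` out of the `response.created` event. This is an illustrative helper (it assumes standard `event:`/`data:` SSE framing), not part of the repo:

```javascript
// Extract service_tier from the response.created event of a Responses SSE stream.
// Returns undefined if no such event is present.
function serviceTierFromSSE(sseText) {
  for (const chunk of sseText.split("\n\n")) {
    const lines = chunk.split("\n");
    if (!lines.some((l) => l === "event: response.created")) continue;
    const dataLine = lines.find((l) => l.startsWith("data: "));
    if (!dataLine) continue;
    const payload = JSON.parse(dataLine.slice("data: ".length));
    return payload.response && payload.response.service_tier;
  }
  return undefined;
}
```

If this returns `"auto"` rather than `"fast"`, the provider overrode the requested tier.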
This repo does not publish:
- your real local router API key
- your personal `CODEX_API_KEY`
- your private Claude skills
- your SSH mappings
- your local MCP credentials
The custom transformer is based on community Claude Code Router Responses transformer work and then modified locally for:
- `xhigh` reasoning support
- `service_tier` forwarding
- this launcher flow