Felix3322/xyai2claudecode

Use a Codex-compatible Responses API endpoint inside Claude Code via Claude Code Router on Windows.

This repo packages the working adapter setup I built locally:

  • Claude Code Router config
  • a custom Responses transformer for Codex-style endpoints
  • a claude-codex launcher
  • background startup scripts
  • Windows logon auto-start registration

What this does

Claude Code speaks the Anthropic-style Messages API, while Codex-style endpoints expose an OpenAI Responses API.
This project inserts a local bridge that translates between the two:

Claude Code -> local Claude Code Router -> Codex-compatible /responses endpoint

Target endpoint shape:

  • OpenAI Responses-compatible
  • model: gpt-5.4
  • reasoning effort requested as xhigh
  • service tier requested as fast
  • long-context model family (~1.05M context on GPT-5.4 per OpenAI docs)
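The endpoint shape above translates into a request body roughly like the following sketch. Field names follow the OpenAI Responses API; the exact payload emitted by the transformer may differ.

```javascript
// Sketch of the upstream request body this bridge aims to produce.
// Field names follow the OpenAI Responses API; the exact shape
// emitted by the transformer may differ.
const requestBody = {
  model: "gpt-5.4",
  reasoning: { effort: "xhigh" }, // requested reasoning effort
  service_tier: "fast",           // requested tier; upstream may ignore it
  input: [
    { role: "user", content: "Reply with exactly: pong" },
  ],
  stream: true,
};

console.log(JSON.stringify(requestBody, null, 2));
```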

Important note about fast

This repo requests:

"service_tier": "fast"

However, the upstream provider may still serve the request on a different tier (for example, auto), regardless of what was requested.

So there are two different questions:

  1. Does this bridge send fast? Yes.
  2. Does the upstream honor it? Provider-dependent; verify from the upstream response.

Files

  • bin/claude-codex.js - Claude Code launcher that ensures the router is running first
  • bin/claude-codex.cmd - Windows wrapper
  • bin/ccr-service.ps1 - foreground router service runner
  • bin/start-ccr-background.ps1 - background launcher
  • router/plugins/codex-cli.js - custom Responses transformer
  • scripts/install.ps1 - installer for local use

Prerequisites

  • Windows
  • Node.js
  • Claude Code CLI installed (claude)
  • Claude Code Router installed globally (ccr)
  • GitHub CLI (optional)
  • Environment variable:
    • CODEX_API_KEY

Install

From PowerShell:

powershell -ExecutionPolicy Bypass -File .\scripts\install.ps1

What the installer does:

  • copies launch scripts into ~/.local/bin
  • copies the transformer into ~/.claude-code-router/plugins
  • generates a random local router API key
  • picks a free high port
  • writes ~/.claude-code-router/config.json
  • registers a ClaudeCodexRouter scheduled task for logon auto-start

Use

claude-codex

Quick test:

claude-codex -p "Reply with exactly: pong"

Verify router health

$cfg = Get-Content -Raw "$HOME/.claude-code-router/config.json" | ConvertFrom-Json
Invoke-WebRequest -UseBasicParsing "http://127.0.0.1:$($cfg.PORT)/health"

Verify whether upstream honored fast

You need to inspect the upstream response.created payload.
If it comes back with "service_tier":"auto", then the bridge requested fast but the provider did not honor it.
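One way to check, sketched in Node against a captured SSE data line. The event name and payload shape follow the OpenAI Responses streaming format; adjust to what your provider actually returns.

```javascript
// Sketch: given a captured "response.created" SSE data line, report
// whether the upstream honored the requested service tier. The payload
// shape follows the OpenAI Responses streaming format; your provider's
// payload may differ.
const dataLine =
  'data: {"type":"response.created","response":{"service_tier":"auto"}}';

const payload = JSON.parse(dataLine.replace(/^data:\s*/, ""));
const tier = payload.response && payload.response.service_tier;

if (tier === "fast") {
  console.log("upstream honored fast");
} else {
  console.log(`requested fast, upstream used: ${tier}`);
}
```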

Security

This repo does not publish:

  • your real local router API key
  • your personal CODEX_API_KEY
  • your private Claude skills
  • your SSH mappings
  • your local MCP credentials

Source attribution

The custom transformer is based on community Responses-transformer work for Claude Code Router, modified locally for:

  • xhigh reasoning support
  • service_tier forwarding
  • this launcher flow
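The shape of those modifications, heavily simplified: the sketch below assumes Claude Code Router's plugin convention of a class with a name and a transformRequestIn hook; the real router/plugins/codex-cli.js also handles the full Anthropic-to-Responses translation.

```javascript
// Heavily simplified sketch of the two local modifications, assuming
// the Claude Code Router plugin convention (a class exposing a name
// and a transformRequestIn hook). The real router/plugins/codex-cli.js
// does much more than this.
class CodexTierTransformer {
  name = "codex-tier";

  transformRequestIn(request) {
    return {
      ...request,
      reasoning: { effort: "xhigh" }, // xhigh reasoning support
      service_tier: "fast",           // service_tier forwarding
    };
  }
}

module.exports = CodexTierTransformer;
```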
