A universal HTTP proxy and Python library that automatically routes Anthropic or OpenAI API calls to local CLI tools (Claude Code or Codex) when the API key is set to all 9s, and otherwise forwards them to the real cloud APIs.
An HTTP/HTTPS proxy server that intercepts Anthropic or OpenAI API calls from any application, regardless of programming language.
Drop-in replacements for the Anthropic and OpenAI Python clients that handle routing internally, dispatching to Claude Code or Codex when appropriate.
- Universal Compatibility: HTTP proxy works with ANY programming language or tool
- Transparent Routing: No code changes needed for existing projects (with proxy)
- Claude Code & Codex Integration: Automatically routes to local Claude Code or Codex CLI when API key is all 9s
- API Compatibility: Maintains Anthropic/OpenAI API response format
- Easy Configuration: Just set your API key to all 9s to enable local routing
- Python Library: Drop-in replacement clients for Anthropic and OpenAI Python SDKs
- Async Support: Includes both synchronous and asynchronous clients
pip install -r requirements.txt
# On Windows you can also use: py -m pip install -r requirements.txt
Make sure you have the relevant CLI installed and available in your PATH:
claude --version # for Claude Code
codex --version # for Codex
- Setup and start the proxy:
python setup_proxy.py # One-time setup
# macOS / Linux
./start_proxy.sh # Start proxy server
# Windows
python start_proxy.py
- Configure your environment (examples):
macOS/Linux (bash/zsh):
export HTTP_PROXY=http://localhost:8080
export HTTPS_PROXY=http://localhost:8080
export ANTHROPIC_API_KEY=999999999999 # All 9s for Claude Code
export OPENAI_API_KEY=999999999999 # All 9s for Codex
Windows (PowerShell):
$env:HTTP_PROXY="http://localhost:8080"
$env:HTTPS_PROXY="http://localhost:8080"
$env:ANTHROPIC_API_KEY="999999999999" # All 9s for Claude Code
$env:OPENAI_API_KEY="999999999999" # All 9s for Codex
- Use from ANY language/tool:
# Python, Node.js, cURL, etc - all work through the proxy!
curl https://api.anthropic.com/v1/messages \
-H "x-api-key: 999999999999" \
-H "content-type: application/json" \
-d '{"model":"claude-3-sonnet-20240229","messages":[{"role":"user","content":"Hello"}],"max_tokens":50}'
By default the proxy only permits a curated set of /v1 API paths. The default configuration covers common Anthropic and OpenAI endpoints and falls back to allowing any path under /v1/.
To permit other endpoints you can either override the entire allow-list or extend it:
- Override with a comma-separated list via the ALLOWED_PATHS environment variable or the --allowed-paths option:
ALLOWED_PATHS="^/v1/my/endpoint$" python start_proxy.py
# or
python start_proxy.py --allowed-paths '^/v1/my/endpoint$'
- Extend the defaults by passing --allowed-path one or more times:
python start_proxy.py --allowed-path '^/v1/beta$' --allowed-path '^/v1/other$'
Patterns are regular expressions that are combined at startup. This allows new API endpoints to be exposed through the proxy without modifying the source code.
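As a rough sketch of how such an allow-list might be combined at startup, the following merges several patterns into one compiled regex. The default patterns and the `build_path_matcher` helper are illustrative assumptions, not the proxy's actual source:

```python
import re

# Hypothetical defaults -- the real list lives in the proxy source.
DEFAULT_ALLOWED_PATHS = [
    r"^/v1/messages$",          # Anthropic
    r"^/v1/chat/completions$",  # OpenAI
    r"^/v1/.*$",                # fallback: anything under /v1/
]

def build_path_matcher(extra_patterns=()):
    """Combine allow-list patterns into a single compiled regex."""
    patterns = DEFAULT_ALLOWED_PATHS + list(extra_patterns)
    return re.compile("|".join(f"(?:{p})" for p in patterns))

matcher = build_path_matcher(["^/v1/beta$"])
print(bool(matcher.match("/v1/messages")))   # True (allowed)
print(bool(matcher.match("/v2/forbidden")))  # False (blocked)
```

Joining patterns with `|` inside non-capturing groups keeps each pattern's anchors intact while allowing a single `match` call per request.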
from anthropic_router import create_client
# Use Codex locally
client = create_client(provider="codex", api_key="999999999999")
# Or use Claude Code locally
# client = create_client(provider="claude", api_key="999999999999")
# Or use cloud providers
# client = create_client(provider="anthropic", api_key="sk-ant-real-key")
message = client.messages.create(
model="claude-3-sonnet-20240229",
max_tokens=100,
messages=[{"role": "user", "content": "Hello, how are you?"}]
)
print(message.content[0].text)
Valid values for provider are "claude", "anthropic", "codex", and "openai". Passing any other value to create_client or via the AI_ROUTER_DEFAULT environment variable will raise a ValueError.
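The validation described above could look roughly like this. The `resolve_provider` helper and its fallback to "anthropic" are assumptions for illustration; the library's actual internals may differ:

```python
import os

VALID_PROVIDERS = {"claude", "anthropic", "codex", "openai"}

def resolve_provider(provider=None):
    """Resolve the provider argument, falling back to AI_ROUTER_DEFAULT.

    Raises ValueError for anything outside the four supported values.
    """
    chosen = provider or os.environ.get("AI_ROUTER_DEFAULT", "anthropic")
    if chosen not in VALID_PROVIDERS:
        raise ValueError(
            f"Unknown provider {chosen!r}; expected one of {sorted(VALID_PROVIDERS)}"
        )
    return chosen
```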
- When you create a client with an API key that's all 9s (e.g., "999999999999"), the router automatically routes requests to the local Claude Code or Codex CLI
- The router converts standard Anthropic/OpenAI API format to the respective local CLI format
- Responses from the local CLI are converted back to the standard API format
- Your code doesn't need to change—it behaves like the official Anthropic or OpenAI client
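The conversion steps above can be sketched as a pair of pure functions. Both function names, the prompt layout, and the token heuristic are illustrative assumptions, not the router's actual implementation:

```python
def messages_to_prompt(messages, system=None):
    """Flatten API-style messages into a single prompt string for a CLI tool."""
    lines = []
    if system:
        lines.append(f"System: {system}")
    for m in messages:
        lines.append(f"{m['role'].capitalize()}: {m['content']}")
    return "\n".join(lines)

def cli_output_to_response(text, model):
    """Wrap raw CLI output in an Anthropic-style response dict."""
    return {
        "type": "message",
        "role": "assistant",
        "model": model,
        "content": [{"type": "text", "text": text}],
        # Rough token estimate -- local CLIs don't report exact usage.
        "usage": {"output_tokens": max(1, len(text) // 4)},
    }
```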
See example.py for comprehensive examples including:
- Basic usage
- System prompts
- Multi-turn conversations
- Async operations
- Environment variable configuration
Run the examples:
python example.py
If you need to adapt an existing application so it can call Claude Code or Codex
directly via the locally installed CLIs (e.g., Claude Max or ChatGPT Pro
subscriptions), read docs/direct_llm_integration.md.
The guide explains how the proxy works, what preconditions must hold, and how to
translate API payloads into CLI prompts and back without ever storing API keys
in your codebase.
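A minimal sketch of invoking the CLIs directly might look like the following. The flags shown (`claude -p`, `codex exec`) are assumptions about the installed CLIs; check `claude --help` and `codex --help` for what your versions actually accept:

```python
import subprocess

def build_cli_command(provider, prompt):
    """Build the argv for a non-interactive CLI call.

    The flags here are assumptions -- verify them against your
    installed CLI versions before relying on this.
    """
    if provider == "claude":
        return ["claude", "-p", prompt]   # assumed print/non-interactive mode
    if provider == "codex":
        return ["codex", "exec", prompt]  # assumed non-interactive subcommand
    raise ValueError(f"No local CLI for provider {provider!r}")

def run_local_cli(provider, prompt, timeout=120):
    """Run the chosen CLI and return its trimmed stdout."""
    cmd = build_cli_command(provider, prompt)
    result = subprocess.run(
        cmd, capture_output=True, text=True, timeout=timeout, check=True
    )
    return result.stdout.strip()
```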
Run the test suite to verify the routing works correctly:
pytest
The following API key formats will trigger local routing (Claude Code or Codex):
- "999999999999" - Pure 9s
- "sk-ant-999999999999" or "sk-openai-999999999999" - With standard prefix
- Any string where the last segment (after splitting by -) is all 9s
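These rules can be expressed as a short predicate. This is an illustrative reimplementation of the check described above, not the library's own code, which may differ in detail:

```python
def is_local_routing_key(api_key: str) -> bool:
    """Return True when the key should trigger local CLI routing.

    The segment after the last '-' must be non-empty and consist only of 9s.
    """
    last_segment = api_key.split("-")[-1]
    return bool(last_segment) and set(last_segment) == {"9"}

print(is_local_routing_key("999999999999"))         # True
print(is_local_routing_key("sk-ant-999999999999"))  # True
print(is_local_routing_key("sk-ant-real-key"))      # False
```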
- Streaming is not yet supported when routing to Claude Code or Codex
- Token counting is approximate when using local CLI tools
- Some advanced API features may not be available through Claude Code or Codex
- proxy_server.py - HTTP/HTTPS proxy server
- claude_code_proxy_handler.py - Proxy request handler for Claude Code
- setup_proxy.py - One-time setup script for proxy
- start_proxy.sh - Convenient proxy launcher script
- start_proxy.py - Cross-platform proxy launcher script
- test_universal.py - Tests for multiple languages/tools
- anthropic_router.py - Anthropic/Claude Code routing logic
- openai_router.py - OpenAI/Codex routing logic
- claude_code_client.py - Claude Code CLI interface
- codex_client.py - Codex CLI interface
- example.py - Python library usage examples
- test_router.py - Anthropic/Claude Code routing tests
- test_openai_router.py - OpenAI/Codex routing tests
- requirements.txt - Python dependencies
- test_simple.py - Simple proxy test