feat: add GitHub Copilot as a third provider #1

Open
rujaksoto wants to merge 1 commit into rizqme:main from rujaksoto:feat/copilot-provider

Conversation

@rujaksoto

Summary

Add support for GitHub Copilot subscriptions alongside the existing Claude MAX and ChatGPT providers. This enables routing API requests through Copilot using GitHub's Device Flow OAuth for authentication.

New Files

  • src/copilot-oauth.ts — GitHub Device Flow OAuth (RFC 8628) with device code request, polling, and browser auto-open
  • src/copilot-token-manager.ts — Token persistence to ~/.copilot-token.json with load/save/validate helpers
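
The device-flow polling in copilot-oauth.ts presumably follows RFC 8628's token-endpoint error semantics. A minimal sketch of that polling state machine (names are illustrative, not the PR's actual identifiers):

```typescript
// RFC 8628 §3.5: how a device-flow client reacts to each token-endpoint
// response while polling. Hypothetical helper; not copied from
// src/copilot-oauth.ts.
type PollResult =
  | { action: "retry"; intervalSeconds: number } // keep polling
  | { action: "done"; accessToken: string }      // user approved in browser
  | { action: "fail"; reason: string };          // give up

function handleTokenPollResponse(
  body: { error?: string; access_token?: string },
  currentIntervalSeconds: number,
): PollResult {
  if (body.access_token) {
    return { action: "done", accessToken: body.access_token };
  }
  switch (body.error) {
    case "authorization_pending":
      // User hasn't approved yet; poll again at the same interval.
      return { action: "retry", intervalSeconds: currentIntervalSeconds };
    case "slow_down":
      // RFC 8628 requires adding 5 seconds to the polling interval.
      return { action: "retry", intervalSeconds: currentIntervalSeconds + 5 };
    case "expired_token":
      return { action: "fail", reason: "device code expired; restart auth" };
    default:
      return { action: "fail", reason: body.error ?? "unknown error" };
  }
}
```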

Modified Files

  • src/router/server.ts — Full Copilot provider integration: auth state hydration, request headers (Openai-Intent, x-initiator, Copilot-Vision-Request, User-Agent), routing in all three endpoints (messages, chat/completions, responses), startup logging
  • src/router/model-mapper.ts — Copilot model mapping and responses API detection (shouldUseCopilotResponsesApi for GPT-5+ models)
  • src/commands.ts — Copilot verify, models listing, OAuth, logout, and status snapshot
  • src/bin/code-router.ts — auth copilot and logout copilot commands, updated help text
  • src/legacy-cli.ts — Copilot options in auth/logout/verify menus
  • src/types.ts — Added 'copilot' to Provider union type
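
The GPT-5+ detection mentioned for model-mapper.ts can be imagined as a version check on the model id. A hedged sketch (the real shouldUseCopilotResponsesApi may use different logic):

```typescript
// Guess at the shape of the GPT-5+ check in src/router/model-mapper.ts:
// route a model to Copilot's Responses API when its GPT major version is
// 5 or higher, and to Chat Completions otherwise. Illustrative only.
function shouldUseCopilotResponsesApi(model: string): boolean {
  const match = /^gpt-(\d+)/.exec(model);
  if (!match) return false;      // claude-*, o4-mini, etc.
  return Number(match[1]) >= 5;  // gpt-5, gpt-5-mini, gpt-5.3-codex
}
```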

Usage

# Authenticate with GitHub Copilot
code-router auth copilot

# Enterprise GitHub
code-router auth copilot --enterprise-url https://github.example.com

# Verify subscription works
code-router verify --provider copilot

# Start the router (auto-detects Copilot auth)
code-router serve

# Route requests through Copilot
curl http://localhost:3344/v1/chat/completions \
  -H 'Content-Type: application/json' \
  -H 'x-provider-hint: copilot' \
  -d '{"model":"gpt-4o","messages":[{"role":"user","content":"Hi"}]}'

API Details

  • Base URL: https://api.githubcopilot.com (or https://copilot-api.<domain> for enterprise)
  • Auth: GitHub OAuth token via Device Flow (client_id: Ov23li8tweQw6odWQebz)
  • Chat Completions API for most models, Responses API for GPT-5+ models
  • Tokens stored at ~/.copilot-token.json (mode 0600)
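
Putting the details above together, assembling a Copilot request could look like the following sketch. The header names come from this PR; the specific values (intent, user agent) are assumptions for illustration, not copied from server.ts:

```typescript
// Assemble base URL and headers for a Copilot API call. Hypothetical
// helpers; the actual code lives in src/router/server.ts.
function copilotBaseUrl(enterpriseDomain?: string): string {
  return enterpriseDomain
    ? `https://copilot-api.${enterpriseDomain}` // enterprise pattern
    : "https://api.githubcopilot.com";
}

function copilotHeaders(token: string, hasImages: boolean): Record<string, string> {
  return {
    Authorization: `Bearer ${token}`,
    "Content-Type": "application/json",
    "Openai-Intent": "conversation-edits",       // assumed value
    "x-initiator": "user",                       // assumed value
    "Copilot-Vision-Request": String(hasImages), // "true" when images present
    "User-Agent": "code-router",                 // assumed value
  };
}
```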

Testing

  • TypeScript compiles cleanly (tsc --noEmit)
  • OAuth device flow tested end-to-end
  • Subscription verified: gpt-4o responded correctly
  • Router starts with Copilot auth detected in status output
  • Chat completions endpoint tested with live Copilot API

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
@rizqme
Owner

rizqme commented Mar 16, 2026

Please test in opencode:

  • create a custom provider in opencode and add the image modality
  • check whether image input works
  • check priority precedence (which subscription is used first); do we need a manual override?

Here's an example opencode provider config:

{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "coderouter": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Code Router",
      "options": {
        "baseURL": "http://localhost:3344/v1"
      },
      "models": {
        "gpt-5.3-codex": {
          "name": "GPT-5.3 Codex",
          "modalities": {
            "input": ["text", "image"],
            "output": ["text"]
          }
        },
        "claude-sonnet-4-6": {
          "name": "Claude Sonnet 4.6",
          "modalities": {
            "input": ["text", "image"],
            "output": ["text"]
          }
        }
      }
    }
  }
}

@rujaksoto
Author

Testing Results with OpenCode

Setup

Custom provider config in opencode.json:

{
  "provider": {
    "coderouter": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Code Router",
      "options": { "baseURL": "http://localhost:3344/v1" },
      "models": {
        "gpt-4.1": { "name": "GPT-4.1", "modalities": { "input": ["text", "image"], "output": ["text"] } },
        "gpt-4o": { "name": "GPT-4o", "modalities": { "input": ["text", "image"], "output": ["text"] } },
        "gpt-5-mini": { "name": "GPT-5 Mini", "modalities": { "input": ["text", "image"], "output": ["text"] } },
        "claude-sonnet-4": { "name": "Claude Sonnet 4", "modalities": { "input": ["text", "image"], "output": ["text"] } }
      }
    }
  }
}

Copilot Supported Models (tested)

Model               Status
gpt-4o              ✅ Works
gpt-4.1             ✅ Works
gpt-5-mini          ✅ Works
claude-sonnet-4     ✅ Works
gpt-5               ❌ Not supported
gpt-5.3-codex       ❌ Not supported
claude-sonnet-4-6   ❌ Not supported
o4-mini             ❌ Not supported

1. Text — ✅ Working

$ opencode run -m coderouter/gpt-4.1 "Reply with exactly: hello world"
hello world

$ opencode run -m coderouter/claude-sonnet-4 "Reply with exactly: copilot test ok"
copilot test ok

2. Image Input — ✅ Working

Tested with a 100x100 red PNG:

$ opencode run -m coderouter/gpt-4.1 -f test-image.png -- "What color is this image? Reply with just the color name."
Red

$ opencode run -m coderouter/claude-sonnet-4 -f test-image.png -- "What color is this image? Reply with just the color name."
Red

3. Priority Precedence

Current routing logic in resolveChatProvider():

Priority  Condition                                               Provider selected
1         Only one provider configured                            That provider
2         GPT model + OpenAI available                            openai
3         Claude model + Anthropic available                      anthropic
4         Explicit header hint (x-code-router-provider: copilot)  Hinted provider
5         CODE_ROUTER_DEFAULT_CHAT_PROVIDER env var               Env value
6         OpenAI API key detected                                 openai
7         Fallback                                                anthropic
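
The precedence table can be condensed into a pure function. A sketch assuming each rule is checked strictly in table order (names hypothetical, not the actual resolveChatProvider() from server.ts):

```typescript
// Hypothetical condensation of the precedence table above.
type Provider = "openai" | "anthropic" | "copilot";

interface RoutingInput {
  model: string;
  configured: Provider[];   // providers with working auth
  headerHint?: Provider;    // from the x-code-router-provider header
  envDefault?: Provider;    // CODE_ROUTER_DEFAULT_CHAT_PROVIDER
  openaiKeyDetected: boolean;
}

function resolveProvider(input: RoutingInput): Provider {
  const has = (p: Provider) => input.configured.includes(p);
  if (input.configured.length === 1) return input.configured[0];              // rule 1
  if (input.model.startsWith("gpt-") && has("openai")) return "openai";       // rule 2
  if (input.model.startsWith("claude-") && has("anthropic")) return "anthropic"; // rule 3
  if (input.headerHint) return input.headerHint;                              // rule 4
  if (input.envDefault) return input.envDefault;                              // rule 5
  if (input.openaiKeyDetected) return "openai";                               // rule 6
  return "anthropic";                                                         // rule 7
}
```

This mirrors the key observation below: with Copilot as the sole provider, rule 1 fires before the model-name rules, so GPT and Claude models alike route through Copilot.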

Key observation: When Copilot is the only provider, all models route through it correctly (GPT and Claude models alike). When multiple providers are configured, GPT models prefer OpenAI and Claude models prefer Anthropic — Copilot is only selected via explicit hint or as the sole provider.

Manual override options:

  • Header: x-code-router-provider: copilot per-request
  • Env: CODE_ROUTER_DEFAULT_CHAT_PROVIDER=copilot globally
  • No additional override mechanism needed — the env var handles it

Verbose Routing Logs

Routing: model=gpt-4.1 model_provider=openai hinted=none openai_available=false anthropic_available=false resolved_provider=copilot
Routing: model=claude-sonnet-4 model_provider=anthropic hinted=anthropic openai_available=false anthropic_available=false resolved_provider=copilot

When Copilot is the only configured provider, both GPT and Claude models correctly route through it.
