Open Source

One proxy.
All AI coding agents.

Use Claude Code, OpenAI Codex, Gemini, and 10+ other providers through Cursor or any OpenAI-compatible client. Centralize subscriptions, track usage, and control access.

# Start the proxy
code-proxy

# Point Cursor at your public URL
Base URL: https://your-server.com/v1

# Pick any model — prefix selects the provider
cc/claude-sonnet-4-6      # Claude Code agent (subscription)
codex/5.3                 # Codex agent (subscription)
anthropic/claude-opus-4-6 # Anthropic API (pay-per-token)
openai/gpt-5.3            # OpenAI API (pay-per-token)
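The routing convention above is purely name-based: everything before the first slash selects the backend, and the rest names the upstream model. A quick shell illustration of the convention (not part of the proxy itself):

```shell
# Split a proxy model name into provider prefix and upstream model
MODEL="anthropic/claude-opus-4-6"
PROVIDER="${MODEL%%/*}"   # text before the first '/'
UPSTREAM="${MODEL#*/}"    # text after the first '/'
echo "$PROVIDER routes to the Anthropic API, model $UPSTREAM"
```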

CLI subscription vs API tokens

AI coding tools have two fundamentally different modes. Code Proxy lets you use both through a single endpoint.

CLI Mode (Subscription)

Full Code Agent

Spawns the actual claude or codex binary. The AI reads your files, runs commands, edits code. Uses your subscription — not API tokens.

cc/claude-sonnet-4-6 codex/5.3 gc/gemini-2.5-pro
API Mode (Pay-per-token)

Chat Only

Direct HTTP to provider API. Answers questions, reviews code. Uses API keys or OAuth — pay per token consumed.

anthropic/claude-opus-4-6 openai/gpt-5.3 gemini/gemini-2.5-pro

Everything you need

🔗

Unified endpoint

Claude, GPT, Gemini, DeepSeek, Groq, Together, Ollama — all through /v1/chat/completions.
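Because everything funnels through /v1/chat/completions, any OpenAI-style client can talk to any backend. A sketch with curl, assuming the proxy is reachable at https://your-server.com and PROXY_API_KEY holds a proxy-issued key (both placeholders):

```shell
# Standard OpenAI-compatible request; the model prefix picks the backend
curl https://your-server.com/v1/chat/completions \
  -H "Authorization: Bearer $PROXY_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "cc/claude-sonnet-4-6",
    "messages": [{"role": "user", "content": "Explain this repo"}]
  }'
```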

👥

Multi-account pooling

Connect multiple accounts per provider. Auto round-robin, cooldown on rate limits, automatic OAuth token refresh.

📊

Usage analytics

Track requests, tokens, and estimated costs per model, per account, per API key. Full request log with filtering.

🔑

Access control

Create API keys for team members, toggle or revoke access instantly. Dashboard password protection.

🌐

Cloudflare tunnel

Built-in tunnel support with Named Tunnel for a fixed URL. Share with your team without port forwarding or nginx.

Effort levels

Append an effort suffix, from :low up to :max, to any CLI model name to control how thoroughly the agent works.
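For example (:low and :max are the endpoints named above; any intermediate levels are not specified here):

```
cc/claude-sonnet-4-6:low   # fastest, least thorough
cc/claude-sonnet-4-6:max   # slowest, most thorough
```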

Supported providers

Claude Code (CLI)
Anthropic API (API)
OpenAI Codex (CLI)
OpenAI API (API)
Gemini CLI (CLI)
Gemini API (API)
DeepSeek (API)
Groq (API)
Together (API)
Ollama (Local)
Any OpenAI-compatible API

Get started in 60 seconds

From source

git clone https://github.com/rodrigorodriguescosta/code-proxy.git
cd code-proxy
go build -o code-proxy .
./code-proxy

Go install

go install github.com/rodrigorodriguescosta/code-proxy@latest
code-proxy

Download binary

# Grab from GitHub Releases
wget .../code-proxy-linux-amd64
chmod +x code-proxy-linux-amd64
./code-proxy-linux-amd64

Deploy on VPS

# Single binary, no dependencies
export PORT=3456
export PROXY_REQUIRE_API_KEY=true
./code-proxy

# Built-in Cloudflare tunnel
# Enable in Dashboard → Settings
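To keep the proxy running across reboots on a VPS, a systemd unit works well. A minimal sketch, assuming the binary is installed at /usr/local/bin/code-proxy and runs as a dedicated codeproxy user (both assumptions):

```ini
# /etc/systemd/system/code-proxy.service
[Unit]
Description=Code Proxy
After=network-online.target
Wants=network-online.target

[Service]
User=codeproxy
Environment=PORT=3456
Environment=PROXY_REQUIRE_API_KEY=true
ExecStart=/usr/local/bin/code-proxy
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

Enable it with `systemctl enable --now code-proxy` after placing the file.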
Cursor does not support localhost base URLs, so you must expose Code Proxy via a public URL: deploy on a VPS with a public IP, or use a Cloudflare Named Tunnel with a custom domain (e.g. proxy.yourdomain.com). Quick Tunnels generate random URLs that change on every restart; use a Named Tunnel for a fixed URL.

Configure Cursor

1. Start Code Proxy, open the dashboard at http://localhost:3456, and connect your accounts.
2. Expose the proxy via a VPS, or enable a Named Tunnel (Dashboard → Tunnel) for a fixed public URL.
3. In Cursor → Settings → Models, set Base URL to https://your-public-url/v1.
4. Pick a model with the right prefix — done!
Built by Rodrigo Rodrigues

Software Engineer — full-stack Go + Vue.js. Creator of Code Proxy and Sigeflex ERP.

@rodrigorodriguescosta