<h1 align="center">Claude Gateway</h1>

<p align="center">Use your Claude Code subscription with any OpenAI-compatible tool</p>
Claude Gateway is an OpenAI-compatible HTTP proxy that lets you use your Claude Code subscription with any tool built for OpenAI's API.
No installation needed - run directly with npx:
```bash
npx claude-gateway start
```
That's it! The gateway will start an OpenAI-compatible server at http://localhost:45554.

Common commands:

```bash
npx claude-gateway start                          # Start the gateway server
npx claude-gateway start --port 8080              # Use custom port
npx claude-gateway start --api-key my-secret-key  # Enable API key authentication
npx claude-gateway start --verbose                # Full request logging
npx claude-gateway login                          # Authenticate with OAuth
npx claude-gateway status                         # Check authentication status
```
To run from source instead:

```bash
git clone https://gitlab.com/soapbox-pub/claude-gateway.git
cd claude-gateway
npm install
npm start
```
Point any OpenAI client at the gateway.

Python:

```python
from openai import OpenAI

client = OpenAI(
    api_key="not-used",  # Can be anything - gateway handles auth
    base_url="http://localhost:45554/v1",
)

response = client.chat.completions.create(
    model="claude-sonnet-4-5-20250929",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"}
    ]
)

print(response.choices[0].message.content)
```
JavaScript:

```javascript
import OpenAI from 'openai';

const client = new OpenAI({
  apiKey: 'not-used',  // Can be anything - gateway handles auth
  baseURL: 'http://localhost:45554/v1',
});

const response = await client.chat.completions.create({
  model: 'claude-sonnet-4-5-20250929',
  messages: [
    { role: 'system', content: 'You are a helpful assistant.' },
    { role: 'user', content: 'Hello!' }
  ]
});

console.log(response.choices[0].message.content);
```
curl:

```bash
curl -X POST http://localhost:45554/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your-api-key-here" \
  -d '{
    "model": "claude-sonnet-4-5-20250929",
    "messages": [
      {"role": "system", "content": "You are a helpful assistant."},
      {"role": "user", "content": "Hello!"}
    ]
  }'
```
Note: The `Authorization` header is only required if you started the server with `--api-key`.
Streaming works the same way as with OpenAI:

```python
from openai import OpenAI

client = OpenAI(
    api_key="not-used",
    base_url="http://localhost:45554/v1",
)

stream = client.chat.completions.create(
    model="claude-sonnet-4-5-20250929",
    messages=[{"role": "user", "content": "Write a story"}],
    stream=True
)

for chunk in stream:
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end='')
```
Claude Gateway supports sending images in your messages using the same format as OpenAI's API:
```python
from openai import OpenAI

client = OpenAI(
    api_key="not-used",
    base_url="http://localhost:45554/v1",
)

# Option 1: Image from URL
response = client.chat.completions.create(
    model="claude-sonnet-4-5-20250929",
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "What's in this image?"},
                {
                    "type": "image_url",
                    "image_url": {"url": "https://example.com/image.jpg"}
                }
            ]
        }
    ]
)

# Option 2: Base64-encoded image
response = client.chat.completions.create(
    model="claude-sonnet-4-5-20250929",
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "What's in this image?"},
                {
                    "type": "image_url",
                    "image_url": {"url": "data:image/jpeg;base64,/9j/4AAQSkZJRg..."}
                }
            ]
        }
    ]
)
```
Supported formats:

- Image URLs
- Base64 data URIs (`data:image/...;base64,...`) - see the sketch below for encoding a local file

More examples: See the `examples/` directory for complete vision examples in both Python and JavaScript.
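To send a local image, you have to build the base64 data URI yourself. A minimal sketch, assuming a local `photo.jpg`; the `to_data_uri` helper is ours, not part of the gateway:

```python
import base64
from pathlib import Path

from openai import OpenAI


def to_data_uri(path: str, mime: str = "image/jpeg") -> str:
    """Encode a local image file as a base64 data URI."""
    encoded = base64.b64encode(Path(path).read_bytes()).decode("ascii")
    return f"data:{mime};base64,{encoded}"


client = OpenAI(api_key="not-used", base_url="http://localhost:45554/v1")

response = client.chat.completions.create(
    model="claude-sonnet-4-5-20250929",
    messages=[{
        "role": "user",
        "content": [
            {"type": "text", "text": "What's in this image?"},
            {"type": "image_url", "image_url": {"url": to_data_uri("photo.jpg")}},
        ],
    }],
)
print(response.choices[0].message.content)
```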
How it works:

```
┌─────────────────────────┐
│ OpenAI-compatible Tool  │
│   (Python SDK, etc.)    │
└───────────┬─────────────┘
            │ POST /v1/chat/completions
            │ OpenAI format request
            ▼
┌──────────────────────────────────────────┐
│              Claude Gateway              │
│ 1. Translate OpenAI → Anthropic format   │
│ 2. Inject required system prompt         │
│ 3. Authenticate with OAuth               │
│ 4. Forward to Anthropic API              │
│ 5. Translate response back to OpenAI     │
└───────────┬──────────────────────────────┘
            │ Authenticated request
            ▼
┌──────────────────────────┐
│      Anthropic API       │
│  (Claude Code billing)   │
└──────────────────────────┘
```
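For intuition only, here is roughly what the translation step looks like as a pair of Python dicts. This is an illustrative sketch, not the gateway's actual code: the field names follow Anthropic's public Messages API, the `max_tokens` value is made up, and the injected system prompt is internal to the gateway.

```python
# OpenAI-style request the gateway receives (shape only).
openai_request = {
    "model": "claude-sonnet-4-5-20250929",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
}

# Roughly what gets forwarded to Anthropic's Messages API: the system
# message moves to the top-level `system` field (after the gateway's own
# required system prompt), and `max_tokens` is mandatory on Anthropic's side.
anthropic_request = {
    "model": "claude-sonnet-4-5-20250929",
    "max_tokens": 4096,  # illustrative value only
    "system": "<gateway-injected system prompt>\n\nYou are a helpful assistant.",
    "messages": [
        {"role": "user", "content": "Hello!"},
    ],
}
```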
Models can be discovered through the `/v1/models` endpoint.

Available Claude models:

- `claude-opus-4-5-20251101` - Most capable
- `claude-sonnet-4-5-20250929` - Balanced performance
- `claude-haiku-4-5-20251001` - Fast and efficient

To get the current model list:
```bash
curl http://localhost:45554/v1/models \
  -H "x-api-key: sk-ant-api03-..."
```
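You can also list models from the OpenAI SDK. A minimal sketch, assuming the gateway is on the default port; the `x-api-key` header carries your Anthropic API key, as this endpoint requires:

```python
from openai import OpenAI

client = OpenAI(
    api_key="not-used",
    base_url="http://localhost:45554/v1",
    # /v1/models needs an Anthropic API key in the x-api-key header.
    default_headers={"x-api-key": "sk-ant-api03-..."},
)

for model in client.models.list():
    print(model.id)
```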
By default, the gateway runs without authentication. To add basic security, use the `--api-key` flag:

```bash
npx claude-gateway start --api-key your-secret-key-here
```
When API key authentication is enabled, all requests to `/v1/chat/completions` must include the key in the `Authorization` header:
```python
from openai import OpenAI

client = OpenAI(
    api_key="your-secret-key-here",  # Must match the --api-key value
    base_url="http://localhost:45554/v1",
)
```
Important Notes:

- Both `Bearer <key>` and raw key formats are supported in the `Authorization` header
- The `/v1/models` endpoint does not require authentication
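As a quick illustration of the raw key format, a sketch using `requests` (assuming the gateway was started with `--api-key your-secret-key-here`):

```python
import requests

# Raw key format - no "Bearer " prefix; the gateway accepts either form.
response = requests.post(
    "http://localhost:45554/v1/chat/completions",
    headers={"Authorization": "your-secret-key-here"},
    json={
        "model": "claude-sonnet-4-5-20250929",
        "messages": [{"role": "user", "content": "Hello!"}],
    },
)
print(response.json()["choices"][0]["message"]["content"])
```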
CLI usage:

```bash
npx claude-gateway start [options]
```

| Option | Short | Description |
|---|---|---|
| `--port <port>` | | Set port (default: 45554) |
| `--api-key <key>` | | Require API key for authentication (optional) |
| `--verbose` | | Full request/response logging |
| | | One line per request |
| | | No request logging |
| | | Show help |
API endpoints:

- `POST /v1/chat/completions` - OpenAI Chat Completions API; the main endpoint you'll use.
- `GET /v1/models` - List available models (requires an Anthropic API key in the `x-api-key` header).
Claude Gateway uses OAuth to authenticate with your Claude Code subscription.
First run: the gateway walks you through an OAuth flow and asks you to paste the `code#state` value from the page back into the terminal.

Subsequent runs: the tokens saved in `.oauth-tokens.json` are reused automatically.
"No OAuth tokens found" → Gateway will automatically prompt you to authenticate on first run.
Port already in use → Use
npx claude-gateway start --port 8080
Authentication fails → Delete
.oauth-tokens.json and restart to re-authenticate.
Want to see what's happening? → Use
npx claude-gateway start --verbose
Tool calling behaves differently → Anthropic's tool calling is more strict than OpenAI's. Some parallel tool calls may not work.
"Multiple completions (n > 1) not supported" → Anthropic only returns one completion. Set
n=1 or omit the parameter.
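If you rely on tool calling, here is a minimal sketch of the standard OpenAI `tools` format going through the gateway. The `get_weather` function and its schema are made up for illustration, and behavior is subject to the stricter Anthropic semantics noted above:

```python
from openai import OpenAI

client = OpenAI(api_key="not-used", base_url="http://localhost:45554/v1")

# Hypothetical tool definition, only to show the request shape.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

response = client.chat.completions.create(
    model="claude-sonnet-4-5-20250929",
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
    tools=tools,
)

# If the model chose to call the tool, the call appears in tool_calls.
for call in response.choices[0].message.tool_calls or []:
    print(call.function.name, call.function.arguments)
```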
Supported:

- Chat completions (including system messages)
- Streaming responses
- Vision / image inputs
- Tool calling (with the caveats above)
Not Supported (Anthropic limitations):

- Multiple completions (`n > 1`)
- Log probabilities (`logprobs`)
- `presence_penalty` / `frequency_penalty` (ignored with warning)

Claude Gateway is a fork of anthropic-max-router by David Whatley, simplified to focus on OpenAI compatibility.
Original Author: nsxdavid (David Whatley)
Fork Maintainer: Alex Gleason [email protected]
Special thanks to OpenCode for the OAuth implementation reference.
MIT
⚠️ EDUCATIONAL AND RESEARCH PURPOSES
This project is provided for educational, research, and entertainment purposes only. It is not affiliated with, endorsed by, or sponsored by Anthropic PBC. Use of this software is at your own risk. The authors and contributors make no warranties and accept no liability for any damages or issues arising from use of this code. Users are responsible for ensuring their use complies with Anthropic's Terms of Service and all applicable laws. This software is provided "as-is" without any express or implied warranties.