Use Claude and GPT across AI coding tools through one local gateway, with both OpenAI-compatible and Claude-native API surfaces.
launchdock gives you one local endpoint for tools like OpenCode, Codex, Claude Code, Droid, and Pi.
Why people use it:
```sh
curl -fsSL https://raw.githubusercontent.com/nghyane/launchdock/main/install.sh | sh
launchdock version
```
Optional:
```sh
curl -fsSL https://raw.githubusercontent.com/nghyane/launchdock/main/install.sh | env LAUNCHDOCK_VERSION=v0.1.1 sh
curl -fsSL https://raw.githubusercontent.com/nghyane/launchdock/main/install.sh | env INSTALL_DIR=/usr/local/bin sh
```
```sh
launchdock auth login claude
launchdock auth login openai
launchdock auth list
launchdock start
launchdock launch opencode --config
```
`launchdock launch <tool>` checks credentials, starts the local runtime if needed, writes tool config when required, and launches the tool.
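That flow can be sketched as pseudologic. The function and step names below are purely illustrative, not launchdock's actual internals:

```python
def launch_plan(tool: str, write_config: bool) -> list[str]:
    """Illustrative outline of what `launchdock launch <tool>` does."""
    steps = ["check credentials"]            # fail fast if no provider is logged in
    steps.append("start runtime if needed")  # local gateway on :8090
    if write_config:                         # e.g. `--config` for opencode
        steps.append("write tool config")    # point the tool at the local endpoint
    steps.append(f"exec {tool}")             # finally hand off to the tool
    return steps
```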
Check the local API:
```sh
curl http://localhost:8090/v1/models
```
Supported auth sources:
- `launchdock auth login claude`
- `launchdock auth login openai`

Why this auth model is useful:
```sh
launchdock auth push [email protected]
ssh [email protected] '$HOME/.local/bin/launchdock start'
```
`auth push` installs or updates launchdock on the remote host automatically, then imports your managed credentials.
- `claude-code`
- `codex`
- `opencode`
- `droid`
- `pi`

launchdock exposes both Claude and GPT models from one local provider.
Claude thinking aliases are also available:
- `claude-sonnet-4-6-thinking`
- `claude-opus-4-6-thinking`

These aliases automatically enable Claude thinking for OpenAI-compatible clients that do not serialize Claude thinking config reliably.
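In an OpenAI-compatible client, enabling thinking then reduces to choosing the model name. A tiny sketch of the alias pattern (derived from the alias names, not from launchdock's code):

```python
def with_thinking(model: str) -> str:
    """Map a Claude model name to its `-thinking` alias; idempotent."""
    return model if model.endswith("-thinking") else f"{model}-thinking"
```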
launchdock is tested with the official OpenAI Python SDK and works across both Claude and GPT models.
Validated locally with claude-opus-4-6 and gpt-5.4 for:
Supported API surfaces:
- `/v1/chat/completions`
- `/v1/responses`
- `/v1/messages`

| Endpoint | GPT | Claude | Semantics | Tools |
|---|---|---|---|---|
| `/v1/chat/completions` | ✅ | ✅ | OpenAI-compatible chat | ✅ |
| `/v1/responses` | ✅ | ✅ | Native OpenAI Responses semantics | ✅ |
| `/v1/messages` | — | ✅ | Native Claude Messages format | ✅ |
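As a sketch, here is how a `/v1/chat/completions` request against the local gateway can be built with only the Python standard library. The bearer token value is a placeholder (a local gateway typically ignores it), and the model name is one of the examples mentioned in this README:

```python
import json
import urllib.request

GATEWAY = "http://localhost:8090/v1"

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-compatible chat completion request for the local gateway."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{GATEWAY}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer local",  # placeholder; gateway handles real auth
        },
        method="POST",
    )
```

Sending the request with `urllib.request.urlopen` requires the runtime to be up (`launchdock start`) first.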
- `/v1/chat/completions`: `choices[].delta.reasoning_content`
- `/v1/responses`: `response.reasoning_summary_text.delta`, reasoning output items, `function_call` items/events
- `/v1/messages`

Claude OAuth internally prefixes tool names with `mcp_` when talking to Anthropic. launchdock strips that prefix before returning tool/function names to clients, so clients still see the original tool name (for example `get_weather`).
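The stripping step amounts to removing the prefix when present and passing other names through untouched. A minimal sketch of that behavior (not launchdock's actual code):

```python
MCP_PREFIX = "mcp_"

def restore_tool_name(name: str) -> str:
    """Undo the mcp_ prefix Claude OAuth adds upstream, so clients
    see the tool name they originally registered."""
    return name[len(MCP_PREFIX):] if name.startswith(MCP_PREFIX) else name
```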
`launchdock launch opencode --config` merges into an existing OpenCode config instead of overwriting unrelated keys.
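That merge behavior is essentially a recursive dictionary merge: incoming keys win on conflict, but unrelated keys in the existing config survive. A sketch of the idea (the config keys in the usage below are hypothetical, not OpenCode's real schema):

```python
def deep_merge(existing: dict, updates: dict) -> dict:
    """Merge `updates` into `existing`, recursing into nested dicts
    so sibling keys in the existing config are preserved."""
    out = dict(existing)
    for key, value in updates.items():
        if isinstance(value, dict) and isinstance(out.get(key), dict):
            out[key] = deep_merge(out[key], value)
        else:
            out[key] = value
    return out
```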
For Claude thinking in OpenCode, prefer the model aliases:
- `claude-sonnet-4-6-thinking`
- `claude-opus-4-6-thinking`

These aliases are more reliable than passing client-side Claude thinking config through a custom OpenAI-compatible provider.
```sh
launchdock auth
launchdock launch [tool]
launchdock start | ps | logs | restart | stop
launchdock update
```
launchdock runs on `http://localhost:8090`.
State lives in:

- `~/.launchdock/launchdock.pid`
- `~/.launchdock/launchdock.log`
- `~/.config/launchdock/config.json`

Legacy llm-mux code is preserved on `legacy/llm-mux`.