Claude Code Router (CCR) is a proxy server that sits between Claude Code and any LLM provider, routing AI coding requests to models from OpenRouter, DeepSeek, Ollama, Gemini, Volcengine, SiliconFlow, and more, with no Anthropic account required. It transforms requests and responses in real time so that Claude Code works seamlessly with non-Anthropic backends, and it adds intelligent scenario-based routing, a plugin system for custom transformers, a preset marketplace, and a web-based management UI.
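As a sketch of what such routing can look like in practice, a CCR-style configuration might declare upstream providers and a routing rule along these lines. The field names below are illustrative assumptions, not a verbatim schema; consult the project's README for the actual configuration format:

```json
{
  "Providers": [
    {
      "name": "deepseek",
      "api_base_url": "https://api.deepseek.com/chat/completions",
      "api_key": "sk-...",
      "models": ["deepseek-chat"]
    }
  ],
  "Router": {
    "default": "deepseek,deepseek-chat"
  }
}
```

The idea is that Claude Code keeps talking to a single local endpoint while the router decides, per request, which provider and model actually serve it.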

What Problem Does It Solve?
Claude Code is a remarkably capable AI coding assistant, but it is tightly coupled to Anthropic's API. Claude Code Router decouples that relationship by acting as a local translation layer. When Claude Code sends a request formatted for Anthropic's API, CCR intercepts it, applies the appropriate transformations for your chosen provider, forwards it, then translates the response back into the format Claude Code expects. This means you can use Claude Code's full feature set — tool calling, streaming, thinking/reasoning, sub-agents — with virtually any LLM provider that offers an OpenAI-compatible chat completions endpoint.
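The translation step described above can be sketched as a small function that rewrites an Anthropic-style `/v1/messages` payload into an OpenAI-compatible `/chat/completions` payload. The interfaces below are simplified subsets of the two public API shapes, and the function is an illustrative assumption about the kind of mapping CCR performs, not its actual code:

```typescript
// Simplified subset of an Anthropic Messages API request.
interface AnthropicRequest {
  model: string;
  system?: string; // Anthropic carries the system prompt in a top-level field
  max_tokens: number;
  messages: { role: "user" | "assistant"; content: string }[];
}

// Simplified subset of an OpenAI Chat Completions request.
interface OpenAIRequest {
  model: string;
  max_tokens: number;
  messages: { role: "system" | "user" | "assistant"; content: string }[];
}

// Hypothetical translation: move the system prompt into the message list
// (where OpenAI-compatible APIs expect it) and swap in the target model name.
function toOpenAI(req: AnthropicRequest, targetModel: string): OpenAIRequest {
  const messages: OpenAIRequest["messages"] = [];
  if (req.system) {
    messages.push({ role: "system", content: req.system });
  }
  messages.push(...req.messages);
  return { model: targetModel, max_tokens: req.max_tokens, messages };
}

const out = toOpenAI(
  {
    model: "claude-sonnet",
    system: "Be concise.",
    max_tokens: 1024,
    messages: [{ role: "user", content: "Hello" }],
  },
  "deepseek-chat",
);
console.log(out.messages[0].role); // "system"
```

A real router also has to translate the response stream back (including tool calls and reasoning blocks), but the request-side mapping above captures the basic shape of the job.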
Sources: README.md, packages/core/package.json
Architecture at a Glance
Claude Code Router is organized as a pnpm monorepo with four main packages, each responsible for a distinct layer of the system. The following diagram illustrates how a typical request flows from Claude Code through the router to an upstream LLM provider and back again.
