LLM gateway configuration
This page covers how to configure Claude Code with LLM gateway solutions, including LiteLLM setup, authentication methods, and enterprise features like usage tracking and budget management.
Overview
LLM gateways provide a centralized proxy layer between Claude Code and model providers, offering:
- Centralized authentication - Single point for API key management
- Usage tracking - Monitor usage across teams and projects
- Cost controls - Implement budgets and rate limits
- Audit logging - Track all model interactions for compliance
- Model routing - Switch between providers without code changes
LiteLLM configuration
LiteLLM is a third-party proxy service. Anthropic doesn’t endorse, maintain, or audit LiteLLM’s security or functionality. This guide is provided for informational purposes and may become outdated. Use at your own discretion.
Prerequisites
- Claude Code updated to the latest version
- LiteLLM Proxy Server deployed and accessible
- Access to Claude models through your chosen provider
Basic LiteLLM setup
Configure Claude Code:
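A minimal sketch, assuming your LiteLLM proxy is reachable at `https://litellm-server:4000` (hostname and port are placeholders):

```bash
# Point Claude Code at the LiteLLM proxy instead of the Anthropic API
export ANTHROPIC_BASE_URL=https://litellm-server:4000
```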
Authentication methods
Static API key
Simplest method using a fixed API key:
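For example (the key value is a placeholder for a LiteLLM virtual key or any fixed key your proxy accepts):

```bash
# Fixed key that the proxy will accept on every request
export ANTHROPIC_AUTH_TOKEN=sk-litellm-static-key
```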
This value will be sent as the `Authorization` and `Proxy-Authorization` headers, although `Authorization` may be overwritten (see Vertex “Client-specified credentials” below).
Dynamic API key with helper
For rotating keys or per-user authentication (a combined sketch follows the list):
- Create an API key helper script:
- Configure Claude Code settings to use the helper:
- Set token refresh interval:
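A combined sketch of the three steps above, assuming the helper lives at `~/.claude/anthropic_key.sh` and reads the current key from a secret store (the `vault` lookup is a placeholder for whatever your organization uses):

```bash
#!/bin/bash
# ~/.claude/anthropic_key.sh - print the current LiteLLM key to stdout
vault kv get -field=api_key secret/litellm/claude  # placeholder lookup
```

Make the script executable and reference it from `~/.claude/settings.json`:

```json
{
  "apiKeyHelper": "~/.claude/anthropic_key.sh"
}
```

Then control how often the helper is re-run (value in milliseconds; one hour shown here):

```bash
export CLAUDE_CODE_API_KEY_HELPER_TTL_MS=3600000
```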
This value will be sent as the `Authorization`, `Proxy-Authorization`, and `X-Api-Key` headers, although `Authorization` may be overwritten (see Google Vertex AI through LiteLLM). The `apiKeyHelper` has lower precedence than `ANTHROPIC_AUTH_TOKEN` or `ANTHROPIC_API_KEY`.
Provider-specific configurations
Anthropic API through LiteLLM
Using pass-through endpoint:
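A sketch under the same placeholder hostname, assuming LiteLLM’s Anthropic pass-through route is mounted at `/anthropic`:

```bash
export ANTHROPIC_BASE_URL=https://litellm-server:4000/anthropic
```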
Amazon Bedrock through LiteLLM
Using pass-through endpoint:
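A sketch under the same placeholder hostname; the `/bedrock` path assumes LiteLLM’s Bedrock pass-through route, and the skip-auth flag lets the proxy supply the AWS credentials:

```bash
export ANTHROPIC_BEDROCK_BASE_URL=https://litellm-server:4000/bedrock
export CLAUDE_CODE_SKIP_BEDROCK_AUTH=1
export CLAUDE_CODE_USE_BEDROCK=1
```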
Google Vertex AI through LiteLLM
Using pass-through endpoint:
Recommended: Proxy-specified credentials
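A sketch assuming the proxy holds the GCP service-account credentials; the `/vertex_ai/v1` path and project ID are placeholders for your deployment:

```bash
export ANTHROPIC_VERTEX_BASE_URL=https://litellm-server:4000/vertex_ai/v1
export ANTHROPIC_VERTEX_PROJECT_ID=your-gcp-project-id
export CLAUDE_CODE_USE_VERTEX=1
export CLAUDE_CODE_SKIP_VERTEX_AUTH=1
```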
Alternative: Client-specified credentials
If you prefer to use local GCP credentials (a combined sketch follows the list):
- Authenticate with GCP locally:
- Set Claude Code environment:
- Update LiteLLM header configuration:
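A combined sketch of the three steps above; the project ID, region, and pass-through path are placeholders for your deployment:

```bash
# Authenticate with GCP locally so Claude Code can obtain its own tokens
gcloud auth application-default login

# Point Claude Code at the proxy while keeping local Vertex authentication
export ANTHROPIC_VERTEX_BASE_URL=https://litellm-server:4000/vertex_ai/v1
export ANTHROPIC_VERTEX_PROJECT_ID=your-gcp-project-id
export CLAUDE_CODE_USE_VERTEX=1
export CLOUD_ML_REGION=us-east5
```

In the LiteLLM proxy config, move the LiteLLM key to the `Proxy-Authorization` header:

```yaml
general_settings:
  litellm_key_header_name: "Proxy-Authorization"
```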
Ensure your LiteLLM config has `general_settings.litellm_key_header_name` set to `Proxy-Authorization`, since the pass-through GCP token will be sent on the `Authorization` header.
Unified endpoint
Using LiteLLM’s Anthropic format endpoint:
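A sketch assuming LiteLLM serves its Anthropic-format `/v1/messages` route directly at the base URL, so no provider-specific path is needed:

```bash
export ANTHROPIC_BASE_URL=https://litellm-server:4000
```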
Model selection
By default, Claude Code uses the models specified in Model configuration.
If you have configured custom model names in LiteLLM, set the model environment variables to those custom names, as shown below.
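For example, assuming you defined custom model names in LiteLLM (the names shown are placeholders), point the primary and small/fast model variables at them:

```bash
export ANTHROPIC_MODEL=litellm-claude-sonnet
export ANTHROPIC_SMALL_FAST_MODEL=litellm-claude-haiku
```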
For more detailed information, refer to the LiteLLM documentation.