Provider

Route Anthropic through Stockyard

Add cost tracking, caching, failover, and 76 middleware modules to your Anthropic requests. One URL change, no SDK swap.

Environment variable: ANTHROPIC_API_KEY
Models: claude-sonnet-4-5-20250929, claude-3-5-haiku, claude-3-opus
Failover to: OpenAI GPT-4o, Google Gemini, or DeepSeek
API format: OpenAI-compatible

Why proxy Anthropic?

Anthropic uses a different API format than OpenAI. Stockyard translates between them transparently, so you can call Claude through the same /v1/chat/completions endpoint your app already uses for OpenAI. No SDK swap, no format conversion in your code.

This also means you can fail over between Claude and GPT-4o without changing your application. Set up model aliasing to route default-model to Claude, with GPT-4o as the backup.
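
As a sketch of what that looks like from the application side: assuming you have configured an alias named default-model in Stockyard (resolving to Claude, with GPT-4o as the fallback), your requests reference only the alias, so failover never touches application code. The alias name here is illustrative.

```python
import json

# Hypothetical sketch: "default-model" is an alias configured in Stockyard
# that resolves to Claude, with GPT-4o as the backup. The client never names
# a concrete model, so switching or failing over is a proxy-side change.
payload = {
    "model": "default-model",  # resolved server-side by the proxy
    "messages": [{"role": "user", "content": "hello"}],
}
body = json.dumps(payload)
```

Because the payload is ordinary OpenAI chat-completions JSON, this works with curl or any OpenAI-compatible SDK unchanged.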

Quick start

# Install Stockyard
curl -fsSL https://stockyard.dev/install.sh | sh

# Set your Anthropic API key
export ANTHROPIC_API_KEY=your-key-here

# Start the proxy
stockyard
# Provider: anthropic (from ANTHROPIC_API_KEY)
# Proxy listening on :4200

# Send a request through the proxy
curl http://localhost:4200/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model":"claude-sonnet-4-5-20250929","messages":[{"role":"user","content":"hello"}]}'

Good to know

Anthropic has different token counting and pricing than OpenAI. Stockyard normalizes both so your cost dashboard shows accurate per-request costs regardless of provider.
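
The arithmetic behind that normalization is simple: tokens times the per-million-token rate for the model that actually served the request. A minimal sketch (the rates below are illustrative placeholders; check each provider's pricing page for current numbers):

```python
# Illustrative per-million-token rates in USD -- placeholders, not a
# guarantee of current Anthropic or OpenAI pricing.
PRICES_PER_MTOK = {
    "claude-sonnet-4-5-20250929": {"input": 3.00, "output": 15.00},
    "gpt-4o": {"input": 2.50, "output": 10.00},
}

def request_cost_usd(model: str, prompt_tokens: int, completion_tokens: int) -> float:
    """Cost of one request: each token count times its per-Mtok rate."""
    p = PRICES_PER_MTOK[model]
    return (prompt_tokens * p["input"] + completion_tokens * p["output"]) / 1_000_000

cost = request_cost_usd("claude-sonnet-4-5-20250929", 1200, 300)
# 1200 * 3.00 + 300 * 15.00 = 8100; 8100 / 1,000,000 = $0.0081
```

The point of per-provider rate tables is that the same dashboard math works whether the request was served by Claude or by a failover model.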

Use Claude with the OpenAI SDK

Anthropic has its own SDK and API format. With Stockyard, you need neither: send standard OpenAI-format requests and Stockyard translates them to Anthropic's format automatically:

# Use the OpenAI Python SDK to call Claude
from openai import OpenAI

# The SDK requires an api_key value; the proxy holds your real Anthropic
# key (from ANTHROPIC_API_KEY), so a placeholder is fine here.
client = OpenAI(base_url="http://localhost:4200/v1", api_key="stockyard")

response = client.chat.completions.create(
    model="claude-sonnet-4-5-20250929",
    messages=[{"role": "user", "content": "hello"}]
)
print(response.choices[0].message.content)

This means you can switch between OpenAI and Anthropic models without changing your SDK, your error handling, or your response parsing. Just change the model name.
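
To make that concrete: because Stockyard returns OpenAI-format response bodies for every provider, one parser serves both models. A minimal sketch (the response dicts below are hand-written illustrations, not captured proxy output):

```python
# One extraction function works for any model behind the proxy, because
# the response shape is always OpenAI chat-completions format.
def extract_reply(response: dict) -> str:
    return response["choices"][0]["message"]["content"]

# Illustrative response bodies -- field values are made up for the example.
claude_resp = {"model": "claude-sonnet-4-5-20250929",
               "choices": [{"message": {"role": "assistant", "content": "Hi!"}}]}
gpt_resp = {"model": "gpt-4o",
            "choices": [{"message": {"role": "assistant", "content": "Hi!"}}]}

assert extract_reply(claude_resp) == extract_reply(gpt_resp)
```

Error handling gets the same benefit: one set of exception paths, regardless of which upstream provider answered.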

Route Anthropic through Stockyard in under 60 seconds.

Install Guide

All 16 providers · Proxy-only mode · What is an LLM proxy? · vs LiteLLM · vs Braintrust