Add cost tracking, caching, failover, and 76 middleware modules to your Azure OpenAI requests. One URL change, no SDK swap.
Azure OpenAI gives you OpenAI models hosted on Azure infrastructure with enterprise SLAs and data residency. But it uses a different authentication scheme (an api-key header instead of a Bearer token) and a different URL format than direct OpenAI.
Stockyard handles the Azure-specific auth and URL translation. Your application calls the standard /v1/chat/completions endpoint, and Stockyard routes to your Azure deployment with the correct headers. This also lets you fail over from Azure OpenAI to direct OpenAI or other providers seamlessly.
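Conceptually, that translation looks like this. The sketch below is illustrative, not Stockyard's internal code; the deployment name and api-version are placeholder values you would replace with your own.

```python
# Simplified sketch of the URL/header translation an Azure-aware proxy
# performs. Deployment name and api-version are illustrative placeholders.

def to_azure_request(path: str, key: str, base_url: str,
                     deployment: str, api_version: str = "2024-06-01"):
    """Map a standard OpenAI-style request to Azure OpenAI's format."""
    # Standard: POST {base}/v1/chat/completions, Authorization: Bearer <key>
    # Azure:    POST {base}/openai/deployments/{deployment}/chat/completions
    #           ?api-version=..., with an api-key header instead.
    azure_path = path.replace("/v1/", f"/openai/deployments/{deployment}/")
    url = f"{base_url}{azure_path}?api-version={api_version}"
    headers = {"api-key": key, "Content-Type": "application/json"}
    return url, headers

url, headers = to_azure_request(
    "/v1/chat/completions",
    key="your-key-here",
    base_url="https://your-deployment.openai.azure.com",
    deployment="gpt-4o",
)
```

Your application only ever sees the standard form on the left of that mapping; the proxy owns the right-hand side.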
# Install Stockyard
curl -fsSL stockyard.dev/install.sh | sh

# Set your Azure OpenAI credentials
export AZURE_OPENAI_API_KEY=your-key-here
export AZURE_OPENAI_BASE_URL=https://your-deployment.openai.azure.com

# Start the proxy
stockyard
# Provider: azure-openai (from AZURE_OPENAI_API_KEY)
# Proxy listening on :4200

# Send a request through the proxy
curl http://localhost:4200/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model":"gpt-4o","messages":[{"role":"user","content":"hello"}]}'
Set AZURE_OPENAI_API_KEY and AZURE_OPENAI_BASE_URL (your deployment endpoint). Stockyard handles the api-key header format automatically.
If you're on Azure OpenAI today but want the option to use direct OpenAI, Anthropic, or other providers later, Stockyard gives you that exit path. Your app talks to Stockyard, not Azure directly. Add providers over time without changing application code.
# Start with Azure OpenAI
export AZURE_OPENAI_API_KEY=...
export AZURE_OPENAI_BASE_URL=https://your-deployment.openai.azure.com

# Later, add direct OpenAI as failover
export OPENAI_API_KEY=sk-...
# Stockyard now routes to both. No app changes needed.
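The failover behavior amounts to trying providers in order and taking the first success. Here is a conceptual sketch of that pattern, not Stockyard's actual internals; the provider callables are stand-ins.

```python
# Conceptual sketch of ordered provider failover: try each backend in
# turn and return the first successful response. Provider functions here
# are placeholders simulating an Azure outage and a healthy fallback.

def complete_with_failover(request, providers):
    """providers: ordered list of (name, callable) pairs."""
    errors = []
    for name, call in providers:
        try:
            return call(request)            # first success wins
        except Exception as exc:            # provider down or erroring
            errors.append((name, exc))
    raise RuntimeError(f"all providers failed: {errors}")

def azure(request):
    raise ConnectionError("azure outage")   # simulated failure

def openai_direct(request):
    return {"provider": "openai", "ok": True}

result = complete_with_failover(
    {"model": "gpt-4o"},
    [("azure-openai", azure), ("openai", openai_direct)],
)
```

Because the ordering lives in the proxy's configuration, adding or reordering providers never touches application code.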
Route Azure OpenAI through Stockyard in under 60 seconds.