Access Hundreds of LLMs via OpenRouter
OpenRouter provides a unified OpenAI-compatible API that gives access to 200+ models from multiple providers. This guide shows how to route traffic to OpenRouter through API7 Gateway using the ai-proxy plugin.
Prerequisites
- Install Docker.
- Install cURL to send requests to the services for validation.
- Have a running API7 Gateway instance.
- Create a token from the Dashboard and save it to an environment variable:

  export API_KEY=your-dashboard-token   # replace with your Dashboard token

- Replace `{gateway_group_id}` with your gateway group ID. Use `default` if you are following the quickstart.
- If you are following the Admin API examples, create or reuse a service in API7 Gateway. If you do not have one yet, follow Create or Reuse a Service, then save its ID to an environment variable:

  export SERVICE_ID=your-service-id   # replace with your service ID
Obtain an OpenRouter API Key
Create an account at openrouter.ai and generate an API key. Save the key to an environment variable:
export OPENROUTER_API_KEY=sk-or-xxxxxxxxxxxxxxxxxxxxxxxx # replace with your OpenRouter API key
Configure the AI Proxy for OpenRouter
Create a route with the ai-proxy plugin:
- Admin API
- ADC
curl -k "https://localhost:7443/apisix/admin/routes?gateway_group_id={gateway_group_id}" -X PUT \
-H "X-API-KEY: ${API_KEY}" \
-d '{
"id": "openrouter-route",
"service_id": "'"$SERVICE_ID"'",
"paths": ["/openrouter"],
"plugins": {
"ai-proxy": {
"provider": "openrouter",
"auth": {
"header": {
"Authorization": "Bearer '"$OPENROUTER_API_KEY"'"
}
},
"options": {
"model": "anthropic/claude-sonnet-4-20250514"
}
}
}
}'
❶ Set the provider to openrouter.
❷ Attach the OpenRouter API key in the Authorization header.
❸ Set the model using the provider/model format. Browse available models at openrouter.ai/models.
services:
- name: OpenRouter Service
routes:
- uris:
- /openrouter
name: openrouter-route
plugins:
ai-proxy:
provider: openrouter
auth:
header:
Authorization: "Bearer ${OPENROUTER_API_KEY}"
options:
model: anthropic/claude-sonnet-4-20250514
❶ Set the provider to openrouter.
❷ Attach the OpenRouter API key in the Authorization header.
❸ Set the model using the provider/model format. Browse available models at openrouter.ai/models.
Synchronize the configuration to API7 Gateway:
adc sync -f adc.yaml
Multi-Model Routing via OpenRouter
Use ai-proxy-multi to load-balance across different models available through OpenRouter:
- Admin API
- ADC
curl -k "https://localhost:7443/apisix/admin/routes?gateway_group_id={gateway_group_id}" -X PUT \
-H "X-API-KEY: ${API_KEY}" \
-d '{
"id": "openrouter-multi-route",
"service_id": "'"$SERVICE_ID"'",
"paths": ["/openrouter"],
"plugins": {
"ai-proxy-multi": {
"instances": [
{
"name": "budget-model",
"provider": "openrouter",
"auth": { "header": { "Authorization": "Bearer '"$OPENROUTER_API_KEY"'" } },
"options": { "model": "google/gemini-2.0-flash" },
"weight": 8
},
{
"name": "premium-model",
"provider": "openrouter",
"auth": { "header": { "Authorization": "Bearer '"$OPENROUTER_API_KEY"'" } },
"options": { "model": "anthropic/claude-sonnet-4-20250514" },
"weight": 2
}
]
}
}
}'
❶ Route 80% of traffic to a budget-friendly model.
❷ Route 20% to a premium model for higher-quality responses.
services:
- name: OpenRouter Multi-Model
routes:
- uris:
- /openrouter
name: openrouter-multi-route
plugins:
ai-proxy-multi:
instances:
- name: budget-model
provider: openrouter
auth:
header:
Authorization: "Bearer ${OPENROUTER_API_KEY}"
options:
model: google/gemini-2.0-flash
weight: 8
- name: premium-model
provider: openrouter
auth:
header:
Authorization: "Bearer ${OPENROUTER_API_KEY}"
options:
model: anthropic/claude-sonnet-4-20250514
weight: 2
❶ Route 80% of traffic to a budget-friendly model.
❷ Route 20% to a premium model for higher-quality responses.
Synchronize the configuration to API7 Gateway:
adc sync -f adc.yaml
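The 80/20 split above falls out of the instance weights: each instance is chosen with probability proportional to its weight, so weights of 8 and 2 yield roughly 80% and 20% of traffic. The sketch below illustrates this with weighted random selection; it is a simplified model for intuition, not the gateway's actual load-balancing algorithm.

```python
import random

# Instance weights from the example above: 8 for the budget model, 2 for the premium one.
instances = [
    ("google/gemini-2.0-flash", 8),
    ("anthropic/claude-sonnet-4-20250514", 2),
]

def pick_instance(instances):
    """Select a model with probability proportional to its weight."""
    models, weights = zip(*instances)
    return random.choices(models, weights=weights, k=1)[0]

# Over many requests, about 80% land on the budget model (8 / (8 + 2)).
counts = {name: 0 for name, _ in instances}
for _ in range(10_000):
    counts[pick_instance(instances)] += 1
```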
For more routing strategies, see Multi-LLM Routing and Fallback.
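Beyond weighted splitting, instances can also be ordered by priority so a secondary model only receives traffic when the primary is unavailable. The fragment below is a sketch of that pattern; the `priority` and `fallback_strategy` fields follow the open-source APISIX ai-proxy-multi schema and may differ in your API7 Gateway version, so check the plugin reference before using it.

```yaml
plugins:
  ai-proxy-multi:
    fallback_strategy: instance_health_and_rate_limiting
    instances:
      - name: primary
        provider: openrouter
        priority: 1          # higher priority: tried first
        auth:
          header:
            Authorization: "Bearer ${OPENROUTER_API_KEY}"
        options:
          model: anthropic/claude-sonnet-4-20250514
      - name: fallback
        provider: openrouter
        priority: 0          # used when the primary is unhealthy
        auth:
          header:
            Authorization: "Bearer ${OPENROUTER_API_KEY}"
        options:
          model: google/gemini-2.0-flash
```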
Validate the Configuration
Send a chat completion request:
curl "http://127.0.0.1:9080/openrouter" -X POST \
-H "Content-Type: application/json" \
-d '{
"messages": [
{ "role": "user", "content": "What is an API gateway?" }
]
}'
You should receive a response similar to the following:
{
"id": "chatcmpl-abc123",
"object": "chat.completion",
"model": "anthropic/claude-sonnet-4-20250514",
"choices": [
{
"index": 0,
"message": {
"role": "assistant",
"content": "An API gateway is a server that acts as a single entry point for a set of microservices or APIs, handling request routing, composition, and protocol translation."
},
"finish_reason": "stop"
}
],
"usage": {
"prompt_tokens": 10,
"completion_tokens": 32,
"total_tokens": 42
}
}
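Because the response follows the OpenAI chat-completion schema, any standard JSON tooling or OpenAI-compatible client library can consume it unchanged. A minimal sketch of extracting the fields a client typically needs, using the sample response above (with the answer text abbreviated):

```python
import json

# Sample chat-completion response in the OpenAI-compatible format
# returned through the gateway (content shortened for brevity).
payload = json.loads("""{
  "id": "chatcmpl-abc123",
  "object": "chat.completion",
  "model": "anthropic/claude-sonnet-4-20250514",
  "choices": [{
    "index": 0,
    "message": {"role": "assistant", "content": "An API gateway is a server that acts as a single entry point..."},
    "finish_reason": "stop"
  }],
  "usage": {"prompt_tokens": 10, "completion_tokens": 32, "total_tokens": 42}
}""")

answer = payload["choices"][0]["message"]["content"]
model = payload["model"]                    # the OpenRouter model that served the request
total_tokens = payload["usage"]["total_tokens"]  # useful for cost tracking
```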
To enable streaming responses, set `"stream": true` in the request body. Use the proxy-buffering plugin to disable NGINX `proxy_buffering`, which would otherwise buffer server-sent events (SSE) instead of delivering them as they arrive.
Next Steps
You have learned how to route traffic to OpenRouter through API7 Gateway. See the OpenRouter documentation to learn more about available models and pricing.
- Multi-LLM Routing and Fallback — Combine OpenRouter with direct provider connections for failover.
- Token Rate Limiting — Control spending across models.