Proxy OpenRouter Requests
OpenRouter provides a unified, OpenAI-compatible API that aggregates models from multiple providers (OpenAI, Anthropic, Google, Mistral, and more).
This guide shows how to integrate APISIX with OpenRouter using the ai-proxy plugin. With provider set to openrouter, you do not need to set a custom endpoint.
Prerequisite(s)
- Install Docker.
- Install cURL to send requests to the services for validation.
- Follow the Getting Started Tutorial to start a new APISIX instance in Docker or on Kubernetes.
Obtain an OpenRouter API Key
Create an account and API key by following the OpenRouter Quickstart. Optionally save the key to an environment variable:
export OPENROUTER_API_KEY=sk-or-v1-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx # replace with your API key
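Optionally, before configuring APISIX, you can confirm the key is valid by calling OpenRouter's OpenAI-compatible API directly. This is only a sanity check; an invalid key returns HTTP 401:

```shell
# Call OpenRouter directly (bypassing APISIX) to verify the key works.
# A valid key returns a chat completion; an invalid one returns HTTP 401.
curl "https://openrouter.ai/api/v1/chat/completions" \
  -H "Authorization: Bearer $OPENROUTER_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "deepseek/deepseek-chat",
    "messages": [{ "role": "user", "content": "ping" }]
  }'
```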
Create a Route to OpenRouter
Create a route with the ai-proxy plugin as follows:
- Admin API
- ADC
- Ingress Controller
curl "http://127.0.0.1:9180/apisix/admin/routes" -X PUT \
  -H "X-API-KEY: ${ADMIN_API_KEY}" \
  -d '{
    "id": "openrouter-chat",
    "uri": "/anything",
    "plugins": {
      "ai-proxy": {
        "provider": "openrouter",
        "auth": {
          "header": {
            "Authorization": "Bearer '"$OPENROUTER_API_KEY"'"
          }
        },
        "options": {
          "model": "deepseek/deepseek-chat"
        }
      }
    }
  }'
❶ Set the provider to openrouter.
❷ Attach the OpenRouter API key using the Authorization header.
❸ Set a model supported by OpenRouter, for example deepseek/deepseek-chat.
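To confirm the route was stored with the expected plugin configuration, you can fetch it back from the Admin API. This sketch assumes ADMIN_API_KEY holds your Admin API key; adjust for your deployment:

```shell
# Fetch the stored route by ID and confirm the ai-proxy plugin
# configuration was accepted as submitted
curl "http://127.0.0.1:9180/apisix/admin/routes/openrouter-chat" \
  -H "X-API-KEY: ${ADMIN_API_KEY}"
```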
services:
  - name: OpenRouter Service
    routes:
      - uris:
          - /anything
        name: openrouter-chat
        plugins:
          ai-proxy:
            provider: openrouter
            auth:
              header:
                Authorization: "Bearer sk-or-v1-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
            options:
              model: deepseek/deepseek-chat
❶ Set the provider to openrouter.
❷ Attach the OpenRouter API key using the Authorization header.
❸ Set a model supported by OpenRouter, for example deepseek/deepseek-chat.
Synchronize the configuration to APISIX:
adc sync -f adc.yaml
Create a Kubernetes manifest file to configure a route:
- Gateway API
- APISIX CRD
apiVersion: apisix.apache.org/v1alpha1
kind: PluginConfig
metadata:
  namespace: ingress-apisix
  name: ai-proxy-plugin-config
spec:
  plugins:
    - name: ai-proxy
      config:
        provider: openrouter
        auth:
          header:
            Authorization: "Bearer sk-or-v1-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
        options:
          model: deepseek/deepseek-chat
---
apiVersion: gateway.networking.k8s.io/v1
kind: HTTPRoute
metadata:
  namespace: ingress-apisix
  name: openrouter-chat
spec:
  parentRefs:
    - name: apisix
  rules:
    - matches:
        - path:
            type: Exact
            value: /anything
      filters:
        - type: ExtensionRef
          extensionRef:
            group: apisix.apache.org
            kind: PluginConfig
            name: ai-proxy-plugin-config
apiVersion: apisix.apache.org/v2
kind: ApisixRoute
metadata:
  namespace: ingress-apisix
  name: openrouter-route
spec:
  ingressClassName: apisix
  http:
    - name: openrouter-route
      match:
        paths:
          - /anything
      plugins:
        - name: ai-proxy
          enable: true
          config:
            provider: openrouter
            auth:
              header:
                Authorization: "Bearer sk-or-v1-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
            options:
              model: deepseek/deepseek-chat
❶ Set the provider to openrouter.
❷ Attach the OpenRouter API key using the Authorization header.
❸ Set a model supported by OpenRouter, for example deepseek/deepseek-chat.
Apply the configuration to your cluster:
kubectl apply -f openrouter-route.yaml
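Depending on which manifest you applied, you can check that the resources were accepted by the controller (names assume the defaults used above):

```shell
# Gateway API tab: check the HTTPRoute status
kubectl get httproute openrouter-chat -n ingress-apisix

# APISIX CRD tab: check the ApisixRoute status
kubectl get apisixroute openrouter-route -n ingress-apisix
```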
Verify
Send a POST request to the route with a system prompt and a sample user question:
curl "http://127.0.0.1:9080/anything" -X POST \
  -H "Content-Type: application/json" \
  -d '{
    "messages": [
      {
        "role": "system",
        "content": "You are a computer scientist."
      },
      {
        "role": "user",
        "content": "Explain in one sentence what a Turing machine is."
      }
    ]
  }'
You should receive a response similar to the following:
{
  "id": "gen-1770023173-XYUZ4kUwUAWHwDMPLN20",
  "provider": "Novita",
  "model": "deepseek/deepseek-chat",
  ...
  "choices": [
    {
      "logprobs": null,
      "finish_reason": "stop",
      "native_finish_reason": "stop",
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "A Turing machine is a theoretical computational model that manipulates symbols on an infinite tape according to a set of rules, simulating any algorithm's logic and serving as the foundation for modern computability theory.",
        "refusal": null,
        "reasoning": null
      }
    }
  ],
  ...
}
Next Steps
You have learned how to integrate APISIX with OpenRouter. See the OpenRouter Quickstart and Models pages for more details.
If you would like to stream responses, enable streaming in your request body and add the proxy-buffering plugin to the route to disable NGINX proxy_buffering, so that server-sent events (SSE) are not buffered before they reach the client.
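As a sketch, a streaming request only needs "stream": true added to the request body; assuming the proxy-buffering plugin has been enabled on the route, the SSE chunks arrive incrementally:

```shell
# Request a streamed completion through the route. With NGINX buffering
# disabled via the proxy-buffering plugin, chunks arrive as generated.
curl "http://127.0.0.1:9080/anything" -X POST \
  -H "Content-Type: application/json" \
  -d '{
    "stream": true,
    "messages": [
      { "role": "user", "content": "Explain in one sentence what a Turing machine is." }
    ]
  }'
```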