
Connect to Anthropic Claude

Anthropic provides access to Claude models through an API. This guide shows how to route traffic to Anthropic through API7 Gateway using the ai-proxy plugin.

Prerequisites

  • Install Docker.

  • Install cURL to send requests to the gateway for validation.

  • Have a running API7 Enterprise Gateway instance.

  • Obtain the Admin API key. Save it to an environment variable:

    export ADMIN_API_KEY=your-admin-api-key   # replace with your API key
  • Obtain the ID of the service you want to configure. Save it to an environment variable:

    export SERVICE_ID=your-service-id         # replace with your service ID

Obtain an Anthropic API Key

Create an account and API key by following the Anthropic API documentation. Save the key to an environment variable:

export ANTHROPIC_API_KEY=sk-ant-xxxxxxxxxxxxxxxxxxxxxxxx   # replace with your API key

Configure the AI Proxy for Anthropic

Create a route with the ai-proxy plugin:

curl "http://127.0.0.1:7080/apisix/admin/routes?gateway_group_id=default" -X PUT \
  -H "X-API-KEY: $ADMIN_API_KEY" \
  -d '{
    "id": "anthropic-route",
    "service_id": "'"$SERVICE_ID"'",
    "paths": ["/anthropic"],
    "plugins": {
      "ai-proxy": {
        "provider": "anthropic",
        "auth": {
          "header": {
            "Authorization": "Bearer '"$ANTHROPIC_API_KEY"'"
          }
        },
        "options": {
          "model": "claude-sonnet-4-20250514"
        }
      }
    }
  }'

❶ Set the provider to anthropic.

❷ Attach the Anthropic API key using the Authorization header.

❸ Set the model to claude-sonnet-4-20250514. See the Anthropic models page for available models.
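For reference, the same route body can be assembled programmatically before sending it to the Admin API. This is a minimal sketch: the `build_anthropic_route` helper and its example values are illustrative, not part of API7; the field names mirror the curl example above.

```python
import json


def build_anthropic_route(service_id: str, api_key: str, model: str) -> str:
    """Assemble the ai-proxy route body shown above as a JSON string."""
    route = {
        "id": "anthropic-route",
        "service_id": service_id,
        "paths": ["/anthropic"],
        "plugins": {
            "ai-proxy": {
                "provider": "anthropic",
                # The API key travels in the Authorization header, as above.
                "auth": {"header": {"Authorization": f"Bearer {api_key}"}},
                "options": {"model": model},
            }
        },
    }
    return json.dumps(route)


body = build_anthropic_route("svc-123", "sk-ant-example", "claude-sonnet-4-20250514")
```

Building the body in code avoids the shell-quoting pitfalls of embedding environment variables inside a single-quoted JSON string.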

Validate the Configuration

Send a chat completion request:

curl "http://127.0.0.1:9080/anthropic" -X POST \
  -H "Content-Type: application/json" \
  -d '{
    "messages": [
      { "role": "user", "content": "What is the capital of France?" }
    ]
  }'

You should receive a response similar to the following:

{
  "id": "msg_01abc123",
  "object": "chat.completion",
  "model": "claude-sonnet-4-20250514",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "The capital of France is Paris."
      },
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 12,
    "completion_tokens": 8,
    "total_tokens": 20
  }
}
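Because the response follows the OpenAI chat completion shape, a client can read it with ordinary JSON handling. A minimal sketch extracting the assistant message and token usage from a response like the one above:

```python
import json

# Sample response in the normalized OpenAI-compatible format shown above.
sample = """{
  "id": "msg_01abc123",
  "object": "chat.completion",
  "model": "claude-sonnet-4-20250514",
  "choices": [
    {
      "index": 0,
      "message": {"role": "assistant", "content": "The capital of France is Paris."},
      "finish_reason": "stop"
    }
  ],
  "usage": {"prompt_tokens": 12, "completion_tokens": 8, "total_tokens": 20}
}"""

resp = json.loads(sample)
answer = resp["choices"][0]["message"]["content"]  # the assistant's reply
total_tokens = resp["usage"]["total_tokens"]       # billing-relevant usage
```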
Note

API7 Gateway normalizes all LLM responses to the OpenAI chat completion format, regardless of the backend provider. Clients always receive a consistent response structure. To use the native Anthropic protocol, see Protocol Conversion.

To enable streaming responses, set "stream": true in the request body. Use the proxy-buffering plugin to disable NGINX proxy_buffering so that server-sent events (SSE) are not buffered.
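When streaming is enabled, the gateway emits server-sent events. As a sketch, a client can accumulate the reply by parsing `data:` lines; the chunk shape below assumes the OpenAI-style streaming format (`choices[0].delta.content`) and the sample lines are illustrative:

```python
import json


def iter_sse_content(lines):
    """Yield content deltas from OpenAI-style SSE 'data:' lines."""
    for line in lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip comments, blank keep-alives, etc.
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":  # end-of-stream sentinel
            break
        chunk = json.loads(payload)
        delta = chunk["choices"][0]["delta"]
        if "content" in delta:
            yield delta["content"]


# Hypothetical stream fragments for illustration:
sample_lines = [
    'data: {"choices": [{"delta": {"content": "Paris"}}]}',
    'data: {"choices": [{"delta": {"content": " is the capital."}}]}',
    "data: [DONE]",
]
text = "".join(iter_sse_content(sample_lines))
```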

Next Steps

You have learned how to route traffic to Anthropic through API7 Gateway. See the Anthropic API documentation and Models pages for more details.
