Configure Proxy Cache

The proxy-cache plugin provides a way to cache upstream responses in the Gateway. This significantly improves response times for frequently requested resources and reduces the load on your backend services.

Prerequisites

  • An API7 Enterprise instance is running.
  • A Gateway Group is created and a Gateway instance is running.
  • An API token generated from the Dashboard (used as the X-API-KEY header in the examples below).
  • If you want to use a custom cache zone, configure it in your Gateway's Nginx configuration (for example, disk_cache_one or memory_cache).
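For reference, in open source Apache APISIX the cache zones are declared under proxy_cache in conf/config.yaml. The sketch below shows that shape; the zone names match the examples in this guide, but the sizes and paths are illustrative, and API7 Enterprise may expose this configuration differently:

```yaml
apisix:
  proxy_cache:
    cache_ttl: 10s             # default TTL when the upstream sends no caching headers
    zones:
      - name: disk_cache_one   # disk-backed zone, usable with the disk strategy
        memory_size: 50m       # shared memory for cache keys and metadata
        disk_size: 1G
        disk_path: /tmp/disk_cache_one
        cache_levels: 1:2
      - name: memory_cache     # memory-only zone, usable with the memory strategy
        memory_size: 50m
```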

Configure Disk-Based Proxy Caching

By default, the plugin uses disk-based caching, which is persistent across Gateway restarts.

curl -k "https://localhost:7443/apisix/admin/services/proxy-cache-service?gateway_group_id={group_id}" -X PUT \
  -H "X-API-KEY: ${API_KEY}" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "proxy-cache-service",
    "upstream": {
      "type": "roundrobin",
      "scheme": "http",
      "nodes": [
        {
          "host": "httpbin.org",
          "port": 80,
          "weight": 100
        }
      ]
    }
  }'

curl -k "https://localhost:7443/apisix/admin/routes/proxy-cache-route?gateway_group_id={group_id}" -X PUT \
  -H "X-API-KEY: ${API_KEY}" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "proxy-cache-route",
    "methods": ["GET"],
    "paths": ["/anything/cache"],
    "service_id": "proxy-cache-service",
    "plugins": {
      "proxy-cache": {
        "cache_strategy": "disk",
        "hide_cache_headers": false
      }
    }
  }'

Configure Memory-Based Proxy Caching

Memory-based caching provides faster access than disk-based caching but is not persistent across restarts.

curl -k "https://localhost:7443/apisix/admin/services/memory-proxy-cache-service?gateway_group_id={group_id}" -X PUT \
  -H "X-API-KEY: ${API_KEY}" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "memory-proxy-cache-service",
    "upstream": {
      "type": "roundrobin",
      "scheme": "http",
      "nodes": [
        {
          "host": "httpbin.org",
          "port": 80,
          "weight": 100
        }
      ]
    }
  }'

curl -k "https://localhost:7443/apisix/admin/routes/memory-proxy-cache-route?gateway_group_id={group_id}" -X PUT \
  -H "X-API-KEY: ${API_KEY}" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "memory-proxy-cache-route",
    "methods": ["GET"],
    "paths": ["/anything/cache-memory"],
    "service_id": "memory-proxy-cache-service",
    "plugins": {
      "proxy-cache": {
        "cache_strategy": "memory",
        "cache_zone": "memory_cache",
        "hide_cache_headers": false,
        "cache_ttl": 60
      }
    }
  }'

Validate the Configuration

Send multiple requests to the route:

curl -i "http://127.0.0.1:9080/anything/cache"
curl -i "http://127.0.0.1:9080/anything/cache"

Check the response headers. If hide_cache_headers is false, you will see an Apisix-Cache-Status header:

  • MISS: The response was not found in the cache and was fetched from the upstream.
  • HIT: The response was served from the cache.
  • EXPIRED: The cached response exists but has expired; a new one was fetched.

For the memory-based example, request http://127.0.0.1:9080/anything/cache-memory twice. The first response should include Apisix-Cache-Status: MISS and the second should include Apisix-Cache-Status: HIT.
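The two-request check above can be scripted. This is a sketch: the cache_status helper is our own, and it assumes the gateway listens on 127.0.0.1:9080 as in the examples above.

```shell
#!/bin/sh
# cache_status: pull the Apisix-Cache-Status value out of a raw HTTP
# response (a helper defined here for illustration; not part of API7).
cache_status() {
  printf '%s\n' "$1" | tr -d '\r' |
    awk -F': ' 'tolower($1) == "apisix-cache-status" { print $2 }'
}

# Hit the cached route twice and report both statuses; with the disk
# example applied, the first should be MISS and the second HIT.
first=$(cache_status "$(curl -si "http://127.0.0.1:9080/anything/cache")")
second=$(cache_status "$(curl -si "http://127.0.0.1:9080/anything/cache")")
echo "first=${first:-none} second=${second:-none}"
```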

If the route returns 404 immediately after you apply the configuration, wait a few seconds for the latest configuration to reach the gateway and retry.
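That wait-and-retry step can also be automated. The helper below is our own sketch (the function name, retry count, and probe URL are assumptions, not API7 utilities):

```shell
#!/bin/sh
# wait_for_route: poll a status-check command until it stops returning
# 404, up to a fixed number of attempts, pausing one second between tries.
wait_for_route() {
  check=$1
  attempts=$2
  i=0
  while [ "$i" -lt "$attempts" ]; do
    code=$($check)
    if [ "$code" != "404" ]; then
      echo "ready after $i retries"
      return 0
    fi
    i=$((i + 1))
    sleep 1
  done
  echo "gave up after $attempts attempts"
  return 1
}

# Against a live gateway, the checker would be a curl status probe, e.g.:
#   wait_for_route "curl -s -o /dev/null -w %{http_code} http://127.0.0.1:9080/anything/cache" 10
```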
