Configure Proxy Cache
The proxy-cache plugin provides a way to cache upstream responses in the Gateway. This significantly improves response times for frequently requested resources and reduces the load on your backend services.
Prerequisites
- An API7 Enterprise instance is running.
- A Gateway Group is created and a Gateway instance is running.
- A token from the Dashboard.
- If you want to use a custom cache zone, configure it in your Gateway's Nginx configuration (for example, disk_cache_one or memory_cache).
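In APISIX-based gateways, cache zones are typically declared in the gateway's config.yaml. The snippet below is a hedged sketch of what such a declaration can look like; the zone names, sizes, and paths are illustrative and may differ in your deployment:

```yaml
apisix:
  proxy_cache:
    cache_ttl: 10s             # fallback TTL when the plugin does not set one
    zones:
      - name: disk_cache_one   # disk-backed zone, referenced via cache_zone
        memory_size: 50m       # shared memory for cache keys and metadata
        disk_size: 1G
        disk_path: /tmp/disk_cache_one
        cache_levels: "1:2"
      - name: memory_cache     # memory-only zone for cache_strategy: memory
        memory_size: 50m
```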
Configure Disk-Based Proxy Caching
By default, the plugin uses disk-based caching, which is persistent across Gateway restarts.
- Admin API
- ADC
curl -k "https://localhost:7443/apisix/admin/services/proxy-cache-service?gateway_group_id={group_id}" -X PUT \
-H "X-API-KEY: ${API_KEY}" \
-H "Content-Type: application/json" \
-d '{
"name": "proxy-cache-service",
"upstream": {
"type": "roundrobin",
"scheme": "http",
"nodes": [
{
"host": "httpbin.org",
"port": 80,
"weight": 100
}
]
}
}'
curl -k "https://localhost:7443/apisix/admin/routes/proxy-cache-route?gateway_group_id={group_id}" -X PUT \
-H "X-API-KEY: ${API_KEY}" \
-H "Content-Type: application/json" \
-d '{
"name": "proxy-cache-route",
"methods": ["GET"],
"paths": ["/anything/cache"],
"service_id": "proxy-cache-service",
"plugins": {
"proxy-cache": {
"cache_strategy": "disk",
"hide_cache_headers": false
}
}
}'
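The plugin derives the cache key from request variables (configurable via the cache_key parameter, typically the host and request URI). As a rough illustration of the general mechanism only, not the plugin's exact implementation, a lookup key can be modeled as a hash of the concatenated variables:

```python
import hashlib

def cache_key(variables):
    """Build a cache lookup key by concatenating request variables
    (e.g. the host and request URI) and hashing the result."""
    raw = "".join(variables)
    return hashlib.md5(raw.encode()).hexdigest()

# Two requests for the same host and path map to the same key ...
k1 = cache_key(["httpbin.org", "/anything/cache"])
k2 = cache_key(["httpbin.org", "/anything/cache"])
# ... while a different path yields a different key.
k3 = cache_key(["httpbin.org", "/anything/other"])
```

Because the key is built from these variables, responses for distinct paths are cached independently, while repeated requests for the same path hit the same cache entry.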
services:
- name: proxy-cache-service
upstream:
nodes:
- host: httpbin.org
port: 80
        weight: 100
routes:
- name: proxy-cache-route
uris:
- /anything/cache
methods:
- GET
plugins:
proxy-cache:
cache_strategy: disk
hide_cache_headers: false
adc sync -f adc.yaml
Configure Memory-Based Proxy Caching
Memory-based caching provides faster access than disk-based caching but is not persistent across restarts.
- Admin API
- ADC
curl -k "https://localhost:7443/apisix/admin/services/memory-proxy-cache-service?gateway_group_id={group_id}" -X PUT \
-H "X-API-KEY: ${API_KEY}" \
-H "Content-Type: application/json" \
-d '{
"name": "memory-proxy-cache-service",
"upstream": {
"type": "roundrobin",
"scheme": "http",
"nodes": [
{
"host": "httpbin.org",
"port": 80,
"weight": 100
}
]
}
}'
curl -k "https://localhost:7443/apisix/admin/routes/memory-proxy-cache-route?gateway_group_id={group_id}" -X PUT \
-H "X-API-KEY: ${API_KEY}" \
-H "Content-Type: application/json" \
-d '{
"name": "memory-proxy-cache-route",
"methods": ["GET"],
"paths": ["/anything/cache-memory"],
"service_id": "memory-proxy-cache-service",
"plugins": {
"proxy-cache": {
"cache_strategy": "memory",
"cache_zone": "memory_cache",
"hide_cache_headers": false,
"cache_ttl": 60
}
}
}'
services:
- name: memory-proxy-cache-service
upstream:
nodes:
- host: httpbin.org
port: 80
        weight: 100
routes:
- name: memory-proxy-cache-route
uris:
- /anything/cache-memory
methods:
- GET
plugins:
proxy-cache:
cache_strategy: memory
cache_zone: memory_cache
hide_cache_headers: false
cache_ttl: 60
adc sync -f adc.yaml
Validate the Configuration
Send multiple requests to the route:
curl -i "http://127.0.0.1:9080/anything/cache"
curl -i "http://127.0.0.1:9080/anything/cache"
Check the response headers. If hide_cache_headers is false, you will see an Apisix-Cache-Status header:
- MISS: The response was not found in the cache and was fetched from the upstream.
- HIT: The response was served from the cache.
- EXPIRED: The cached response exists but has expired; a new response was fetched from the upstream.
For the memory-based example, request http://127.0.0.1:9080/anything/cache-memory twice. The first response should include Apisix-Cache-Status: MISS and the second should include Apisix-Cache-Status: HIT.
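The MISS, HIT, and EXPIRED transitions above follow a TTL-based lifecycle. The sketch below is a simplified model of that behavior (not the plugin's implementation), using the 60-second TTL from the memory-based example:

```python
class TtlCache:
    """Minimal TTL cache modeling Apisix-Cache-Status transitions."""
    def __init__(self, ttl):
        self.ttl = ttl
        self.store = {}  # key -> (value, stored_at)

    def fetch(self, key, now, upstream):
        entry = self.store.get(key)
        if entry is None:
            status = "MISS"        # not cached yet: fetch from upstream
        elif now - entry[1] >= self.ttl:
            status = "EXPIRED"     # cached but stale: refresh from upstream
        else:
            return entry[0], "HIT" # fresh cached copy, no upstream call
        value = upstream()
        self.store[key] = (value, now)
        return value, status

cache = TtlCache(ttl=60)
_, s1 = cache.fetch("/anything/cache-memory", now=0, upstream=lambda: "body")
_, s2 = cache.fetch("/anything/cache-memory", now=30, upstream=lambda: "body")
_, s3 = cache.fetch("/anything/cache-memory", now=90, upstream=lambda: "body")
# s1, s2, s3 == "MISS", "HIT", "EXPIRED"
```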
If the route returns 404 immediately after you apply the configuration, wait a few seconds for the latest configuration to reach the gateway and retry.
Next Steps
- Configure Rate Limiting — protect your system from traffic spikes.
- Configure Proxy Mirror — test your application with real production traffic.
- Configure Response Rewrite — modify response data before returning to the client.