Service caching for Authorize gateway instances
Caching improves decision evaluation performance and reduces latency by storing data retrieved from services so that it can be reused on subsequent requests.
You can configure gateway instances with a Redis cache. Single instance and cluster (including AWS ElastiCache) Redis modes are supported.
Redis Sentinel isn’t supported.
Service caching uses a two-level cache to improve performance and minimize redundancy. A local in-memory cache acts as the level-one cache and the Redis instance acts as the level-two cache. When making a call to a service, the gateway instance tries to load the level-one cache entry. If the entry isn’t found, the gateway instance calls the level-two cache. If no entry is found on either level, the gateway instance calls the service itself.
Cached values are per gateway instance and aren’t shared between instances, but you can point multiple gateway instances at the same Redis instance so that they share the same level-two cache. You cannot clear the in-memory cache manually.
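For illustration only, the lookup order described above resembles the following sketch. It is not the gateway’s internal implementation; the function names and TTL are hypothetical, and it assumes the redis-py client.

```python
# Illustrative sketch of the two-level lookup described above.
# Not the gateway's implementation; names and the TTL are hypothetical.
import redis

local_cache = {}  # level-one cache: in-memory, per gateway instance
redis_client = redis.Redis.from_url("redis://localhost:6379/0")  # level-two cache

def lookup(key, call_service, ttl_seconds=300):
    # 1. Try the level-one (in-memory) cache.
    if key in local_cache:
        return local_cache[key]
    # 2. Try the level-two (Redis) cache.
    value = redis_client.get(key)
    if value is None:
        # 3. Fall back to calling the service, then populate the level-two cache.
        value = call_service(key)
        redis_client.setex(key, ttl_seconds, value)
    # Populate the level-one cache for subsequent requests on this instance.
    local_cache[key] = value
    return value
```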
Logging is provided for Redis cache connection status and health.
Service caching configuration
Using one of the available configuration methods, pass the `caching` JSON object into your gateway instance.
Example `caching` object:
```json
{
  "caching": {
    "external": {
      "redis": {
        "uri": "redis://localhost:6379/0",
        "mode": "SINGLE_INSTANCE",
        "auth": {
          "username": "admin",
          "password": "password"
        }
      }
    }
  }
}
```
Configuration properties for the `caching` object:
Property | Description
---|---
`uri` | Required. Specifies the Redis connection details as a URI, for example `redis://localhost:6379/0`. For AWS ElastiCache, use the endpoint that ElastiCache provides.
`mode` | Required. Specifies the Redis mode. The example above uses `SINGLE_INSTANCE`; cluster mode (including AWS ElastiCache) is also supported.
`auth` | Specifies the `username` and `password` used to connect. Required only for Redis deployments that require authentication.
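Before passing the configuration to a gateway instance, you can sanity-check that the Redis instance referenced by the `caching` object is reachable. The sketch below is illustrative, assumes the redis-py client, and reuses the example values above; it is not part of the gateway itself.

```python
# Illustrative connectivity check, assuming the redis-py client.
# The configuration values match the example caching object above.
import json
import redis

caching_json = """
{
  "caching": {
    "external": {
      "redis": {
        "uri": "redis://localhost:6379/0",
        "mode": "SINGLE_INSTANCE",
        "auth": {"username": "admin", "password": "password"}
      }
    }
  }
}
"""

settings = json.loads(caching_json)["caching"]["external"]["redis"]
client = redis.Redis.from_url(
    settings["uri"],
    username=settings["auth"]["username"],
    password=settings["auth"]["password"],
)
client.ping()  # raises redis.exceptions.ConnectionError if Redis is unreachable
print("Redis connection OK:", settings["uri"])
```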