DOCU-2471: Updates proxy-caching get-started guide for 3.0 release

---
title: Proxy Caching
content-type: tutorial
book: get-started
chapter: 4
---

One of the ways Kong delivers performance is through caching.
The [Proxy Cache plugin](/hub/kong-inc/proxy-cache/) accelerates performance by caching
responses based on configurable response codes, content types, and request methods.
When caching is enabled, upstream services are not bogged down with repetitive requests,
because {{site.base_gateway}} responds on their behalf with cached results. Caching can be
enabled on specific {{site.base_gateway}} objects or for all requests globally.

### Cache Time To Live (TTL)

TTL governs the refresh rate of cached content, which is critical for ensuring
that clients aren't served outdated content. A TTL of 30 seconds means content older than
30 seconds is deemed expired and will be refreshed on subsequent requests.
TTL configurations should be set differently based on the type of content the upstream
service is serving:

* Static data that is rarely updated can have a longer TTL
* Dynamic data should use a shorter TTL to avoid serving outdated data

{{site.base_gateway}} follows [RFC 7234 section 5.2](https://tools.ietf.org/html/rfc7234)
for cache-controlled operations. See the specification and the Proxy Cache
plugin [parameter reference](/hub/kong-inc/proxy-cache/#parameters) for more details on TTL configurations.
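To make the expiry rule concrete, here is a minimal sketch (not plugin code; `cache_status` is a hypothetical helper) comparing an entry's age against `cache_ttl`:

```sh
# Hypothetical helper: an entry whose age (seconds since it was stored)
# has reached cache_ttl is expired and is refreshed on the next request.
cache_status() {
  age=$1
  cache_ttl=$2
  if [ "$age" -ge "$cache_ttl" ]; then
    echo expired
  else
    echo fresh
  fi
}

cache_status 42 30   # prints "expired": 42s exceeds the 30s TTL
cache_status 10 30   # prints "fresh": still within the TTL
```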
## Enable caching

The following tutorial walks through managing proxy caching across various aspects in {{site.base_gateway}}.

### Prerequisites

This chapter is part of the *Get Started with Kong* series. For the best experience, it is recommended that you follow the
series from the beginning.

Start with the introduction [Get Kong](/gateway/latest/get-started/get-kong), which includes
a list of prerequisites and instructions for running a local {{site.base_gateway}}.

Step two of the guide, [Services and Routes](/gateway/latest/get-started/services-and-routes),
includes instructions for installing a mock service used throughout this series.

If you haven't completed these steps already, complete them before proceeding.

### Global proxy caching

Installing the plugin globally means *every* proxy request to {{site.base_gateway}}
will potentially be cached.

1. **Enable proxy caching**

    The Proxy Cache plugin is installed by default on {{site.base_gateway}}, and can be enabled by
    sending a `POST` request to the plugins object on the Admin API:

    ```sh
    curl -i -X POST http://localhost:8001/plugins \
      --data "name=proxy-cache" \
      --data "config.request_method=GET" \
      --data "config.response_code=200" \
      --data "config.content_type=application/json; charset=utf-8" \
      --data "config.cache_ttl=30" \
      --data "config.strategy=memory"
    ```

    If the configuration was successful, you will receive a `201` response code.

    This Admin API request configured a Proxy Cache plugin for all `GET` requests that resulted
    in response codes of `200` and *response* `Content-Type` headers that *equal*
    `application/json; charset=utf-8`. `cache_ttl` instructed the plugin to flush values after 30 seconds.

    The final option, `config.strategy=memory`, specifies the backing data store for cached responses. More
    information on `strategy` can be found in the [parameter reference](/hub/kong-inc/proxy-cache/)
    for the Proxy Cache plugin.

1. **Validate**

    You can check that the Proxy Cache plugin is working by sending `GET` requests and examining
    the returned headers. In step two of this guide, [Services and Routes](/gateway/latest/get-started/services-and-routes),
    you set up a `/mock` route and service that can help you see proxy caching in action.

    First, make an initial request to the `/mock` route. The Proxy Cache plugin returns status
    information in headers prefixed with `X-Cache`, so use `grep` to filter for that information:

    ```sh
    curl -i -s -X GET http://localhost:8000/mock/requests | grep X-Cache
    ```

    On the initial request, there should be no cached responses, and the headers indicate this with
    `X-Cache-Status: Miss`:

    ```
    X-Cache-Key: c9e1d4c8e5fd8209a5969eb3b0e85bc6
    X-Cache-Status: Miss
    ```

    Within 30 seconds of the initial request, repeat the command to send an identical request, and the
    headers will indicate a cache `Hit`:

    ```
    X-Cache-Key: c9e1d4c8e5fd8209a5969eb3b0e85bc6
    X-Cache-Status: Hit
    ```

    The `X-Cache-Status` header can return the following cache results:

    |State|Description|
    |---|---|
    |Miss|The request could be satisfied from cache, but an entry for the resource was not found in cache, and the request was proxied upstream.|
    |Hit|The request was satisfied and served from cache.|
    |Refresh|The resource was found in cache, but could not satisfy the request, due to `Cache-Control` behaviors or reaching its `cache_ttl` threshold.|
    |Bypass|The request could not be satisfied from cache based on plugin configuration.|

### Service-level proxy caching

The Proxy Cache plugin can be enabled for specific services. The request is the same as above,
but it is sent to the service URL:

```sh
curl -X POST http://localhost:8001/services/example_service/plugins \
  --data "name=proxy-cache" \
  --data "config.request_method=GET" \
  --data "config.response_code=200" \
  --data "config.content_type=application/json; charset=utf-8" \
  --data "config.cache_ttl=30" \
  --data "config.strategy=memory"
```
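If you manage {{site.base_gateway}} declaratively, the same service-scoped plugin can be sketched in a declarative configuration (decK) file. This is a sketch only: the `example_service` entry and its upstream `url` are placeholders for your own setup, not output from this guide:

```yaml
_format_version: "3.0"
services:
  - name: example_service
    url: http://example.test   # placeholder upstream
    plugins:
      - name: proxy-cache
        config:
          request_method:
            - GET
          response_code:
            - 200
          content_type:
            - "application/json; charset=utf-8"
          cache_ttl: 30
          strategy: memory
```

Nesting the plugin under the service scopes it to that service, matching the Admin API request above.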

### Route-level proxy caching

The Proxy Cache plugin can be enabled for specific routes. The request is the same as above,
but it is sent to the route URL:

```sh
curl -X POST http://localhost:8001/routes/mock/plugins \
  --data "name=proxy-cache" \
  --data "config.request_method=GET" \
  --data "config.response_code=200" \
  --data "config.content_type=application/json; charset=utf-8" \
  --data "config.cache_ttl=30" \
  --data "config.strategy=memory"
```

### Consumer-level proxy caching

In {{site.base_gateway}}, [consumers](/gateway/latest/admin-api/#consumer-object) are an abstraction that defines a user of a service.
Consumer-level proxy caching can be used to cache responses per consumer.

1. **Create a consumer**

    Consumers are created using the consumer object in the Admin API:

    ```sh
    curl -X POST http://localhost:8001/consumers/ \
      --data username=sasha
    ```

1. **Enable caching for the consumer**

    ```sh
    curl -X POST http://localhost:8001/consumers/sasha/plugins \
      --data "name=proxy-cache" \
      --data "config.request_method=GET" \
      --data "config.response_code=200" \
      --data "config.content_type=application/json; charset=utf-8" \
      --data "config.cache_ttl=30" \
      --data "config.strategy=memory"
    ```

## Manage cached entities

The Proxy Cache plugin supports administrative endpoints to manage cached entities. Administrators can
view and delete cached entities, or purge the entire cache, by sending requests to the Admin API.

To retrieve a cached entity, submit a request to the Admin API `/proxy-cache` endpoint with the
`X-Cache-Key` value of a known cached response. This request must be submitted prior to the TTL expiration,
otherwise the cached entity will have been purged.

For example, using the response headers above, pass the `X-Cache-Key` value of
`c9e1d4c8e5fd8209a5969eb3b0e85bc6` to the Admin API:

```sh
curl -i http://localhost:8001/proxy-cache/c9e1d4c8e5fd8209a5969eb3b0e85bc6
```

A `200 OK` response will contain the full details of the cached entity.
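The Admin API can also evict cached entities. The following sketch assumes the `DELETE` endpoints documented for the Proxy Cache plugin; check the plugin's Admin API reference before relying on them, and note that both commands require a running {{site.base_gateway}}:

```sh
# Evict a single cached entity, identified by its X-Cache-Key
curl -i -X DELETE http://localhost:8001/proxy-cache/c9e1d4c8e5fd8209a5969eb3b0e85bc6

# Purge all cached entities
curl -i -X DELETE http://localhost:8001/proxy-cache
```

Successful deletions typically return `204 No Content`.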

See the [Proxy Cache plugin documentation](/hub/kong-inc/proxy-cache/#admin-api) for the full list of the
Proxy Cache-specific Admin API endpoints.

## Next steps

Next, you'll learn about the [Key Authentication plugin](/gateway/{{page.kong_version}}/get-started/comprehensive/secure-services).