Introducing Stale Cache Delivery For More Efficient CDN Caching

Content delivery networks have evolved to do many powerful things, but when it comes to delivering massive amounts of data with great performance, one thing comes to mind: caching. Our CDN already provides one of the most powerful caching systems available, and today we're excited to announce we've made it even better by introducing stale cache delivery.

What is stale cache?

So what is stale cache? When the CDN stores a file on the edge, the file gets a time to live (TTL) value based on various configurations such as Cache-Control headers, Edge Rules, or override settings. This value determines how long the file is considered fresh. During this time, the edge server always serves the file directly to the end user, without ever having to go back to the origin.
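As a rough illustration of how a TTL might be picked from a Cache-Control header, here is a minimal sketch. The function name, the precedence order, and the default are assumptions for illustration; a real CDN also weighs Edge Rules and per-zone override settings.

```python
import re

def ttl_from_cache_control(cache_control: str, default_ttl: int = 3600) -> int:
    """Pick a TTL in seconds from a Cache-Control header value.

    Illustrative only: a real edge also considers Edge Rules and
    override settings, which are out of scope here.
    """
    # s-maxage is intended for shared caches like a CDN, so it is
    # checked before max-age.
    match = re.search(r"s-maxage=(\d+)", cache_control)
    if not match:
        match = re.search(r"max-age=(\d+)", cache_control)
    return int(match.group(1)) if match else default_ttl
```

For example, a response with `Cache-Control: public, max-age=600` would be kept fresh for 600 seconds, while a header with no usable directive falls back to the default.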

After this value expires, the file is considered stale. Until now, when the next request came in, the edge would go back to the origin, check whether the file was still valid, and only then return it to the end user. This ensures that the TTL is strictly and correctly applied, and forces content to be updated when necessary.
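A revalidation check of this kind is typically a conditional request. The sketch below shows the idea using an ETag; the `cached` shape and the `origin_fetch` helper are hypothetical names for illustration, not the CDN's actual internals.

```python
def revalidate(cached, origin_fetch):
    """Ask the origin whether a cached file is still valid.

    `cached` is a dict with 'etag' and 'body'. `origin_fetch` performs a
    conditional GET with the given headers and returns
    (status, response_headers, body). Both are illustrative stand-ins.
    """
    headers = {"If-None-Match": cached["etag"]} if cached.get("etag") else {}
    status, resp_headers, body = origin_fetch(headers)
    if status == 304:
        # 304 Not Modified: the cached copy is still good, keep serving it.
        return cached
    # Otherwise the origin sent fresh content; replace the cache entry.
    return {"etag": resp_headers.get("ETag"), "body": body}
```

When the file has not changed, the origin answers with a small 304 response instead of the full body, which is exactly the round trip that stale cache delivery moves out of the request path.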

However, in many cases, static files don't change very often. Equally, delivering a perfectly up-to-date file on every request is not always a critical requirement. In these cases, revalidating stale cache adds unnecessary load, latency, and traffic to the origin server.

Today, we're excited to announce we have a better solution. With a few simple clicks, you are now able to enable stale cache delivery.

Serve from cache, update in the background

The new system allows the edge to keep serving already cached files at all times and moves the revalidation and update logic into the background. With this, we are able to offer significant performance improvements when serving freshly expired content, especially in critical environments with very low TTLs and high load.
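The serve-stale-then-refresh pattern can be sketched in a few lines. This is an assumed, simplified design (a single entry, a background thread), not the CDN's actual implementation:

```python
import threading
import time

class StaleWhileRevalidateCache:
    """Minimal sketch: serve the cached value immediately, even when
    stale, and refresh it from the origin in the background."""

    def __init__(self, fetch, ttl):
        self.fetch = fetch              # callable that hits the origin
        self.ttl = ttl                  # freshness window in seconds
        self.lock = threading.Lock()
        self.value = None
        self.expires_at = 0.0
        self.refreshing = False

    def get(self):
        if self.value is None:
            # Cold cache: the very first request must block on the origin.
            self.value = self.fetch()
            self.expires_at = time.monotonic() + self.ttl
            return self.value
        if time.monotonic() >= self.expires_at:
            with self.lock:
                if not self.refreshing:
                    # Kick off exactly one background refresh.
                    self.refreshing = True
                    threading.Thread(target=self._refresh, daemon=True).start()
        # Serve immediately, even if the entry just went stale.
        return self.value

    def _refresh(self):
        try:
            value = self.fetch()
            with self.lock:
                self.value = value
                self.expires_at = time.monotonic() + self.ttl
        finally:
            with self.lock:
                self.refreshing = False
```

The key property is that once the cache is warm, `get()` never waits on the origin: requests that arrive after expiry still return the stale copy instantly while one background refresh brings it up to date.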

Stale means: Loading Files Up To 90%+ Faster

The biggest and most obvious benefit is a huge increase in performance. Our network operates in 70+ datacenters around the world, bringing your content within just a few milliseconds of the majority of users. Your origin, on the other hand, might be far away. By delivering stale content, we can cut the extra revalidation latency by up to 90% and reduce errors and failed requests. We make sure your content gets delivered quickly and successfully, while we elegantly update your files to the latest version in the background.

Stale means: Reduced Origin Traffic

Another big benefit, most apparent in situations with high requests per second, is reduced traffic to the origin. While the cache is expired, multiple concurrent requests for the same file might all end up reaching your origin. This creates unnecessary load and extra latency when a single request would have been enough. With the new feature, expired entries keep being served from cache while a single background request refreshes them, and we see large potential benefits for highly dynamic cached content.
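Collapsing many concurrent requests into one origin fetch is often called request coalescing or single-flight. Here is a hedged sketch of the idea; the class and method names are ours, and error handling is deliberately minimal:

```python
import threading

class SingleFlight:
    """Collapse concurrent fetches for the same key into one origin
    request. Followers wait for the leader's result instead of each
    hitting the origin themselves. Illustrative sketch only."""

    def __init__(self):
        self._lock = threading.Lock()
        self._inflight = {}   # key -> (done_event, result_box)

    def do(self, key, fetch):
        with self._lock:
            entry = self._inflight.get(key)
            if entry is None:
                # First caller for this key becomes the leader.
                entry = (threading.Event(), {})
                self._inflight[key] = entry
                leader = True
            else:
                leader = False
        done, box = entry
        if leader:
            try:
                box["result"] = fetch()
            finally:
                with self._lock:
                    del self._inflight[key]
                done.set()    # wake everyone waiting on this key
        else:
            done.wait()
        return box["result"]
```

If five requests for the same file arrive while the cache entry is being refreshed, only one of them reaches the origin; the other four simply wait for and reuse that result.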

Stale means: Staying Online When Your Server Is Not

The last, but by no means least significant, benefit lies in reliability. Things fail, and there's no way around it, but what we can do is try to contain the effects of such failures. By serving from cache and only updating in the background, the edge can continue serving already cached content even when your origin is offline. While not ideal, almost any content is better than no content at all.

By serving stale cache, we can greatly improve the reliability of resources. In fact, with the "offline only" option, stale content can even act as a placeholder when things do fail, while non-cacheable resources are returned normally when they work.
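The fallback behaviour described above can be summarized in a small sketch. The helper names and the dict-based cache are assumptions for illustration; the point is only the decision order: prefer a healthy origin, fall back to the last good copy when it fails.

```python
def serve(key, cache, origin_fetch):
    """Serve a resource, falling back to stale cache when the origin fails.

    `cache` maps key -> last known-good body; `origin_fetch` returns
    (status, body) or raises ConnectionError on network failure.
    Both are illustrative stand-ins, not real CDN internals.
    """
    try:
        status, body = origin_fetch(key)
    except ConnectionError:
        status, body = 503, None
    if 200 <= status < 300:
        cache[key] = body       # remember the latest good copy
        return 200, body
    if key in cache:
        # Origin is down or erroring: any content beats no content.
        return 200, cache[key]
    return status, body
```

A resource that was served successfully at least once keeps being delivered from cache through an outage; only content that was never cached surfaces the origin's error.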

Hop on and try it out!

Stale cache settings are now available to everyone, and we invite you to check them out! It takes just a click to enable them under the Caching settings of your pull zones, and we're excited to see how this can help us build an even faster internet!