Only 0.017% of Websites Are Cacheable? Let's Talk About That

In December 2024, Google Search Central quietly dropped a figure that raised my eyebrows: only 0.017% of crawled websites are cacheable. As someone who regularly configures caching on my own websites, I find this statistic both shocking and deeply concerning. How can the web be so far behind on something so foundational to performance?

Why Caching Matters

Caching is one of the simplest and most effective ways to improve website performance and reduce server load. When properly configured, caching allows repeat visitors—or even Google's crawlers—to access resources without needing to re-download the same files over and over. This can:

  • Improve page load speed
  • Reduce bandwidth usage
  • Decrease server costs
  • Boost Core Web Vitals (which affects SEO)

In other words, caching is a win-win for both user experience and search engine performance.

What Does "Cacheable" Mean to Google?

When Google says only 0.017% of pages are cacheable, they're referring to pages that:

  • Have valid HTTP caching headers (e.g. Cache-Control, ETag, Last-Modified)
  • Do not return different content on each request (i.e. are not overly dynamic)
  • Are not blocked from caching via no-store or private directives

This means that even many pages that seem "static" to us may not actually be cacheable to Google or to browsers.
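As a rough sketch of those criteria, here's a simplified check that inspects a response's headers. This is my own heuristic for illustration, not Google's actual crawler logic:

```python
# Simplified cacheability heuristic based on the criteria above.
# An illustration only -- not Google's actual crawler logic.

def is_cacheable(headers: dict) -> bool:
    """Return True if the response headers suggest the page is cacheable."""
    cache_control = headers.get("Cache-Control", "").lower()

    # Explicit opt-outs make the response uncacheable for shared caches.
    if "no-store" in cache_control or "private" in cache_control:
        return False

    # We need at least a freshness hint or a validator to cache usefully.
    has_freshness = "max-age" in cache_control or "Expires" in headers
    has_validator = "ETag" in headers or "Last-Modified" in headers
    return has_freshness or has_validator
```

Run it against your own response headers and you may be surprised how many "static" pages fail.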

Common Reasons Sites Aren't Cacheable

  1. Missing or Incorrect Headers: Developers often forget to add proper Cache-Control headers or use overly conservative defaults.
  2. Overly Dynamic Content: Sites built with CMSs like WordPress, or heavy SPAs, often serve content dynamically without proper variation or cache keys.
  3. Security Overreach: Some developers apply no-store site-wide (or no-cache, which forces revalidation on every request) for fear of leaking private content.
  4. CDN Misconfiguration: Sometimes CDNs override or strip cache headers from origin responses, effectively making content uncacheable.
  5. Legacy Infrastructure: Older servers or monolithic apps may never have been optimized for modern caching strategies.

How You Can Improve Cacheability

If you're building or maintaining websites, here are a few steps to make your content more cache-friendly:

1. Use Cache-Control Wisely

For static assets like images, JavaScript, and CSS, a long-lived immutable policy is usually appropriate:

Cache-Control: public, max-age=31536000, immutable
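One way to apply this is to choose the header per asset type. Here's a minimal sketch; the values are common defaults, not universal rules:

```python
# Pick a Cache-Control value based on the file being served.
# The policies below are common defaults, not universal rules.
CACHE_POLICIES = {
    # Fingerprinted static assets can be cached "forever".
    ".css": "public, max-age=31536000, immutable",
    ".js": "public, max-age=31536000, immutable",
    ".png": "public, max-age=31536000, immutable",
    ".jpg": "public, max-age=31536000, immutable",
    # HTML changes more often: cache briefly and revalidate.
    ".html": "public, max-age=300, must-revalidate",
}

def cache_control_for(path: str) -> str:
    """Return a Cache-Control header value for the given file path."""
    for ext, policy in CACHE_POLICIES.items():
        if path.endswith(ext):
            return policy
    # Safe fallback: caches may store it, but must revalidate each time.
    return "no-cache"
```

One caveat: only use immutable on assets with fingerprinted filenames (e.g. app.3f9a.js), since a year-long cache lifetime otherwise makes updates painful.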

2. Leverage ETags and Last-Modified Headers

These allow for conditional requests and better cache revalidation.

3. Avoid Dynamic Content Where Possible

Or use cache variation (Vary header, cookie-based cache keys) to handle personalization without nuking cacheability.
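To illustrate, a cache that honors Vary keys each stored response on the URL plus only the request headers the server declared. A simplified sketch (real caches also normalize header values):

```python
def cache_key(url: str, request_headers: dict, vary: str) -> tuple:
    """Build a cache key from the URL plus the headers named in Vary.

    Two requests share a cached response only when every header listed
    in Vary matches; all other request headers are ignored.
    """
    varying = tuple(
        (name.strip().lower(), request_headers.get(name.strip(), ""))
        for name in vary.split(",")
        if name.strip()
    )
    return (url,) + varying
```

The narrower the Vary list, the more requests share a cache entry, so vary only on headers that genuinely change the response.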

4. Use a CDN That Respects Cache Headers

Configure your CDN to cache responses correctly and avoid stripping essential headers.

5. Audit With Tools

Use Lighthouse, WebPageTest, or curl to inspect cache headers. For example:

curl -I https://yourwebsite.com
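If you want to automate that check, a few lines of Python can flag the most common problems in curl's raw header output. A rough sketch; a real audit would also examine directive combinations:

```python
# Quick audit of `curl -I` output: flag missing or cache-hostile headers.

def audit_headers(raw: str) -> list:
    """Return human-readable warnings about the response's cache headers."""
    headers = {}
    for line in raw.splitlines():
        if ":" in line:
            name, _, value = line.partition(":")
            headers[name.strip().lower()] = value.strip()

    warnings = []
    if "cache-control" not in headers:
        warnings.append("missing Cache-Control")
    elif "no-store" in headers["cache-control"]:
        warnings.append("Cache-Control contains no-store")
    if "etag" not in headers and "last-modified" not in headers:
        warnings.append("no validator (ETag or Last-Modified)")
    return warnings
```

An empty warning list doesn't guarantee cacheability, but a non-empty one is a clear place to start.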

The Bigger Picture: A Missed Opportunity

If only 0.017% of the web is cacheable, it means we're collectively leaving a massive performance win on the table. It also means Google's crawlers must refetch billions of pages unnecessarily, wasting energy and bandwidth. In a world increasingly concerned with sustainability, this matters more than ever. Google publishes its own documentation on caching best practices, so the guidance is there for anyone willing to apply it.

Final Thoughts

Caching isn’t optional—it’s a baseline best practice. Yet clearly, the industry has work to do. Let's not let this stat go unnoticed. Instead, let's use it as motivation to build a faster, leaner, and more efficient web.

Have you audited your cache headers lately? Maybe it's time to start.
