In December 2024, Google Search Central quietly dropped a figure that raised my eyebrows: only 0.017% of crawled websites are cacheable. As someone who regularly configures caching on my own websites, this statistic feels both shocking and deeply concerning. How can the web be so far behind on something so foundational to performance?
Why Caching Matters
Caching is one of the simplest and most effective ways to improve website performance and reduce server load. When properly configured, caching allows repeat visitors—or even Google's crawlers—to access resources without needing to re-download the same files over and over. This can:
- Improve page load speed
- Reduce bandwidth usage
- Decrease server costs
- Boost Core Web Vitals (which affects SEO)
In other words, caching is a win-win for both user experience and search engine performance.
What Does "Cacheable" Mean to Google?
When Google says only 0.017% of pages are cacheable, they're referring to pages that:
- Have valid HTTP caching headers (e.g. `Cache-Control`, `ETag`, `Last-Modified`)
- Do not return different content on each request (i.e. are not overly dynamic)
- Are not blocked from caching via `no-store` or `private` directives
This means that even many pages that seem "static" to us may not actually be cacheable to Google or to browsers.
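To make those rules concrete, here's a small predicate that checks them against a set of response headers. This is a simplified sketch (the `is_cacheable` helper is hypothetical, not Google's actual logic), and it ignores header-name casing and many real-world directives:

```python
def is_cacheable(headers: dict) -> bool:
    """Rough check: do these response headers permit shared caching?"""
    cc = headers.get("Cache-Control", "").lower()
    directives = {d.strip() for d in cc.split(",") if d.strip()}
    # no-store or private blocks shared caches outright
    if "no-store" in directives or "private" in directives:
        return False
    # A validator enables revalidation; a freshness lifetime enables reuse
    has_validator = "ETag" in headers or "Last-Modified" in headers
    has_freshness = "public" in directives or any(
        d.startswith("max-age=") for d in directives
    )
    return has_validator or has_freshness
```

For example, `is_cacheable({"Cache-Control": "no-store"})` is `False`, while a response carrying only an `ETag` still passes, because a validator alone is enough to allow revalidation.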
Common Reasons Sites Aren't Cacheable
- Missing or Incorrect Headers: Developers often forget to add proper `Cache-Control` headers or use overly conservative defaults.
- Overly Dynamic Content: Sites built with CMSs like WordPress, or heavy SPAs, often serve content dynamically without proper variation or cache keys.
- Security Overreach: Some developers apply `no-store` or `no-cache` headers site-wide for fear of leaking private content.
- CDN Misconfiguration: Sometimes CDNs override or strip cache headers from origin responses, effectively making content uncacheable.
- Legacy Infrastructure: Older servers or monolithic apps may never have been optimized for modern caching strategies.
How You Can Improve Cacheability
If you're building or maintaining websites, here are a few steps to make your content more cache-friendly:
1. Use Cache-Control Wisely
For static assets like images, JS, and CSS:

```
Cache-Control: public, max-age=31536000, immutable
```
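One way to apply this is to pick a policy per asset type at the server or build layer. A minimal sketch, assuming a fingerprinted-asset setup (the `cache_policy` helper and the extension mapping are illustrative, not a standard API); note that `immutable` is only safe when asset URLs change whenever their content changes:

```python
def cache_policy(path: str) -> str:
    """Pick a Cache-Control value for a request path (illustrative)."""
    static_exts = (".js", ".css", ".png", ".jpg", ".svg", ".woff2")
    if path.endswith(static_exts):
        # Fingerprinted static assets: cache for a year, never revalidate
        return "public, max-age=31536000, immutable"
    if path.endswith(".html"):
        # HTML changes often: let caches store it but always revalidate
        return "public, max-age=0, must-revalidate"
    # Everything else: revalidate by default
    return "no-cache"
```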
2. Leverage ETags and Last-Modified Headers
These allow for conditional requests and better cache revalidation.
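Here's roughly how a server uses an `ETag` to answer a conditional request with `304 Not Modified` instead of resending the body. A simplified sketch (the hash-based `make_etag` and the `handle_conditional` helper are illustrative, not any particular framework's API):

```python
import hashlib

def make_etag(body: bytes) -> str:
    """Derive a strong ETag from the response body (one common approach)."""
    return '"' + hashlib.sha256(body).hexdigest()[:16] + '"'

def handle_conditional(request_headers: dict, body: bytes):
    """Return (status, body, headers); 304 when the client's copy is current."""
    etag = make_etag(body)
    if request_headers.get("If-None-Match") == etag:
        # Client already has this version: send headers only, no body
        return 304, b"", {"ETag": etag}
    return 200, body, {"ETag": etag}
```

The first request pays for the full body; every revalidation after that costs only a round trip of headers.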
3. Avoid Dynamic Content Where Possible
Or use cache variation (the `Vary` header, cookie-based cache keys) to handle personalization without nuking cacheability.
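Conceptually, cache variation means the cache key includes the values of the request headers named by `Vary`. A toy sketch (the `cache_key` helper is hypothetical):

```python
def cache_key(url: str, request_headers: dict, vary: list[str]) -> str:
    """Build a cache key from the URL plus the headers listed in Vary."""
    parts = [url] + [
        f"{h.lower()}={request_headers.get(h, '')}" for h in sorted(vary)
    ]
    return "|".join(parts)
```

Two requests for the same URL with different `Accept-Language` values then get separate cache entries, while headers outside `Vary` don't fragment the cache.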
4. Use a CDN That Respects Cache Headers
Configure your CDN to cache responses correctly and avoid stripping essential headers.
5. Audit With Tools
Use Lighthouse, WebPageTest, or curl to inspect cache headers. For example:
```
curl -I https://yourwebsite.com
```

The Bigger Picture: A Missed Opportunity
If only 0.017% of the web is cacheable, it means we're collectively leaving a massive performance win on the table. It also means Google's crawlers must refetch billions of pages unnecessarily, wasting energy and bandwidth. In a world increasingly concerned with sustainability, this matters more than ever. Google Search Central's own documentation explains how to configure HTTP caching for crawling, so there's little excuse not to.
Final Thoughts
Caching isn’t optional—it’s a baseline best practice. Yet clearly, the industry has work to do. Let's not let this stat go unnoticed. Instead, let's use it as motivation to build a faster, leaner, and more efficient web.
Have you audited your cache headers lately? Maybe it's time to start.