NGINX Single Site Caching
A rainy Tuesday morning, 5:20am, wide awake; I’d been listening to it for hours. The house smelt of building work: the air slightly stuffy, a light film of dust settled over the surfaces. I’d made a hot coffee and just opened my laptop to see the day ahead.
The day before, a caching issue had caught me off-guard and brought my entire website down. The second bout of downtime that week, the prior being another Cloudflare outage taking down a third of the internet. I’ve been working to reduce my dependencies on external services, and caching via Cloudflare has been one of them. I knew I needed to move towards a more stable caching mechanism: one that relies on a single service, and can be quickly set up in the same configuration on other servers for a quick redeployment of the site.
I’d considered plugins like WP Rocket, WP Fastest Cache and more, but getting as close to the hardware as possible would increase site speed dramatically, something simple caching plugins don’t provide. Then I’d thought about Redis, a popular tool known to dramatically increase speed. It turns out I’d still opt for FastCGI caching. Apache, NGINX and LiteSpeed all have server-side caching tools, and my preferred platform is NGINX with FastCGI, the server-side technology my own website runs on.
Scouring WP Beginner, SpinupWP, Linode, and then there’s Linuxbabe, the Linux distro user manual of the web, I’d eventually come across the correct setup. I’d always been familiar with the terminal; ever since childhood I’d been able to boot into MS-DOS and launch programs like QuickBASIC, so SSH’ing into the server was a breeze. “<— Welcome to Ubuntu 24.04.3 LTS, Memory usage: 42%, System Load: 0.0 —>”, and now the text cursor continued to blink at me. Alt-tab back to Linuxbabe, I’d found the basic structure of the FastCGI cache and planned a modified implementation to handle the specifics of my server. I should probably have used AI, ChatGPT or similar, but my natural curiosity for seeking out information myself, and for figuring out the technical grit, swept me into a traditional process, something like searching for knowledge in a library.
Defining a Cache Zone and Directory
sudo nano /etc/nginx/sites-available/christophernathaniel.co.uk
Nano, not Vim. Another habit. I’d browsed to the configuration file; according to the documentation I’d need to add a fastcgi_cache_path and a fastcgi_cache_key.
# Place above the server { … } block (these directives belong in the http context, not inside server { })
fastcgi_cache_path /etc/nginx/cache/christophernathaniel levels=1:2 keys_zone=christopher_cache:100m inactive=60m use_temp_path=off;
fastcgi_cache_key "$scheme$request_method$host$request_uri";
I broke it down: fastcgi_cache_path is the cache location, great. That’ll be /etc/nginx/cache/christophernathaniel. Then we have levels defined at 1:2, whose meaning I skip past for now, and then keys_zone, named christopher_cache and sized at 100m, which defines the shared memory zone. An inactive time of 60m: it’ll clear files older than this and fall back on other caching mechanisms, such as a WordPress plugin, as a result – avoiding the need to ‘rm -rf’ the caching directory.
And then fastcgi_cache_key; the documentation gives a pre-configured suggestion for it, but as it makes clear, it’s a string used to index cached pages.
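For what it’s worth, the levels=1:2 part I skipped just controls how cached files are sharded on disk: NGINX hashes the cache key with MD5, uses the hash as the filename, and builds a one-character then two-character subdirectory from the tail of that hash. So a cached page ends up at a path roughly like this – the hash below is purely for illustration:
# levels=1:2 – last hash character, then the next two, then the cached file itself
/etc/nginx/cache/christophernathaniel/c/29/b7f54b2df7773722d382f4809d65029c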
Setting Directory Permissions
sudo mkdir -p /etc/nginx/cache/christophernathaniel
sudo chown -R www-data:www-data /etc/nginx/cache
Creating the cache directory and setting the permissions; without them the server would simply fail to render content that it’s expecting to serve up from the cache. I knew the commands and the user accounts, so I navigated to /etc/nginx/cache, created the files and set their permissions. Great.
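One step the guides tend to breeze past: none of this takes effect until the configuration is checked and NGINX reloaded. On this box that’s the usual pair – assuming systemd, which Ubuntu 24.04 uses:
sudo nginx -t
sudo systemctl reload nginx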
Afterwards I logged in, modified content and loaded a page. The page rendered, but there was a problem: my new content didn’t show. I was logged in as the WordPress administrator, and as with most caching tools I’d expected it not to cache. I’d also not set up any way of clearing cached content, so I’d left myself with two options: wait 60 minutes for the NGINX cache to expire, or go back to sites-available and create custom rules to stop caching.
Skipping Cache for Admin & Dynamic Pages
set $skip_cache 0;
# Never cache POST requests (they have dynamic effects)
if ($request_method = POST) {
    set $skip_cache 1;
}
# Skip caching for any request with a query string (e.g. search or filter URLs)
if ($query_string != "") {
    set $skip_cache 1;
}
# Don’t cache admin or special pages (WordPress admin, login, feeds, etc.)
if ($request_uri ~* "/wp-admin/|/xmlrpc.php|wp-.*.php|^/feed/|/tag/.*|/sitemap.*.(xml|xsl)") {
    set $skip_cache 1;
}
# Skip caching for logged-in users or recent commenters (identified by cookies)
if ($http_cookie ~* "comment_author|wordpress_[a-f0-9]+|wp-postpass|wordpress_logged_in") {
    set $skip_cache 1;
}
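On their own those if blocks only set a variable; nothing honours it until the PHP location block is told to. Mine is wired up roughly like this – the PHP-FPM socket path is a guess for an Ubuntu 24.04 box, so adjust it to whatever your distro ships, and the add_header line is where the X-FastCGI-Cache header you’ll see in a moment comes from:
location ~ \.php$ {
    include snippets/fastcgi-php.conf;
    fastcgi_pass unix:/run/php/php8.3-fpm.sock;   # adjust to your PHP-FPM socket
    # Tie the skip rules to the cache zone defined earlier
    fastcgi_cache christopher_cache;
    fastcgi_cache_valid 200 301 302 60m;
    fastcgi_cache_bypass $skip_cache;
    fastcgi_no_cache $skip_cache;
    # Expose HIT / MISS / BYPASS so curl -I can confirm what's happening
    add_header X-FastCGI-Cache $upstream_cache_status;
}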
The next test would be adding those rules and testing the output. I ran curl from my local machine just to make sure I wasn’t imagining things.
curl -I https://christophernathaniel.co.uk
The response header returned and there it was:
X-FastCGI-Cache: MISS
A miss on the first request, which makes sense. I refreshed again.
X-FastCGI-Cache: HIT
Perfect.
But it didn’t last long. I changed another bit of content, and again, the old version still served. Even logged out, even in incognito, even after a hard refresh. At that point, I knew I needed a manual clear.
The thing about the FastCGI cache is that it’s fast, very fast, but it doesn’t know what you’ve just done inside WordPress. It’s sitting there with its old copy of the page and won’t move unless it’s told to. So I nuked it.
sudo rm -rf /etc/nginx/cache/christophernathaniel/*
Gone. Like it never happened. It felt good, actually. And for the first time, I saw what I’d just changed. My content was there, visible. But the reality is I couldn’t do that every time I changed a paragraph.
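If you wanted a stopgap, the nuke-and-rewarm routine condenses into a couple of lines – the path and URL here are my own, so swap in yours:
sudo rm -rf /etc/nginx/cache/christophernathaniel/*
curl -s -o /dev/null https://christophernathaniel.co.uk   # warm the homepage back up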
I tried the NGINX Helper plugin. You tell it where your cache lives and it listens out for changes. On publish, it purges. On comment, it clears just that post. Not perfect, but better. I still don’t trust it completely.
This whole setup now, it works. The server serves. The cache caches. But the best part? It’s my stack. I know where the files are. I know how to blow them away when they get in the way. There’s something in that.
It’s not finished. I still need to look at Redis. I want to test object caching. Maybe even automate full page cache clears via webhooks. But not today.
Today the site is fast, and I’ve got the morning to myself. The coffee’s gone cold. Rain’s still there.