There’s a category of problem that doesn’t belong in your application and doesn’t warrant a server. Redirects, rewrites, authentication gates, rate limiting, localisation routing — these are infrastructure concerns, not business logic. Handling them inside your app adds latency, cost, and complexity. Handling them with a dedicated server adds something you now have to maintain.
Cloudflare Workers sit in between. They’re small JavaScript functions that run at the edge — meaning on Cloudflare’s global network, as close to your user as possible, before a request ever hits your origin server. No containers. No cold starts. No infrastructure to manage.
What Is a Cloudflare Worker, Actually?
A Worker is a serverless function that intercepts HTTP requests and lets you inspect, modify, rewrite, or answer them outright. It runs on V8 isolates (the same engine that powers Chrome), which means startup is near-instant — typically under a millisecond — and execution happens in over 300 locations worldwide.
You write standard JavaScript (or TypeScript), deploy via Wrangler (Cloudflare’s CLI), and that function is globally distributed immediately.
```javascript
export default {
  async fetch(request) {
    const url = new URL(request.url);
    // Intercept, rewrite, proxy — whatever you need
    return fetch(url.toString());
  }
};
```
The key thing to understand: Workers execute before your origin. They’re not a plugin inside your app — they’re a layer in front of it.
Why Use Them?
Performance. Processing a request at the edge, 30ms from the user, is almost always faster than routing it to an origin server in a single data centre. For tasks that don’t need your database — redirects, header manipulation, auth checks — there’s no reason to pay the latency cost of hitting your origin at all.
Separation of concerns. Infrastructure logic inside your application is technical debt. The more routing, rewriting, and gating you bake into WordPress or Laravel, the harder those apps are to maintain and migrate. Workers let you move that logic out cleanly.
No server overhead. You’re not provisioning instances, managing uptime, or paying for idle compute. Workers only run when there’s a request to handle.
Globalisation without complexity. If you need different behaviour for different regions, Workers give you geolocation data on every request — without building any of that detection into your app.
Real-World Use Cases
URL and Domain Rewriting
This is what sparked this post. On a recent project, we needed to map regional subdomains to path-based URLs on a shared platform:
uk.example.com → example.com/uk
eu.example.com → example.com/eu
us.example.com → example.com/us
So a product URL like uk.example.com/product/widget would transparently serve example.com/uk/product/widget — without a redirect, without touching the origin’s routing logic, and without the user seeing the URL change.
Workers handle this cleanly. You intercept the request, extract the subdomain, rewrite the URL, and proxy it forward. The origin sees a clean path-based request. The user never knows.
This is a common problem for SaaS platforms that want clean regional URLs but run a single codebase with path-based routing. Solving it at the app level means every environment needs the mapping logic. Solving it at the edge means you write it once.
```javascript
export default {
  async fetch(request) {
    const url = new URL(request.url);
    const subdomain = url.hostname.split('.')[0]; // 'uk', 'eu', 'us'
    const regionMappings = ['uk', 'eu', 'us', 'au'];

    if (regionMappings.includes(subdomain)) {
      // Strip the regional subdomain and move it into the path.
      const baseDomain = url.hostname.replace(`${subdomain}.`, '');
      url.hostname = baseDomain;
      url.pathname = `/${subdomain}${url.pathname}`;
      return fetch(url.toString(), request);
    }

    return fetch(request);
  }
};
```
A/B Testing and Feature Flags
Split traffic between variants at the edge without touching your application code. Assign a user to a cohort, set a cookie, rewrite the URL, and proxy to the right version — all before your server sees the request.
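A minimal sketch of what that looks like. The cookie name (`ab_variant`) and the variant path prefix (`/b`) are illustrative choices, not a Cloudflare convention:

```javascript
// Read an existing assignment from the Cookie header, or pick one at random.
function assignVariant(cookieHeader) {
  const match = /(?:^|;\s*)ab_variant=(a|b)/.exec(cookieHeader || '');
  if (match) return match[1];
  return Math.random() < 0.5 ? 'a' : 'b';
}

export default {
  async fetch(request) {
    const variant = assignVariant(request.headers.get('Cookie'));
    const url = new URL(request.url);

    // Serve variant B from a parallel path; the origin just sees a normal request.
    if (variant === 'b') url.pathname = `/b${url.pathname}`;
    const response = await fetch(url.toString(), request);

    // Persist the assignment so the user stays in the same cohort for 30 days.
    const withCookie = new Response(response.body, response);
    withCookie.headers.append('Set-Cookie', `ab_variant=${variant}; Path=/; Max-Age=2592000`);
    return withCookie;
  }
};
```

Because the assignment is made before the origin is contacted, both variants can live behind the same cache and the application never needs to know the experiment exists.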
Authentication Gates
Check a JWT or session cookie at the edge and return a 401 before the request hits your origin. Useful for protecting admin areas, staging environments, or gated content — without adding auth logic to every app behind the domain.
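A sketch of the shape of such a gate. The cookie name (`session`) and the presence-only check are placeholders: a real gate would verify a JWT signature or look the session up in KV rather than just checking the cookie exists:

```javascript
// Placeholder check: does the request carry a non-empty session cookie?
function hasSessionCookie(cookieHeader) {
  return /(?:^|;\s*)session=[^;]+/.test(cookieHeader || '');
}

export default {
  async fetch(request) {
    const url = new URL(request.url);

    // Only gate the admin area; everything else passes straight through.
    if (url.pathname.startsWith('/admin') && !hasSessionCookie(request.headers.get('Cookie'))) {
      return new Response('Unauthorized', { status: 401 });
    }

    return fetch(request);
  }
};
```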
Geolocation-Based Routing
Cloudflare exposes request.cf.country on every request. Route users to region-specific content, apply compliance rules, or show localised pricing — without geolocation libraries inside your app.
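As a sketch: `request.cf.country` is a real Workers field (an ISO 3166-1 alpha-2 code), but the country-to-region mapping and the default region below are illustrative assumptions:

```javascript
// Illustrative mapping from country code to a regional path prefix.
const REGION_PATHS = { GB: '/uk', IE: '/eu', DE: '/eu', FR: '/eu', US: '/us' };

function regionPathFor(country) {
  return REGION_PATHS[country] || '/us'; // assumed fallback region
}

export default {
  async fetch(request) {
    const url = new URL(request.url);

    // request.cf is populated by Cloudflare's runtime; it is undefined elsewhere.
    const country = request.cf && request.cf.country;
    url.pathname = `${regionPathFor(country)}${url.pathname}`;
    return fetch(url.toString(), request);
  }
};
```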
Rate Limiting and Bot Protection
Inspect request patterns at the edge and block or throttle before spam, scrapers, or brute-force attempts reach your origin. Workers integrate with Cloudflare’s KV storage for counters and state.
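A sketch of a fixed-window limiter backed by KV. The binding name (`RATE_KV`), the limit, and the window length are all assumptions for illustration, and because KV is eventually consistent this is a soft limit, not an exact one:

```javascript
const LIMIT = 100;          // requests allowed per window (illustrative)
const WINDOW_SECONDS = 60;  // window length (illustrative)

// Bucket requests into fixed windows so each window gets its own counter key.
function windowKey(ip, nowMs) {
  const window = Math.floor(nowMs / 1000 / WINDOW_SECONDS);
  return `rate:${ip}:${window}`;
}

export default {
  async fetch(request, env) {
    const ip = request.headers.get('CF-Connecting-IP') || 'unknown';
    const key = windowKey(ip, Date.now());

    const count = parseInt((await env.RATE_KV.get(key)) || '0', 10);
    if (count >= LIMIT) {
      return new Response('Too Many Requests', { status: 429 });
    }

    // Expire the counter shortly after its window closes.
    await env.RATE_KV.put(key, String(count + 1), { expirationTtl: WINDOW_SECONDS * 2 });
    return fetch(request);
  }
};
```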
Edge-Side Caching Logic
Modify cache behaviour per request — bypass cache for authenticated users, set custom TTLs based on URL patterns, or serve stale content while revalidating in the background.
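A sketch of the bypass-for-authenticated-users case using the Workers Cache API. The cookie name checked (`session`) is an assumption, and `caches.default` only exists inside the Workers runtime:

```javascript
// Authenticated users get fresh responses; anonymous traffic can be cached.
function shouldBypassCache(request) {
  const cookies = request.headers.get('Cookie') || '';
  return /(?:^|;\s*)session=/.test(cookies);
}

export default {
  async fetch(request, env, ctx) {
    if (shouldBypassCache(request)) {
      return fetch(request);
    }

    const cache = caches.default;
    const cached = await cache.match(request);
    if (cached) return cached;

    const response = await fetch(request);
    // Store a copy without delaying the response to the user.
    ctx.waitUntil(cache.put(request, response.clone()));
    return response;
  }
};
```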
Header Manipulation
Add security headers (CSP, HSTS, X-Frame-Options), strip internal headers before they reach the client, or inject tracking parameters — all without touching your server config.
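A sketch of that pattern. The header values shown are sensible defaults rather than a prescription (CSP in particular needs tuning per site), and the stripped header is just an example:

```javascript
function withSecurityHeaders(response) {
  // Copy the response so its headers become mutable.
  const modified = new Response(response.body, response);

  modified.headers.set('Strict-Transport-Security', 'max-age=31536000; includeSubDomains');
  modified.headers.set('X-Frame-Options', 'DENY');
  modified.headers.set('Content-Security-Policy', "default-src 'self'");

  // Strip internal headers before they reach the client.
  modified.headers.delete('X-Powered-By');
  return modified;
}

export default {
  async fetch(request) {
    const response = await fetch(request);
    return withSecurityHeaders(response);
  }
};
```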
How to Set One Up
Prerequisites: A Cloudflare account with your domain’s DNS managed through Cloudflare. Workers require Cloudflare to be your DNS provider (or at minimum, your traffic to be proxied through them).
1. Install Wrangler
Cloudflare’s CLI handles everything from local development to deployment.
```shell
npm install -g wrangler
wrangler login
```
2. Create a new Worker project
```shell
wrangler init my-worker
cd my-worker
```
This gives you a basic project structure with a wrangler.toml config file and a src/index.js entry point.
3. Write your Worker
Edit src/index.js. The fetch handler receives every request matched by your Worker’s route.
4. Configure routes
In wrangler.toml, define which requests trigger your Worker:
```toml
name = "my-worker"
main = "src/index.js"
compatibility_date = "2024-01-01"

[[routes]]
pattern = "*.example.com/*"
zone_name = "example.com"
```
Routes support wildcards, so you can match subdomains, specific paths, or everything.
5. Test locally
```shell
wrangler dev
```
Wrangler spins up a local environment that mirrors production closely. You can test request handling before touching live traffic.
6. Deploy
```shell
wrangler deploy
```
That’s it. Your Worker is live globally within seconds.
What Does It Cost?
Cloudflare Workers pricing is one of the better deals in cloud infrastructure.
Free tier:
- 100,000 requests per day
- 10ms CPU time per request
- Up to 30 Workers
For most small projects and internal tooling, this is enough.
Workers Paid Plan — $5/month:
- 10 million requests included
- 30ms CPU time per request (extensible)
- $0.30 per additional million requests after that
For context: a busy e-commerce site making 10 requests per page load, with 500 daily visitors browsing around 30 pages each, generates roughly 150,000 requests a day. The paid plan comfortably handles that with room to spare.
CPU time limits are worth understanding. The free tier’s 10ms is for CPU execution only — wall-clock time (waiting for a fetch to your origin) doesn’t count against it. So a Worker that rewrites a URL and proxies a request uses almost no CPU time, even if the origin takes 200ms to respond.
Workers KV (key-value storage for stateful Workers like rate limiting): $0.50/million reads, $5/million writes, with generous free tier allowances.
When Workers Are the Wrong Tool
Workers are not application servers. They’re not the right place for complex business logic, database queries (though Cloudflare D1 is changing this), or anything that needs more than a few milliseconds of CPU time per request.
If you find yourself building significant logic into a Worker, it’s usually a sign that the logic belongs in your application — or that you need a proper backend service, not an edge function.
They’re also only useful if Cloudflare is handling your DNS/proxy. If your domain isn’t running through Cloudflare, you can’t use Workers without migrating it.
The Broader Point
The shift toward edge computing isn’t a trend — it’s a correction. Too much logic ended up in monolithic applications because the infrastructure to handle it elsewhere didn’t exist, or was too complex to justify. Workers remove that justification.
Routing, rewriting, auth gates, header manipulation — these don’t belong inside your WordPress or Laravel codebase. They belong at the network layer, running close to users, decoupled from your application’s release cycle.
If you’re not already thinking about what edge logic could sit in front of your web projects, it’s worth starting.
Want to talk through whether Cloudflare Workers could simplify something on your platform? Get in touch today.
Curtis Williams
Managing Director, Agitate Digital
Curtis' Bio:
I'm the MD of Agitate Digital, a performance-obsessed web and software agency based in Bournemouth. With a background leading marketing teams, I built Agitate to be the technical partner I wished I'd had — one that thinks in business outcomes, not just deliverables. I specialise in high-performance WordPress and Laravel builds, and I'm increasingly focused on how AI is changing what's possible for growing businesses online.