Zero-Server Analytics: How I Replaced a SaaS Bill with Netlify Functions and GitHub
Every analytics SaaS wants $20-50/month, a tracking pixel, and a cookie banner. For a content site that just needs to know what's getting read and where traffic comes from, that's a bad tradeoff - you're paying recurring fees and adding third-party dependencies for a dashboard you check once a week.
So I built a pipeline that runs itself: Netlify Functions collect events, Blob Store queues them, a scheduled function rolls them up into JSON snapshots, and GitHub stores the results in version control. Total monthly cost: zero. Total maintenance: also zero. If you read How My Automated Analytics Reports Work, you already know the high-level story; this is the implementation walkthrough.
The business case in 30 seconds
I needed analytics that were private (no third-party cookies), cheap (no monthly SaaS), portable (data I own in a format I control), and automatic (nothing to remember to run). The constraint was that I'm a one-person operation - I can't afford to babysit infrastructure. Anything I build has to work unattended or it doesn't ship.
Architecture overview
- A lightweight client script (`static/analytics.js`) collects page-view events with privacy-friendly payloads.
- A Netlify Function (`collect-analytics.js`) validates each payload and writes it into Netlify Blob Store.
- A scheduled Netlify Function (`rollup-analytics.js`) runs daily at 5 AM, drains the previous day's blobs, builds a summary, commits the result as a JSON file to GitHub, and deletes processed events.
- Manual runs (`npm run rollup:trigger -- --date YYYY-MM-DD`) hit the same function with a token, so I can generate snapshots on demand.
No databases, no third-party analytics services, no dashboards to maintain. Just serverless functions and a Git repo.
Event collection: minimal client footprint
The client script grabs the page URL, referrer, UTM params, a hashed visitor ID, and performance metrics, then ships the payload to the collect function. It uses `navigator.sendBeacon` when available so the request doesn't block navigation:
```js
async function sendAnalytics(payload) {
  try {
    const body = JSON.stringify(payload);
    // Prefer sendBeacon: the browser queues it and it survives navigation.
    if (navigator.sendBeacon) {
      const blob = new Blob([body], { type: 'application/json' });
      if (navigator.sendBeacon(ANALYTICS_ENDPOINT, blob)) return;
    }
    // Fall back to fetch when sendBeacon is unavailable or refuses the payload.
    await fetch(ANALYTICS_ENDPOINT, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body,
    });
  } catch (err) {
    // Analytics must never break the page: log and move on.
    console.warn('analytics send failed', err);
  }
}
```
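The payload itself is just a flat object. Here's a rough sketch of how the client might assemble it - the field names are illustrative and `visitorId()` is a hypothetical helper, not the exact shape my script uses:

```js
function buildPayload() {
  const params = new URLSearchParams(window.location.search);
  const [nav] = performance.getEntriesByType('navigation'); // may be empty in old browsers
  return {
    path: window.location.pathname,
    referrer: document.referrer || null,
    utm: {
      source: params.get('utm_source'),
      medium: params.get('utm_medium'),
      campaign: params.get('utm_campaign'),
    },
    visitor: visitorId(), // hypothetical helper; re-hashed server-side with ANALYTICS_SALT
    perf: nav
      ? { ttfb: nav.responseStart, domContentLoaded: nav.domContentLoadedEventEnd }
      : null,
  };
}
```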
Because the script lives in `static/`, the build process copies it straight to `dist/static/` - what I debug locally is what runs in production. No transpiler, no bundler surprises. If you're using HTMX for navigation (as I describe in the architecture piece), you can add an `htmx:afterSwap` listener to emit virtual page views for client-side transitions, along the lines of the sketch below.
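A minimal version of that listener, assuming the `sendAnalytics` and `buildPayload` helpers above (the `virtual` flag is my own labeling convention, not an HTMX feature):

```js
// htmx fires htmx:afterSwap on the target element after new content lands,
// and the event bubbles, so one listener on <body> covers every swap.
document.body.addEventListener('htmx:afterSwap', () => {
  sendAnalytics({ ...buildPayload(), virtual: true });
});
```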
Collect function: validate, salt, stash
The collect function is under 200 lines. It rejects oversized payloads and ones missing required fields, salts the visitor hash with `ANALYTICS_SALT` to prevent reverse-engineering, and stores each event as JSON in Blob Store:
```js
import { getStore } from '@netlify/blobs';

const store = getStore('analytics-events');
await store.set(`queue/${date}/${crypto.randomUUID()}.json`, entry);
```
Blob Store gives me append-only semantics without provisioning a database. Events are organized by date, which makes the daily rollup straightforward. This is the kind of infrastructure decision that matters when you're a small team: pick the simplest storage model that works, and move on.
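The salting itself is a one-liner with Node's crypto module. A minimal sketch, assuming the client sends a raw visitor identifier and `ANALYTICS_SALT` is set in the environment:

```js
import { createHash } from 'node:crypto';

// Same input + same salt => same hash, so visitors stay countable
// across events without storing anything reversible.
function saltedVisitorId(rawId) {
  return createHash('sha256')
    .update(`${process.env.ANALYTICS_SALT}:${rawId}`)
    .digest('hex');
}
```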
Rollup function: the daily crunch
The rollup handler detects whether Netlify triggered it on schedule (via the `x-netlify-event: schedule` header) or I called it manually with a shared secret. Scheduled runs default to "yesterday"; manual runs accept a date override.
```js
import { schedule } from '@netlify/functions';

export const handler = schedule('0 5 * * *', rollupHandler);
```
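The mode check sits at the top of `rollupHandler`. A sketch of the idea - the `x-rollup-token` header name and the `yesterday()` helper are illustrative assumptions, not the exact names in my code:

```js
function resolveRun(event) {
  const headers = event.headers || {};
  // Netlify sets this header on cron invocations.
  if (headers['x-netlify-event'] === 'schedule') {
    return { runType: 'scheduled', date: yesterday() };
  }
  // Manual runs must present the shared secret and may override the date.
  const token = headers['x-rollup-token'];
  if (token && token === process.env.MANUAL_ROLLUP_TOKEN) {
    const { date } = JSON.parse(event.body || '{}');
    return { runType: 'manual', date: date || yesterday() };
  }
  throw new Error('unauthorized rollup invocation');
}
```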
Inside the handler:
- Load every blob under `queue/{targetDate}/`.
- Build summary stats - top paths, referrers, device mix, performance percentiles (see the sketch after this list).
- Serialize the dataset and commit it to GitHub via the REST API.
- Delete the processed blobs.
File names include the run type so manual spot-checks never overwrite scheduled reports:
```js
const suffix = runType === 'manual' ? '-manual' : '-scheduled';
const path = `analytics/${year}/${month}/${day}/analytics${suffix}.json`;
```
Why GitHub as the data store?
This is the part that surprises people, but it makes total business sense:
- Auditability. Every snapshot is version-controlled. I can diff analytics between days just like I diff code.
- Portability. The data is JSON in a Git repo. If I ever want to move platforms, switch tools, or hand the data to a client, there's nothing to export - it's already in an open format.
- Cost. GitHub storage is effectively free at this scale. Netlify Blob Store handles retention automatically once the rollup clears processed events.
- Operational simplicity. My entire site already deploys from GitHub. Keeping analytics alongside the codebase means one fewer system to think about.
For the broader hosting and deployment story, see How This Blog Works.
Authentication and safety rails
- `GITHUB_TOKEN` with `repo` scope, plus `GITHUB_ANALYTICS_REPO` and `GITHUB_ANALYTICS_BRANCH`, all live in Netlify's environment variables. `MANUAL_ROLLUP_TOKEN` is stored both in Netlify and my local shell for on-demand runs.
- The rollup function bails early if it can't detect a scheduled invocation and the manual token is missing or wrong.
- If GitHub returns anything other than `200` or `201`, the function logs the payload and rethrows so Netlify marks the invocation as failed - easy to spot, easy to debug.
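The commit itself goes through GitHub's Contents API, which creates or updates one file per request. A trimmed sketch - the commit message format is an assumption, and since snapshot paths are date-unique, the create case (no `sha` needed) is the normal one:

```js
async function commitSnapshot(path, data) {
  const repo = process.env.GITHUB_ANALYTICS_REPO; // e.g. "owner/repo"
  const res = await fetch(`https://api.github.com/repos/${repo}/contents/${path}`, {
    method: 'PUT',
    headers: {
      Authorization: `Bearer ${process.env.GITHUB_TOKEN}`,
      Accept: 'application/vnd.github+json',
    },
    body: JSON.stringify({
      message: `analytics: ${path}`,
      content: Buffer.from(JSON.stringify(data, null, 2)).toString('base64'),
      branch: process.env.GITHUB_ANALYTICS_BRANCH,
    }),
  });
  // 201 = created, 200 = updated; anything else fails the invocation.
  if (res.status !== 200 && res.status !== 201) {
    console.error('GitHub commit failed', await res.text());
    throw new Error(`GitHub returned ${res.status}`);
  }
}
```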
Testing the pipeline
`npm run rollup:trigger -- --date 2025-12-05` hits the production function, passing the token and an optional date. It exercises the same code path as the scheduled run, which makes it perfect for verifying environment variables after a deploy.
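Under the hood that npm script is just an authenticated POST to the deployed function. A sketch of the equivalent call, reusing the hypothetical `x-rollup-token` header from earlier (the domain is a placeholder):

```js
// scripts/trigger-rollup.mjs (illustrative name)
const res = await fetch('https://example.netlify.app/.netlify/functions/rollup-analytics', {
  method: 'POST',
  headers: { 'x-rollup-token': process.env.MANUAL_ROLLUP_TOKEN },
  body: JSON.stringify({ date: process.argv[2] }), // e.g. "2025-12-05"
});
console.log(res.status, await res.text());
```

Either way, the result is a snapshot committed to the repo that looks like this (trimmed):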
```json
{
  "meta": {
    "date": "2025-12-05",
    "runType": "manual",
    "events": 63,
    "uniqueVisitors": 41
  },
  "traffic": { ... },
  "performance": { ... },
  "events": [ ... ]
}
```
Every JSON snapshot includes the raw events array so I can backfill, audit, or replay if I ever need to.
What's next
- Alerting. Telegram notifications when traffic dips or p95 LCP spikes - the hooks already exist in the codebase.
- Lightweight dashboard. A client-side page that reads the JSON snapshots directly from the repo - no backend needed.
- Multi-site reuse. The same pipeline pointed at different GitHub targets for client sites. The components are already parameterized.
The takeaway
If you're running a content site or a small SaaS marketing page, you probably don't need a $50/month analytics product. Netlify Functions + Blob Store + GitHub gives you a pipeline that costs nothing, runs unattended, and stores data in a format you'll never have to migrate out of. Start with the collect function, wire up the rollup, run the manual trigger once to prove it works, and move on to the stuff that actually grows the business.