Why I Built My Own Analytics Pipeline (And What It Actually Costs)

Every analytics platform I tested wanted $20-50 per month and a script my readers never agreed to. Free tiers claimed ownership of the data and still loaded third-party JavaScript. That tradeoff makes no sense for a lean site that sees a few thousand thoughtful visitors per month.

So I spent a weekend building an analytics stack that rides on Netlify functions, writes JSON into the repo, and keeps visitor data inside my own infrastructure. Zero recurring cost, zero vendor lock-in, and enough fidelity to run content experiments.

Decision guardrails

Before touching code I set three rules:

  1. Privacy first: anonymous session IDs, no cookies, no fingerprinting.
  2. Costs must round down to zero by piggybacking on the existing Netlify deployment.
  3. Data must live in Git so history, audits, and diffing come for free.

Once those constraints were clear, the build sprint collapsed into one ingestion function, one nightly rollup, and one folder of dated JSON files.

How the pipeline runs

Collection. Each page view ships a tiny payload: page path, referrer, device class, Core Web Vitals, and a random session token. It travels via fetch() to the same origin, so ad blockers and tracking protection rarely intercept it.
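A minimal sketch of what that client-side beacon could look like. The function name, endpoint path, and field names are illustrative, not the exact code from my repo:

```javascript
// Build the per-pageview payload: path, referrer, coarse device class,
// and a random, non-persistent session token (no cookies, no fingerprinting).
function buildPayload(path, referrer, width) {
  return {
    path,                                               // page path, not the full URL
    referrer: referrer || "direct",
    device: width < 768 ? "mobile" : "desktop",          // coarse device class only
    session: Math.random().toString(36).slice(2, 10),    // random token, dies with the tab
    ts: Date.now(),
  };
}

// In the browser this fires on every page view. Same-origin endpoint,
// so there is no third-party domain for blockers to match against.
if (typeof document !== "undefined") {
  const payload = buildPayload(location.pathname, document.referrer, innerWidth);
  fetch("/.netlify/functions/collect", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload),
    keepalive: true, // let the request finish even if the reader navigates away
  });
}
```

Core Web Vitals ride along the same way once the `web-vitals` callbacks fire; I left them out here to keep the sketch short.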

Ingestion. A Netlify function receives every event and appends it to Netlify's key-value store. Think of it as an append-only queue with automatic encryption and retention.
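Assuming Netlify Blobs as that key-value store, the ingestion side reduces to picking a time-ordered key per event so the nightly rollup can list and drain a day's slice in order. The store name and key scheme below are my own conventions, not requirements:

```javascript
// Time-ordered, collision-resistant key for one event,
// e.g. "events/2024-06-01T05:00:00.000Z-a1b2c3d4".
function eventKey(isoTimestamp, sessionToken) {
  return "events/" + isoTimestamp + "-" + sessionToken;
}

// Inside the Netlify function handler, the append-only write would look
// roughly like this (Netlify encrypts blobs at rest):
//
//   const { getStore } = await import("@netlify/blobs");
//   const event = await request.json();
//   const key = eventKey(new Date().toISOString(), event.session);
//   await getStore("analytics").setJSON(key, event);
//   return new Response(null, { status: 204 });
```

Returning 204 with no body keeps the beacon response as small as the request.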

Rollup. At 05:00 UTC a scheduled function drains the queue, aggregates the day, and commits analytics/YYYY/M/D/analytics-scheduled.json. Manual triggers run the same script and tag the file analytics-manual.json so I can spot ad-hoc pulls.

Each JSON report carries meta (date, count, trigger), traffic highlights, performance slices, and a capped set of raw events for debugging. Because Git stores the lot, I can diff traffic weeks later without logging into another dashboard.
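The rollup itself can be sketched as a pure function over the drained events. The field names below mirror the report shape described above (meta, traffic, performance, capped raw events) but are my own guesses, not the exact schema:

```javascript
// Aggregate one day's drained events into the report that gets committed
// as a dated JSON file. rawCap bounds the debugging sample so the repo
// never accumulates unbounded raw data.
function rollup(events, date, trigger, rawCap = 50) {
  const pages = {};
  let lcpSum = 0;
  let lcpCount = 0;
  for (const e of events) {
    pages[e.path] = (pages[e.path] || 0) + 1;     // page-view counts per path
    if (typeof e.lcp === "number") {              // one Core Web Vital as an example
      lcpSum += e.lcp;
      lcpCount++;
    }
  }
  return {
    meta: { date, count: events.length, trigger }, // trigger: "scheduled" or "manual"
    traffic: { pages },
    performance: { avgLcpMs: lcpCount ? Math.round(lcpSum / lcpCount) : null },
    raw: events.slice(0, rawCap),                  // capped sample for debugging
  };
}
```

The scheduled function runs this over the day's queue and writes the result into the dated path; a manual trigger calls the same function with a different trigger label.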

Business impact

At this scale, the homegrown stack beats SaaS outright: zero subscription fees, zero third-party scripts for readers to distrust, and every data point versioned in Git under my own control.

Next moves

Now that the raw data lands in Git, I can bolt on Telegram alerts for traffic spikes, publish transparent monthly digests, or feed the numbers into KLETO’s ops dashboards. Shipping content stays the priority, but the instrumentation now grows with me instead of billing me.

If you already run Netlify, roll your own analytics before you reach for another subscription. You will learn more about your own stack, keep your readers' trust, and stop paying for dashboards you barely open.
