HAL: Cutting 100-300KB of JavaScript by Moving Routing to Build Time


Every kilobyte of JavaScript you ship has a cost: download time, parse time, compile time, and failure modes. For content sites, the single biggest chunk of unnecessary JS is usually the client-side router -- and the framework runtime it requires.

HAL (HTMX Auto-Linking) is a build-time transform that eliminates that cost entirely. It rewrites internal links during the build so navigation happens via HTMX fragment swaps, with zero incremental runtime JavaScript. Here's what that looks like in practice and why it matters.

The cost you're actually paying

If you're running a typical SPA stack for a content site, your routing layer alone costs:

Component                                     Gzipped size
SPA router (React Router, Vue Router, etc.)   15-40KB
Framework runtime (React, Vue, etc.)          80-200KB
Total routing infrastructure                  100-300KB

That's 100-300KB of JavaScript your visitors download, parse, and execute before they see a single word of content. On a 3G connection, that's 2-5 seconds of delay. On a fast connection, it's still unnecessary work the browser has to do.

HAL's cost: 0KB incremental runtime. It's a build-time transform on your links, paired with HTMX (~14KB gzipped) that you're already shipping for content swaps. The router disappears from the bundle entirely.

How it works

At build time, HAL scans internal links and rewrites them to use HTMX attributes:

<!-- Before (authored) -->
<a href="/article/getting-started-with-web-components">Web Components Guide</a>

<!-- After (built) -->
<a href="/article/getting-started-with-web-components"
   hx-get="/article/getting-started-with-web-components/fragment.html"
   hx-push-url="/article/getting-started-with-web-components"
   hx-target="#main-content"
   hx-swap="innerHTML">Web Components Guide</a>

The authored HTML stays clean. The build step adds the HTMX attributes. Navigation feels instant (fragment swap, no page reload), but there's no client-side router to download, initialize, or debug.
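A minimal sketch of such a build step (a hypothetical implementation for illustration -- not HAL's actual source) might look like:

```javascript
// Hypothetical HAL-style build step: rewrite site-relative links to HTMX swaps.
// Function name and regex details are illustrative, not HAL's real implementation.
function halTransform(html) {
  return html.replace(/<a\b[^>]*>/g, (tag) => {
    // Only site-relative hrefs ("/path", not "https://..." or "//host")
    const href = tag.match(/href="(\/(?!\/)[^"]*)"/);
    if (!href || tag.includes("data-no-htmx")) return tag; // external or opted out
    return tag.replace(
      /href="[^"]*"/,
      `href="${href[1]}" ` +
        `hx-get="${href[1]}/fragment.html" ` +
        `hx-push-url="${href[1]}" ` +
        `hx-target="#main-content" ` +
        `hx-swap="innerHTML"`
    );
  });
}
```

Because this runs once at build time, the cost is paid on the build machine, not in every visitor's browser.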

For the full architecture context, see How This Blog Works. For the HTMX progressive enhancement playbook, see HTMX and Progressive Enhancement.

What you get

Faster pages. Less JavaScript means a faster Time to Interactive, faster LCP, and less time spent parsing and compiling code. Google measures this. Users feel it.

Fewer failure modes. No client router state to sync. No hydration mismatches. No "the page loaded but navigation is broken because the JS bundle failed." If JavaScript fails entirely, links still work -- they just do full page loads instead of fragment swaps. That's progressive enhancement doing its job.

Better SEO. Full HTML pages exist for every route. Search engines see complete content on every URL. No rendering gymnastics required.

Simpler debugging. When navigation breaks in an SPA, you're debugging router state, history API interactions, and framework lifecycle hooks. When HAL navigation breaks, you're looking at an HTML attribute. The complexity surface is dramatically smaller.

The opt-out escape hatch

Not every link should do a fragment swap. External links, download links, and demo pages that need full page loads can opt out:

<!-- Full page load, no HTMX -->
<a href="/article/htmx-auto-linking-hal" data-no-htmx>HAL intro (full page)</a>

The data-no-htmx attribute tells the build step to leave the link alone. Simple, explicit, no magic.
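The decision logic fits in a few lines. This is a hypothetical sketch: the data-no-htmx attribute is from the article, but the helper name is made up.

```javascript
// Hypothetical predicate a build step could apply per <a> tag:
// rewrite only site-relative links that haven't opted out.
function shouldRewrite(anchorTag) {
  const internal = /href="\/(?!\/)/.test(anchorTag); // "/path", not "//host" or "https://..."
  const optedOut = anchorTag.includes("data-no-htmx");
  return internal && !optedOut;
}
```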

The business tradeoff

The question isn't "is HAL technically clever?" It's "what does this save me?"

For content sites, marketing sites, and documentation sites -- anything where the primary job is serving pre-rendered content -- this is a straightforward win. For complex applications with authenticated routes, deep linking into app state, and complex navigation guards, a proper client-side router earns its weight. Pick the right tool for the problem.
