# ADR-003: Explicit Caching
## Status

Accepted
## Context
Caching is one of the most impactful performance optimizations a framework can provide — and one of the hardest to debug when it goes wrong. Existing frameworks like Next.js cache aggressively by default: fetch() calls are cached, page renders are cached, and data is revalidated on opaque schedules. This implicit behavior leads to a well-documented class of bugs where developers serve stale data without realizing it, struggle to invalidate caches, and spend time fighting the framework rather than building features.
Catmint's design principle — "explicit over implicit: no hidden caching, no magic re-renders" — demands a caching model where every cached response is the result of a deliberate developer decision.
## Decision
Nothing is cached by default in Catmint. Every page render, endpoint response, and server function call executes fresh unless the developer explicitly opts in to caching using one of the provided APIs:
### `cachedRoute(handler, options)`
Wraps a page or endpoint to enable time-based caching with tag-based invalidation:
```tsx
import { cachedRoute } from 'catmint/cache'

export default cachedRoute(async function Page() {
  const data = await fetchExpensiveData()
  return <div>{data}</div>
}, {
  tag: ['homepage', 'featured'],
  revalidate: 3000, // seconds
})
```
Options:
```ts
interface CacheOptions {
  tag?: string[]                 // Cache tags for targeted invalidation
  revalidate?: number            // Seconds before the cache is considered stale
  staleWhileRevalidate?: boolean // Serve stale content while revalidating (default: true)
}
```
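To make the semantics of these options concrete, here is a hypothetical in-memory sketch of a `cachedRoute`-style wrapper. It is not Catmint's actual implementation (which would derive the cache key from the route and coordinate with the CDN); the explicit `key` parameter and the `Map`-backed store are simplifications for illustration.

```typescript
interface CacheOptions {
  tag?: string[]
  revalidate?: number            // seconds before an entry is considered stale
  staleWhileRevalidate?: boolean // default: true
}

interface CacheEntry<T> {
  value: T
  storedAt: number // epoch milliseconds
  tags: string[]
}

const cache = new Map<string, CacheEntry<unknown>>()

// Wrap a handler so repeated calls within `revalidate` seconds reuse the
// cached entry; stale entries are served while refreshing if SWR is enabled.
function cachedRoute<T>(key: string, handler: () => Promise<T>, opts: CacheOptions) {
  const ttlMs = (opts.revalidate ?? 0) * 1000
  const swr = opts.staleWhileRevalidate ?? true
  return async function (): Promise<T> {
    const entry = cache.get(key) as CacheEntry<T> | undefined
    const fresh = entry !== undefined && Date.now() - entry.storedAt < ttlMs
    if (entry && fresh) return entry.value
    if (entry && swr) {
      // Serve the stale value immediately; refresh in the background.
      void handler().then(value =>
        cache.set(key, { value, storedAt: Date.now(), tags: opts.tag ?? [] }))
      return entry.value
    }
    const value = await handler()
    cache.set(key, { value, storedAt: Date.now(), tags: opts.tag ?? [] })
    return value
  }
}
```

Because nothing enters the `Map` unless a route is wrapped, the "nothing is cached by default" guarantee falls out of the design rather than requiring an opt-out flag.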
### `staticRoute(handler, options?)`
Marks a page for full static pre-rendering at build time:
```tsx
import { staticRoute } from 'catmint/cache'

export default staticRoute(function AboutPage() {
  return <div>About us</div>
})
```
For dynamic routes, a `paths` function generates the set of params to pre-render:
```tsx
export default staticRoute(async function BlogPost({ params }) {
  const post = await fetchPost(params.slug)
  return <article>{post.content}</article>
}, {
  paths: async () => {
    const posts = await fetchAllPosts()
    return posts.map(p => ({ slug: p.slug }))
  },
})
```
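At build time, the framework has to turn each params object returned by `paths` into a concrete URL to pre-render. The following is an illustrative sketch of that expansion step, not Catmint's actual build code; the `expandStaticPaths` name and `[bracket]` substitution logic are assumptions based on the route syntax shown above.

```typescript
// Substitute each params object into the route's [bracket] segments to
// produce the concrete URLs the build step would pre-render.
function expandStaticPaths(
  routePattern: string,
  paramSets: Array<Record<string, string>>,
): string[] {
  return paramSets.map(params =>
    routePattern.replace(/\[(\w+)\]/g, (_match, name: string) => {
      const value = params[name]
      if (value === undefined) {
        throw new Error(`missing param "${name}" for ${routePattern}`)
      }
      return value
    }),
  )
}
```

For example, `expandStaticPaths('/blog/[slug]', [{ slug: 'hello' }])` yields `['/blog/hello']`, which the build would render to a static file.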
### `invalidateCache(options)`
Explicitly purges cached content by tag or route:
```ts
import { invalidateCache } from 'catmint/cache'

await invalidateCache({ tag: 'homepage' })
await invalidateCache({ route: '/blog/[slug]' })
```
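Tag-based purging implies some index from tags to the entries that declared them. This is a hypothetical sketch of that bookkeeping, again with an in-memory `Map` standing in for whatever store the framework actually uses; `putCached` is an invented helper for the example.

```typescript
// Each cached entry records the tags it was stored with, so a purge by tag
// can delete exactly the entries that declared that tag.
const store = new Map<string, { body: string; tags: string[] }>()

function putCached(route: string, body: string, tags: string[] = []) {
  store.set(route, { body, tags })
}

// Purge by exact route, by tag, or both.
function invalidateCache(opts: { tag?: string; route?: string }) {
  if (opts.route) store.delete(opts.route)
  if (opts.tag) {
    for (const [route, entry] of store) {
      if (entry.tags.includes(opts.tag)) store.delete(route)
    }
  }
}
```

Purging by tag removes every route that opted into that tag while leaving unrelated entries untouched, which is what makes tags suitable for targeted invalidation (e.g. purging `'homepage'` after editorial changes without flushing the blog).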
### Deployment integration
Cache metadata is exposed to deployment infrastructure via HTTP headers:
| Signal | HTTP Header |
|---|---|
| `revalidate` | `Cache-Control: s-maxage=<value>, stale-while-revalidate` |
| `tag` | `Cache-Tag: <tag1>, <tag2>` |
| `staticRoute()` | Pre-rendered as static files at build time |
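The mapping in the table above is a pure function of the cache options, and could be sketched roughly as follows. The header formats come from the ADR; the `buildCacheHeaders` function itself is illustrative, not Catmint's real code.

```typescript
interface CacheOptions {
  tag?: string[]
  revalidate?: number
  staleWhileRevalidate?: boolean // default: true
}

// Translate CacheOptions into the HTTP headers the CDN consumes.
function buildCacheHeaders(opts: CacheOptions): Record<string, string> {
  const headers: Record<string, string> = {}
  if (opts.revalidate !== undefined) {
    const swr = opts.staleWhileRevalidate ?? true
    headers['Cache-Control'] =
      `s-maxage=${opts.revalidate}` + (swr ? ', stale-while-revalidate' : '')
  }
  if (opts.tag && opts.tag.length > 0) {
    headers['Cache-Tag'] = opts.tag.join(', ')
  }
  return headers
}
```

An uncached route produces no headers at all, so the CDN sees nothing cacheable — the explicit-by-default guarantee extends to the edge.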
## Rationale
- Debuggability. When caching is explicit, developers can trace exactly which routes are cached, for how long, and with what tags. There is no hidden cache layer to investigate.
- No stale data surprises. A route that doesn't use `cachedRoute()` or `staticRoute()` always returns fresh data. Developers never need to wonder whether they're seeing a cached response.
- Intentional invalidation. Cache invalidation is performed explicitly via `invalidateCache()` with specific tags or routes. There is no background revalidation that fires on unknown schedules.
- Composable with deployment infrastructure. The `Cache-Control` and `Cache-Tag` headers generated by `cachedRoute()` integrate directly with CDN edge caching (Vercel, Cloudflare, etc.) without framework-specific deployment hooks.
- Progressive adoption. Developers start with uncached routes (the simplest mental model) and add caching only when they identify performance bottlenecks, rather than starting with implicit caching and trying to opt out.
## Alternatives Considered
- Implicit caching with opt-out (Next.js approach). Next.js caches `fetch()` responses and page renders by default, requiring developers to opt out with `cache: 'no-store'` or `revalidate: 0`. While this provides good performance out of the box, it creates a debugging burden: developers must understand the caching behavior to avoid serving stale data. The Next.js community has extensively documented frustrations with this model.
- Automatic Incremental Static Regeneration (ISR). Automatically revalidating static pages on a timer is convenient but opaque. Developers cannot easily predict when a page will be regenerated, and cache invalidation requires framework-specific APIs tied to the deployment platform rather than standard HTTP semantics.
- Per-fetch caching. Caching individual `fetch()` calls (as Next.js does) provides fine-grained control but scatters caching decisions across the codebase. Catmint caches at the route level instead, which is coarser but easier to reason about: a route is either cached or it isn't.
## Consequences
Positive:
- Zero caching surprises. Developers always know whether a response is cached by looking at the route's export.
- Cache invalidation is explicit and auditable: `invalidateCache()` calls are visible in the code and in logs.
- The caching model maps directly to HTTP caching semantics (`Cache-Control`, `Cache-Tag`), making it portable across deployment platforms.
- Easier to write tests: uncached routes return deterministic responses without needing to clear caches between test runs.
Negative:
- More boilerplate for common patterns. A page that could be cached with zero effort in Next.js requires wrapping with `cachedRoute()` in Catmint.
- Developers who forget to add caching may see worse performance than they would with an implicitly-cached framework. This is a deliberate tradeoff: Catmint prefers correctness over accidental performance.
- The framework cannot automatically optimize performance based on usage patterns, since all caching decisions are manual.
