
Introducing the Netlify Cache API

Netlify’s global caching infrastructure has been helping millions of developers build web applications with best-in-class performance and maximum flexibility. It has democratized features like cache tags, introduced full control over cache keys for optimal hit ratios, and pioneered a globally persisted caching layer that works with any framework (or without one), always with a relentless focus on leveraging web standards and reducing vendor-specific API surfaces.

Today, we’re adding another item to that toolbox by launching a beta version of the Netlify Cache API.

Why another caching primitive?

When building applications with Netlify Functions or Netlify Edge Functions, developers can leverage the full power of our caching layer by returning a response with caching headers. This ensures that functions don’t even run for requests that can be served from the cache, giving users a faster experience and making operations more cost-efficient.
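
For example, here is a minimal sketch of a modern Netlify Function (the Request/Response form) that opts into CDN caching with a response header. The path and cache directives are illustrative, not prescriptive:

// netlify/functions/greeting.ts (hypothetical example)
export default async (_req: Request) => {
  return new Response("Hello from the cache!", {
    headers: {
      // Ask Netlify's CDN to cache this response for an hour and serve it
      // stale for up to a minute while revalidating in the background.
      "Netlify-CDN-Cache-Control": "public, max-age=3600, stale-while-revalidate=60",
    },
  })
}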

When functions do have to run, they often make HTTP requests to third-party services or APIs. These HTTP calls introduce additional points of failure: the remote resources can be slow or unavailable, and they may be subject to rate limiting or quota enforcement. They can also generate significant costs.

These are the classic problems that are solved with caching. But how do you cache these individual network calls triggered by your functions when you can’t serve the entire response from the cache? This is now possible with the Netlify Cache API.

Built on web standards

Unlike the existing caching primitives, which live in the CDN and are consumed by end clients (like your browser), the Netlify Cache API is designed to be accessed programmatically from a JavaScript or TypeScript application.

The example below shows how you could write a function that accepts a URL and gets the corresponding response, either from the cache (if available) or from the network. When fetching from the network, it stores the result in the cache so it can be leveraged in subsequent invocations.

// Open (or create) a named cache
const cache = await caches.open("my-cache")

// Return the cached response for a URL if present; otherwise fetch it from
// the network and store successful responses for subsequent invocations.
async function getFromCacheOrFetch(url: string) {
  const cached = await cache.match(url)
  if (cached) {
    return cached
  }

  const res = await fetch(url)

  if (res.ok) {
    // Clone before caching so the caller can still read the original body
    await cache.put(url, res.clone())
  }

  return res
}

If you’re squinting to find the Netlify-specific APIs, let me tell you that there are none. The entire feature is built on top of the web-standard Cache and CacheStorage APIs, and it’s a perfect companion to the Fetch API.

By building our primitives on web standards, we make them predictable and easy to reason about. This means that if you learn the web platform, you’ll instinctively know how to use our platform.

To control how long a response will be cached for and how cache keys are generated from requests, you can use the same HTTP configuration options supported by our caching infrastructure, including on-demand revalidation.
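
For example, here is a minimal sketch of storing a response with explicit caching headers, assuming (as the paragraph above suggests) that headers like Netlify-Cache-Tag and Netlify-Vary on the stored response are honored the same way they are on a CDN response. The URL and values are purely illustrative:

const cache = await caches.open("my-cache")

// Hypothetical upstream resource; the headers below are Netlify's standard caching headers.
const url = "https://api.example.com/products?page=1"
const upstream = await fetch(url)

// Re-wrap the body with explicit caching headers before storing it.
const cacheable = new Response(upstream.body, {
  headers: {
    "Cache-Control": "public, max-age=3600", // keep the entry for one hour
    "Netlify-Cache-Tag": "products",         // allow on-demand revalidation by tag
    "Netlify-Vary": "query=page",            // include the `page` query param in the cache key
  },
})

await cache.put(url, cacheable)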

Batteries included

We’ve also created a new @netlify/cache package that gives you full control over the entire Netlify caching layer with maximum convenience.

This includes a fetchWithCache function that fetches a resource from the cache, falling back to the network and persisting the response for future invocations, with the option to configure the exact caching behavior.

import { fetchWithCache, HOUR } from "@netlify/cache"

// Serve from the cache when possible; otherwise fetch from the network and
// cache the response for 12 hours, tagged for later on-demand revalidation.
const response = await fetchWithCache("https://example.com", {
  ttl: 12 * HOUR,
  tags: ["product", "sale"]
})

We want to give developers the best of both worlds: the core API built on web standards lets you use the platform rather than getting locked into a specific vendor, and the @netlify/cache package unlocks the developer experience (DX) and agent experience (AX) that Netlify is known and loved for.

Works with your favorite framework

The Cache API works just fine without a framework, but it has been designed to support any web framework, like TanStack Start. The example below shows how you could use it to cache API calls in the Basic Start template by changing a few lines of code.

// src/routes/api/users.ts
import { fetchWithCache } from '@netlify/cache'
import { json } from '@tanstack/react-start'
import { createAPIFileRoute } from '@tanstack/react-start/api'
import type { User } from '../../utils/users'

export const APIRoute = createAPIFileRoute('/api/users')({
  GET: async ({ request }) => {
    console.info('Fetching users... @', request.url)
    // Before: const res = await fetch('https://jsonplaceholder.typicode.com/users')
    const res = await fetchWithCache('https://jsonplaceholder.typicode.com/users')

    if (!res.ok) {
      throw new Error('Failed to fetch users')
    }

    const data = (await res.json()) as Array<User>

    const list = data.slice(0, 10)

    return json(list.map((u) => ({ id: u.id, name: u.name, email: u.email })))
  },
})

Optionally, you could pass a cache settings object to configure how long to cache the response, set any cache tags, etc.
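
For instance, reusing the ttl and tags options from the earlier example (and importing HOUR alongside fetchWithCache), the call above could become something like this; the specific values are just illustrative:

import { fetchWithCache, HOUR } from '@netlify/cache'

// Inside the GET handler shown above:
const res = await fetchWithCache('https://jsonplaceholder.typicode.com/users', {
  ttl: 6 * HOUR,    // cache the upstream response for six hours
  tags: ['users'],  // tag it so it can be revalidated on demand
})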

The Cache API documentation page has a dedicated section with information about using the Cache API with your framework.

Demo time

To get a sense of the impact that the Cache API may have on your application, head over to our demo site. (Dress code: space suit.)

It’s also a good example of how easy it is to use the feature with a framework — in this case, Astro.

[Image: progress indicator from the Cache API demo site]
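
As a rough illustration (a hypothetical sketch, not code from the demo site), a server-rendered Astro endpoint could use fetchWithCache much like the TanStack Start route above:

// src/pages/api/products.ts (hypothetical; assumes server rendering, e.g. via the Netlify adapter)
import type { APIRoute } from 'astro'
import { fetchWithCache } from '@netlify/cache'

export const GET: APIRoute = async () => {
  // Serve from the Netlify cache when possible, falling back to the origin API.
  const res = await fetchWithCache('https://api.example.com/products')
  const data = await res.json()

  return new Response(JSON.stringify(data), {
    headers: { 'Content-Type': 'application/json' },
  })
}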

Available in beta today

The Netlify Cache API is available as a beta feature on all plans starting today. Refer to the documentation for the full API reference and more details about the primitive.
