How to cache results from API calls when building SSG pages? Traditional in-memory memoization doesn't seem to work #13765
-
I'm fetching from the Airtable API 3 times per page, and I'm building around ~30 pages, which generates about 100 API calls when building the site on Vercel.
I tried implementing a basic in-memory cache (memoization), and while it does work locally (unit tested), it doesn't work when the site is being built. Here is the cache implementation: it relies on treating the file […]. But it obviously doesn't work, either on Vercel or locally when running […]. Am I missing something? I believe pages are built in a special way that resets the cache between each page (as logged above, the cache seems to be filled with […]).
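For reference, a typical in-memory memoization of the kind described looks like the sketch below (`cachedFetch` and its arguments are hypothetical stand-ins, not the poster's actual code). It works within a single process, which is why it passes unit tests, but Next.js can build pages in separate worker processes, each with its own fresh module state:

```js
// In-memory memoization: correct within one process, but every build worker
// starts with an empty Map, so results are not shared across pages at build time.
const cache = new Map();

async function cachedFetch(key, fetcher) {
  if (cache.has(key)) return cache.get(key);
  const value = await fetcher(); // in real use: a network call, e.g. to Airtable
  cache.set(key, value);
  return value;
}
```

Anything that must survive across build workers has to live outside process memory, e.g. on disk, which is what the reply below suggests.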
Replies: 2 comments 6 replies
-
We may need an example to show how to do this 🤔

You could create a build-time-only file that stores the data and read from that file instead of repeating the request; we do something like that for our docs. In our docs we fetch the latest tag of the repo to get the manifest of routes, and because that can happen once per documentation page, we cache the result in a file (build time only) to avoid being rate limited:

```js
import path from 'path';

import { readFile, writeFile } from '../fs-utils';

import { GITHUB_API_URL, REPO_NAME } from './constants';

const USE_CACHE = process.env.USE_CACHE === 'true';
const TAG_CACHE_PATH = path.resolve('.github-latest-tag');

export async function getLatestTag() {
  let cachedTag;

  if (USE_CACHE) {
    try {
      cachedTag = await readFile(TAG_CACHE_PATH, 'utf8');
    } catch (error) {
      // A cached file is not required
    }
  }

  if (!cachedTag) {
    const res = await fetch(`${GITHUB_API_URL}/repos/${REPO_NAME}/releases/latest`);

    if (res.ok) {
      const data = await res.json();
      const tag = data.tag_name;

      if (USE_CACHE) {
        try {
          await writeFile(TAG_CACHE_PATH, tag, 'utf8');
        } catch (error) {
          // A cached file is not required
        }
      }

      cachedTag = tag;
    }
  }

  return cachedTag;
}
```

(You can also check the above file in our repo.)

We then set `USE_CACHE` for production builds only, in `next.config.js`:

```js
module.exports = {
  webpack: (config, { dev, isServer }) => {
    if (!dev && isServer) {
      // we're in build mode so enable shared caching for the GitHub API
      process.env.USE_CACHE = 'true';
    }

    return config;
  },
};
```

(You can also check the above file in our repo.)

And include the generated cache file ([…]).
-
The Next.js docs now describe a solution to this problem using React's […].

So does Next.js presently recommend using a React Canary release and […]?
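For context, React's `cache` (importable as `import { cache } from 'react'` in Server Components on Canary/React 19) deduplicates calls with the same arguments within a single render pass. This is not React's implementation, just a simplified conceptual model of that behavior:

```js
// Conceptual model of React's cache(): repeated calls with the same argument
// return the same promise, so the underlying fetch runs only once per pass.
function cacheLike(fn) {
  const results = new Map();
  return (arg) => {
    if (!results.has(arg)) results.set(arg, fn(arg));
    return results.get(arg);
  };
}

// Hypothetical fetcher; in a real app this would be a network request.
const getData = cacheLike(async (id) => {
  return { id };
});
```

Note that this deduplicates within one process; it does not by itself solve sharing across separate build workers the way the file cache above does.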