Implement persistent data cache option and update cache key logic #920

Open · wants to merge 8 commits into main
7 changes: 7 additions & 0 deletions .changeset/slow-walls-allow.md
@@ -0,0 +1,7 @@
---
"@opennextjs/aws": minor
---

Add an option to keep the data cache persistent between deployments.

BREAKING CHANGE: Incremental cache keys are now an object of type `CacheKey` instead of a string. The new type includes properties such as `baseKey`, `buildId`, and `cacheType`. The `buildId` is populated automatically based on the cache type and the `dangerous.persistentDataCache` option; it is up to each incremental cache implementation to use it as it sees fit.
Contributor

Explicitly say that the overrides should be updated?

(Adding this text to the PR description)
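
As a rough illustration of the described breaking change (not part of this diff): an incremental cache override that used to receive a plain string key now receives an object and can decide how to fold `buildId` and `cacheType` into its storage key. The names below only mirror the changeset; the real `CacheKey` type lives in `types/overrides` and may differ in detail.

```ts
// Sketch only, based on the changeset description; not the actual definition.
type CacheEntryKind = "cache" | "fetch" | "composable";

interface CacheKeySketch<T extends CacheEntryKind = CacheEntryKind> {
  baseKey: string; // the key that was previously passed as a bare string
  buildId?: string; // filled in (or left out) per cache type and dangerous.persistentDataCache
  cacheType: T; // which cache the entry belongs to
}

// One way an override could map the object onto a storage path. Leaving the
// buildId out of the path is what lets the data cache survive deployments.
function toStoragePath(key: CacheKeySketch): string {
  return [key.buildId, key.cacheType, key.baseKey].filter(Boolean).join("/");
}
```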

151 changes: 75 additions & 76 deletions packages/open-next/src/adapters/cache.ts
@@ -3,7 +3,13 @@ import type {
IncrementalCacheContext,
IncrementalCacheValue,
} from "types/cache";
import { getTagsFromValue, hasBeenRevalidated, writeTags } from "utils/cache";
import type { CacheKey } from "types/overrides";
import {
createCacheKey,
getTagsFromValue,
hasBeenRevalidated,
writeTags,
} from "utils/cache";
import { isBinaryContentType } from "../utils/binary";
import { debug, error, warn } from "./logger";

@@ -31,7 +37,7 @@ function isFetchCache(
// We need to use globalThis client here as this class can be defined at load time in next 12 but client is not available at load time
export default class Cache {
public async get(
key: string,
baseKey: string,
// fetchCache is for next 13.5 and above, kindHint is for next 14 and above and boolean is for earlier versions
options?:
| boolean
@@ -50,21 +56,22 @@ export default class Cache {
const softTags = typeof options === "object" ? options.softTags : [];
const tags = typeof options === "object" ? options.tags : [];
return isFetchCache(options)
? this.getFetchCache(key, softTags, tags)
: this.getIncrementalCache(key);
? this.getFetchCache(baseKey, softTags, tags)
: this.getIncrementalCache(baseKey);
}

async getFetchCache(key: string, softTags?: string[], tags?: string[]) {
debug("get fetch cache", { key, softTags, tags });
async getFetchCache(baseKey: string, softTags?: string[], tags?: string[]) {
debug("get fetch cache", { baseKey, softTags, tags });
try {
const cachedEntry = await globalThis.incrementalCache.get(key, "fetch");
const key = createCacheKey({ key: baseKey, type: "fetch" });
const cachedEntry = await globalThis.incrementalCache.get(key);

if (cachedEntry?.value === undefined) return null;

const _tags = [...(tags ?? []), ...(softTags ?? [])];
const _lastModified = cachedEntry.lastModified ?? Date.now();
const _hasBeenRevalidated = await hasBeenRevalidated(
Contributor

Maybe we should add a comment explaining why the base key is used (inline + on the method)?

Contributor

Re-opening.
Could we expand on why this is not using the obj key?

key,
baseKey,
_tags,
cachedEntry,
);
@@ -105,9 +112,15 @@ export default class Cache {
}
}

async getIncrementalCache(key: string): Promise<CacheHandlerValue | null> {
async getIncrementalCache(
baseKey: string,
): Promise<CacheHandlerValue | null> {
try {
const cachedEntry = await globalThis.incrementalCache.get(key, "cache");
const key = createCacheKey({
key: baseKey,
type: "cache",
});
const cachedEntry = await globalThis.incrementalCache.get(key);

if (!cachedEntry?.value) {
return null;
@@ -119,7 +132,7 @@ export default class Cache {
const tags = getTagsFromValue(cacheData);
const _lastModified = cachedEntry.lastModified ?? Date.now();
const _hasBeenRevalidated = await hasBeenRevalidated(
key,
baseKey,
tags,
cachedEntry,
);
@@ -191,118 +204,104 @@ export default class Cache {
}

async set(
key: string,
baseKey: string,
data?: IncrementalCacheValue,
ctx?: IncrementalCacheContext,
): Promise<void> {
if (globalThis.openNextConfig.dangerous?.disableIncrementalCache) {
return;
}
const key = createCacheKey({
key: baseKey,
type: data?.kind === "FETCH" ? "fetch" : "cache",
});
debug("Setting cache", { key, data, ctx });
// This one might not even be necessary anymore
// Better be safe than sorry
const detachedPromise = globalThis.__openNextAls
.getStore()
?.pendingPromiseRunner.withResolvers<void>();
try {
if (data === null || data === undefined) {
await globalThis.incrementalCache.delete(key);
// only case where we delete the cache is for ISR/SSG cache
await globalThis.incrementalCache.delete(key as CacheKey<"cache">);
} else {
const revalidate = this.extractRevalidateForSet(ctx);
switch (data.kind) {
case "ROUTE":
case "APP_ROUTE": {
const { body, status, headers } = data;
await globalThis.incrementalCache.set(
key,
{
type: "route",
body: body.toString(
isBinaryContentType(String(headers["content-type"]))
? "base64"
: "utf8",
),
meta: {
status,
headers,
},
revalidate,
await globalThis.incrementalCache.set(key, {
type: "route",
body: body.toString(
isBinaryContentType(String(headers["content-type"]))
? "base64"
: "utf8",
),
meta: {
status,
headers,
},
"cache",
);
revalidate,
});
break;
}
case "PAGE":
case "PAGES": {
const { html, pageData, status, headers } = data;
const isAppPath = typeof pageData === "string";
if (isAppPath) {
await globalThis.incrementalCache.set(
key,
{
type: "app",
html,
rsc: pageData,
meta: {
status,
headers,
},
revalidate,
},
"cache",
);
} else {
await globalThis.incrementalCache.set(
key,
{
type: "page",
html,
json: pageData,
revalidate,
},
"cache",
);
}
break;
}
case "APP_PAGE": {
const { html, rscData, headers, status } = data;
await globalThis.incrementalCache.set(
key,
{
await globalThis.incrementalCache.set(key, {
Contributor

Looks like this is a formatting-only diff?

type: "app",
html,
rsc: rscData.toString("utf8"),
rsc: pageData,
meta: {
status,
headers,
},
revalidate,
});
} else {
await globalThis.incrementalCache.set(key, {
type: "page",
html,
json: pageData,
revalidate,
});
}
break;
}
case "APP_PAGE": {
const { html, rscData, headers, status } = data;
await globalThis.incrementalCache.set(key, {
type: "app",
html,
rsc: rscData.toString("utf8"),
meta: {
status,
headers,
},
"cache",
);
revalidate,
});
break;
}
case "FETCH":
await globalThis.incrementalCache.set(key, data, "fetch");
await globalThis.incrementalCache.set(key, data);
break;
case "REDIRECT":
await globalThis.incrementalCache.set(
key,
{
type: "redirect",
props: data.props,
revalidate,
},
"cache",
);
await globalThis.incrementalCache.set(key, {
type: "redirect",
props: data.props,
revalidate,
});
break;
case "IMAGE":
// Not implemented
break;
}
}

await this.updateTagsOnSet(key, data, ctx);
await this.updateTagsOnSet(baseKey, data, ctx);
debug("Finished setting cache");
} catch (e) {
error("Failed to set cache", e);
69 changes: 47 additions & 22 deletions packages/open-next/src/adapters/composable-cache.ts
@@ -1,22 +1,48 @@
import type { ComposableCacheEntry, ComposableCacheHandler } from "types/cache";
import type { CacheKey } from "types/overrides";
import { writeTags } from "utils/cache";
import { fromReadableStream, toReadableStream } from "utils/stream";
import { debug } from "./logger";
import { debug, warn } from "./logger";

const pendingWritePromiseMap = new Map<string, Promise<ComposableCacheEntry>>();
/**
* Get the cache key for a composable entry.
* Composable cache keys are a special case: they are a stringified tuple composed of a representation of the BUILD_ID and the actual key.
* @param key The stringified composable cache key
* @returns The parsed composable cache key.
*/
function getComposableCacheKey(key: string): CacheKey<"composable"> {
Contributor

Could this be merged into createCacheKey?

Contributor Author

Yeah probably

try {
const shouldPrependBuildId =
globalThis.openNextConfig.dangerous?.persistentDataCache !== true;
const [_buildId, ...rest] = JSON.parse(key);
return {
cacheType: "composable",
buildId: shouldPrependBuildId ? _buildId : undefined,
baseKey: JSON.stringify(rest),
} as CacheKey<"composable">;
} catch (e) {
warn("Error while parsing composable cache key", e);
// If we fail to parse the key, we just return it as is
// This is not ideal, but we don't want to crash the application
return {
cacheType: "composable",
buildId: process.env.NEXT_BUILD_ID ?? "undefined-build-id",
baseKey: key,
};
}
}

export default {
async get(cacheKey: string) {
async get(key: string) {
try {
const cacheKey = getComposableCacheKey(key);
Contributor @vicb (Jul 4, 2025)

Would it make sense to move that to l45 and use key on l42 and l43?

edit:
Oh actually I think that's what we want as they might differ.
Should we add a comment warning that key should not be used?

edit2:
But I guess I have the same question as for the tag cache: why wouldn't we use the build id?

Contributor Author

Here it doesn't really matter, it's in memory anyway.
For the tag cache, technically we don't need the build_id as it is time-based.
There is actually one case where it is useful to have the build_id: more complex deployments (like blue/green).
I think what makes sense is to just pass the CacheKey to the tag cache as well and let the implementation decide.
For the default one, I think we should just follow the persistentDataCache option.
WDYT?

Contributor

SGTM

// We first check if we have a pending write for this cache key
// If we do, we return the pending promise instead of fetching the cache
if (pendingWritePromiseMap.has(cacheKey)) {
return pendingWritePromiseMap.get(cacheKey);
if (pendingWritePromiseMap.has(cacheKey.baseKey)) {
return pendingWritePromiseMap.get(cacheKey.baseKey);
}
const result = await globalThis.incrementalCache.get(
cacheKey,
"composable",
);
const result = await globalThis.incrementalCache.get(cacheKey);
if (!result?.value?.value) {
return undefined;
}
@@ -39,7 +65,7 @@ export default {
) {
const hasBeenRevalidated =
(await globalThis.tagCache.getLastModified(
cacheKey,
cacheKey.baseKey,
result.lastModified,
)) === -1;
if (hasBeenRevalidated) return undefined;
@@ -55,25 +81,24 @@ export default {
}
},

async set(cacheKey: string, pendingEntry: Promise<ComposableCacheEntry>) {
pendingWritePromiseMap.set(cacheKey, pendingEntry);
async set(key: string, pendingEntry: Promise<ComposableCacheEntry>) {
const cacheKey = getComposableCacheKey(key);
pendingWritePromiseMap.set(cacheKey.baseKey, pendingEntry);
const entry = await pendingEntry.finally(() => {
pendingWritePromiseMap.delete(cacheKey);
pendingWritePromiseMap.delete(cacheKey.baseKey);
});
const valueToStore = await fromReadableStream(entry.value);
await globalThis.incrementalCache.set(
cacheKey,
{
...entry,
value: valueToStore,
},
"composable",
);
await globalThis.incrementalCache.set(cacheKey, {
...entry,
value: valueToStore,
});
if (globalThis.tagCache.mode === "original") {
const storedTags = await globalThis.tagCache.getByPath(cacheKey);
const storedTags = await globalThis.tagCache.getByPath(cacheKey.baseKey);
const tagsToWrite = entry.tags.filter((tag) => !storedTags.includes(tag));
if (tagsToWrite.length > 0) {
await writeTags(tagsToWrite.map((tag) => ({ tag, path: cacheKey })));
await writeTags(
tagsToWrite.map((tag) => ({ tag, path: cacheKey.baseKey })),
);
}
}
},
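
A small usage illustration of the parsing above (not part of the diff). It assumes `getComposableCacheKey` from this file is in scope, and the raw key shape is a guess based on the `JSON.parse` of a tuple.

```ts
// Illustration only: the exact shape Next.js uses for composable cache keys is
// assumed here, based on the JSON.parse of a tuple in getComposableCacheKey.
const rawKey = JSON.stringify(["build-abc123", "/app/page", "cached-fn"]);

const cacheKey = getComposableCacheKey(rawKey);
// With persistentDataCache left off (the default):
//   { cacheType: "composable", buildId: "build-abc123", baseKey: '["/app/page","cached-fn"]' }
// With dangerous.persistentDataCache === true, buildId is undefined and the
// entry can be shared across deployments.
```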
11 changes: 9 additions & 2 deletions packages/open-next/src/core/routing/cacheInterceptor.ts
@@ -5,7 +5,11 @@ import type { InternalEvent, InternalResult } from "types/open-next";
import type { CacheValue } from "types/overrides";
import { emptyReadableStream, toReadableStream } from "utils/stream";

import { getTagsFromValue, hasBeenRevalidated } from "utils/cache";
import {
createCacheKey,
getTagsFromValue,
hasBeenRevalidated,
} from "utils/cache";
import { debug } from "../../adapters/logger";
import { localizePath } from "./i18n";
import { generateMessageGroupId } from "./queue";
@@ -208,7 +212,10 @@ export async function cacheInterceptor(
if (isISR) {
try {
const cachedData = await globalThis.incrementalCache.get(
localizedPath ?? "/index",
createCacheKey({
key: localizedPath ?? "/index",
type: "cache",
}),
);
debug("cached data in interceptor", cachedData);

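For orientation, a hedged sketch of what a `createCacheKey` helper along these lines could do. The real implementation lives in `utils/cache` and is not shown in this PR; the `buildId` rule below is an assumption based on the changeset and the composable-cache logic.

```ts
// Hypothetical sketch only; the actual createCacheKey lives in utils/cache and
// is not part of this diff. The buildId rule below is an assumption.
type SketchCacheType = "cache" | "fetch";

interface SketchCacheKey<T extends SketchCacheType = SketchCacheType> {
  baseKey: string;
  buildId?: string;
  cacheType: T;
}

function createCacheKeySketch<T extends SketchCacheType>(args: {
  key: string;
  type: T;
}): SketchCacheKey<T> {
  const persistent =
    (globalThis as any).openNextConfig?.dangerous?.persistentDataCache === true;
  return {
    baseKey: args.key,
    cacheType: args.type,
    // Assumed rule: the data ("fetch") cache drops the build id when
    // persistentDataCache is enabled, so entries survive deployments; ISR/SSG
    // ("cache") entries keep it so a new build starts from a clean slate.
    buildId:
      args.type === "fetch" && persistent
        ? undefined
        : process.env.NEXT_BUILD_ID,
  };
}
```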