96 changes: 73 additions & 23 deletions packages/next/src/server/app-render/app-render.tsx
@@ -178,7 +178,7 @@ import {
workUnitAsyncStorage,
type PrerenderStore,
} from './work-unit-async-storage.external'
import { consoleAsyncStorage } from './console-async-storage.external'
// import { consoleAsyncStorage } from './console-async-storage.external'
import { CacheSignal } from './cache-signal'
import { getTracedMetadata } from '../lib/trace/utils'
import { InvariantError } from '../../shared/lib/invariant-error'
@@ -2454,7 +2454,7 @@ async function renderToStream(
// We only have a Prerender environment for projects opted into cacheComponents
cacheComponents
) {
const [resolveValidation, validationOutlet] = createValidationOutlet()
// const [resolveValidation, validationOutlet] = createValidationOutlet()
let debugChannel: DebugChannelPair | undefined
const getPayload = async (
// eslint-disable-next-line @typescript-eslint/no-shadow
@@ -2472,7 +2472,7 @@ async function renderToStream(
// even if we end up discarding a render and restarting,
// because we're not going to wait for the stream to complete,
// so leaving the validation unresolved is fine.
payload._validation = validationOutlet
// payload._validation = validationOutlet

if (isBypassingCachesInDev(renderOpts, requestStore)) {
// Mark the RSC payload to indicate that caches were bypassed in dev.
@@ -2545,17 +2545,17 @@ async function renderToStream(
// TODO(restart-on-cache-miss):
// This can probably be optimized to do less work,
// because we've already made sure that we have warm caches.
consoleAsyncStorage.run(
{ dim: true },
spawnDynamicValidationInDev,
resolveValidation,
tree,
ctx,
res.statusCode === 404,
clientReferenceManifest,
requestStore,
devValidatingFallbackParams
)
// consoleAsyncStorage.run(
// { dim: true },
// spawnDynamicValidationInDev,
// resolveValidation,
// tree,
// ctx,
// res.statusCode === 404,
// clientReferenceManifest,
// requestStore,
// devValidatingFallbackParams
// )
} else {
// This is a dynamic render. We don't do dynamic tracking because we're not prerendering
const RSCPayload: RSCPayload & RSCPayloadDevProperties =
@@ -3044,7 +3044,7 @@ async function renderWithRestartOnCacheMissInDev(
// then we'll only use this render for filling caches.
// We won't advance the stage, and thus leave dynamic APIs hanging,
// because they won't be cached anyway, so it'd be wasted work.
if (maybeStream === null || cacheSignal.hasPendingReads()) {
if (true || maybeStream === null || cacheSignal.hasPendingReads()) {
The hardcoded true condition forces this branch to always execute, effectively bypassing the cache validation logic. This appears to be intentional for testing the new chunk capture implementation, but should be removed before merging to preserve the original cache-based conditional behavior. The surrounding code changes suggest this is part of a proof-of-concept for the streaming capture technique.

Suggested change
if (true || maybeStream === null || cacheSignal.hasPendingReads()) {
if (maybeStream === null || cacheSignal.hasPendingReads()) {

Spotted by Graphite Agent


Suggested change
if (true || maybeStream === null || cacheSignal.hasPendingReads()) {
if (maybeStream === null || cacheSignal.hasPendingReads()) {

A debug condition, true ||, has been left in the code, forcing the cache-miss path to be taken on every render regardless of the actual cache state.


Analysis

Debug condition forces cache miss path unconditionally in renderWithRestartOnCacheMissInDev()

What fails: The condition at line 3047 in packages/next/src/server/app-render/app-render.tsx contains if (true || ...) which unconditionally evaluates to true, causing the function to always discard render results and skip advancing to the Dynamic rendering stage, even when caches are properly warmed.

How to reproduce: Any render in development mode using the renderWithRestartOnCacheMissInDev() function path would be affected. The debug condition prevents normal cache-hit scenarios from proceeding to the Dynamic stage:

// Lines 3044-3050 in app-render.tsx - BEFORE FIX
if (true || maybeStream === null || cacheSignal.hasPendingReads()) {
  return null
}

// Should be:
if (maybeStream === null || cacheSignal.hasPendingReads()) {
  return null
}

Result: Every render is treated as a cache-miss scenario (returns null), preventing the stage controller from advancing to the Dynamic stage even when no cache misses occur.

Expected: According to the comments and logic flow, when there are no cache misses (both maybeStream !== null AND !cacheSignal.hasPendingReads()), the function should advance to the Dynamic stage and return the stream for use.

Evidence: Git history shows this debug code was added in commit c6a2889 ("One technique to capture static and runtime chunks while rendering as fast as possible"), where the condition was explicitly changed from if (maybeStream === null || cacheSignal.hasPendingReads()) to if (true || maybeStream === null || cacheSignal.hasPendingReads()).

return null
}

@@ -3108,6 +3108,9 @@ async function renderWithRestartOnCacheMissInDev(
// We're not using it, so we need to create a new one.
debugChannel = setReactDebugChannel && createDebugChannel()

const staticChunks: Array<Uint8Array> = []
const runtimeChunks: Array<Uint8Array> = []

const finalRscPayload = await getPayload(requestStore)
const finalServerStream = await workUnitAsyncStorage.run(requestStore, () =>
pipelineInSequentialTasks(
@@ -3124,19 +3127,66 @@
}
)
},
(stream) => {
// Runtime stage
finalStageController.advanceStage(RenderStage.Runtime)
return stream
async (stream) => {
const [continuationStream, staticStream] = stream.tee()
did you benchmark this

this would likely only be faster if you can move it one layer up, to access the chunks before they get pushed

otherwise this will likely create a large number of promises that make it slower
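
One concrete reading of the "one layer up" suggestion, as an illustrative sketch only (not part of this PR; tapChunks is a hypothetical helper): capture each chunk in a pass-through TransformStream as it flows, instead of tee()-ing the stream and draining a second reader.

// Illustrative sketch, not from the PR. Assumes the Web Streams globals
// (ReadableStream, TransformStream) available in the Node.js runtime Next.js uses.
function tapChunks(
  source: ReadableStream<Uint8Array>,
  sink: Array<Uint8Array>
): ReadableStream<Uint8Array> {
  // Pass-through tap: record a reference to each chunk in `sink`
  // while forwarding it unchanged to the continuation stream.
  return source.pipeThrough(
    new TransformStream<Uint8Array, Uint8Array>({
      transform(chunk, controller) {
        sink.push(chunk)
        controller.enqueue(chunk)
      },
    })
  )
}

// e.g. const continuationStream = tapChunks(stream, staticChunks)

This trades the tee() plus second reader loop for a single transform hop per chunk, but it does not by itself reproduce the nextTick/releaseLock timing that the tee() version below uses to advance the render stage, and whether it is actually faster would still need the benchmarking this comment asks about.
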

const reader = staticStream.getReader()
Promise.resolve().then(() => {
process.nextTick(() => {
// Runtime stage
reader.releaseLock()
finalStageController.advanceStage(RenderStage.Runtime)
})
})

try {
while (true) {
const { done, value } = await reader.read()
if (done) {
break
}
staticChunks.push(value)
}
} catch (e) {
// When we release the lock we may reject the read
}
return continuationStream
},
(stream) => {
// Dynamic stage
finalStageController.advanceStage(RenderStage.Dynamic)
return stream
async (stream) => {
// We make a very important but sublte assumption here
There's a typo in the comment: sublte should be subtle. This appears to be a critical comment explaining an important assumption about the timing behavior of promises and microtasks in the stream handling logic.

Suggested change
// We make a very important but sublte assumption here
// We make a very important but subtle assumption here

Spotted by Graphite Agent


// The task above returns a promise but it will still return in a microtask
// This is true because we schedule a releaseLock in a nextTick from
// the microtask queue which will cause the first non-microtasky read
// to reject in a microtask allowing the whole job to wrap up
const [continuationStream, runtimeStream] = (await stream).tee()
const reader = runtimeStream.getReader()
Promise.resolve().then(() => {
process.nextTick(() => {
// Dynamic stage
reader.releaseLock()
finalStageController.advanceStage(RenderStage.Dynamic)
})
})

try {
while (true) {
const { done, value } = await reader.read()
if (done) {
break
}
runtimeChunks.push(value)
}
} catch (e) {
// When we release the lock we may reject the read
}
return continuationStream
}
)
)

let dec = new TextDecoder()
console.log({ staticChunks: staticChunks.map((c) => dec.decode(c)) })
console.log({ runtimeChunks: runtimeChunks.map((c) => dec.decode(c)) })
Comment on lines +3187 to +3188
Suggested change
console.log({ staticChunks: staticChunks.map((c) => dec.decode(c)) })
console.log({ runtimeChunks: runtimeChunks.map((c) => dec.decode(c)) })
if (process.env.NEXT_PRIVATE_DEBUG_CACHE) {
console.log({ staticChunks: staticChunks.map((c) => dec.decode(c)) })
console.log({ runtimeChunks: runtimeChunks.map((c) => dec.decode(c)) })
}

Debug console.log statements are present in production code, logging decoded stream chunks on every render.


Analysis

Unguarded debug console.log statements in renderWithRestartOnCacheMissInDev()

What fails: Lines 3187-3188 in packages/next/src/server/app-render/app-render.tsx contain unguarded console.log() statements that log decoded stream chunks to the console on every development render.

How to reproduce:

// In renderWithRestartOnCacheMissInDev() after final render completes (line ~3186):
let dec = new TextDecoder()
console.log({ staticChunks: staticChunks.map((c) => dec.decode(c)) })
console.log({ runtimeChunks: runtimeChunks.map((c) => dec.decode(c)) })

When this code path executes during development (process.env.NODE_ENV === 'development' with cacheComponents enabled), these console.log statements execute unconditionally.

Result: Debug logging is output to the console on every development render. While this code is development-only (protected by runtime guards at the call site), the pattern in the same file shows that debug logs should be wrapped in process.env.NEXT_PRIVATE_DEBUG_CACHE checks (as seen at lines 1942, 2135).

Expected behavior: Debug console.log statements should follow the established pattern in the codebase and be wrapped in if (process.env.NEXT_PRIVATE_DEBUG_CACHE) guards, consistent with similar debug logging elsewhere in the same file.

Fix applied: Wrapped the console.log statements in a process.env.NEXT_PRIVATE_DEBUG_CACHE guard to match the pattern used for other debug logging in this file.


if (process.env.NODE_ENV === 'development' && setCacheStatus) {
setCacheStatus('filled', htmlRequestId, requestId)
}
25 changes: 24 additions & 1 deletion test/e2e/app-dir/hello-world/app/page.tsx
@@ -1,3 +1,26 @@
import { cookies } from 'next/headers'
import { Suspense } from 'react'

export default function Page() {
return <p>hello world</p>
return (
<main>
<div>this is static</div>
<Suspense fallback="loading...">
<Dynamic />
</Suspense>
<Suspense fallback="loading...">
<Runtime />
</Suspense>
</main>
)
}

async function Dynamic() {
await new Promise((resolve) => setTimeout(resolve, 1000))
return <p>hello dynamic</p>
}

async function Runtime() {
await cookies()
return <p>hello runtime</p>
}
7 changes: 6 additions & 1 deletion test/e2e/app-dir/hello-world/next.config.js
@@ -1,6 +1,11 @@
/**
* @type {import('next').NextConfig}
*/
const nextConfig = {}
const nextConfig = {
cacheComponents: true,
experimental: {
// reactDebugChannel: true,
},
}

module.exports = nextConfig