
Commit dc485c7

[Flight] Fix detached ArrayBuffer error when streaming typed arrays (#34849)
Using `renderToReadableStream` in Node.js with binary data from `fs.readFileSync` (or `Buffer.allocUnsafe`) could cause downstream consumers (like compression middleware) to fail with "Cannot perform Construct on a detached ArrayBuffer".

The issue occurs because Node.js uses an 8192-byte Buffer pool for small allocations (< 4KB). When React's `VIEW_SIZE` was 2KB, files between ~2KB and 4KB would be passed through as views of pooled buffers rather than copied into `currentView`. ByteStreams (`type: 'bytes'`) detach ArrayBuffers during transfer, which corrupts the shared Buffer pool and causes subsequent Buffer operations to fail.

Increasing `VIEW_SIZE` from 2KB to 4KB ensures that all chunks smaller than 4KB are copied into `currentView` (which uses a dedicated 4KB buffer outside the pool), while chunks of 4KB or larger don't use the pool anyway. Thus no pooled buffers are ever exposed to ByteStream detachment.

This adds 2KB of memory per active stream, copies chunks in the 2-4KB range instead of passing them as views (a small CPU cost), and buffers up to 2KB more data before flushing. However, it avoids duplicating large binary data (which copying everything would require, as the Edge entry point currently does in `typedArrayToBinaryChunk`).

Related issues:
- vercel/next.js#84753
- vercel/next.js#84858
1 parent c35f6a3 commit dc485c7
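
A minimal standalone sketch (not part of this commit) of the byte-stream behavior the message describes: enqueuing a view on a `type: 'bytes'` ReadableStream transfers, and thereby detaches, the view's backing ArrayBuffer. The plain ArrayBuffer below merely stands in for Node's shared Buffer pool, and the sketch assumes Node.js 18+ where `ReadableStream` is a global.

// Sketch only: a byte stream detaches the ArrayBuffer backing any enqueued view.
const poolStandIn = new ArrayBuffer(8192); // stands in for Node's 8192-byte Buffer pool
const pooledView = new Uint8Array(poolStandIn, 0, 3000); // stands in for a small pooled read

new ReadableStream({
  type: 'bytes',
  start(controller) {
    controller.enqueue(pooledView); // transfers (detaches) poolStandIn
    controller.close();
  },
});

console.log(poolStandIn.byteLength); // 0 — every other view of the "pool" is now unusable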

File tree

3 files changed (+47, -13 lines)


packages/react-server-dom-webpack/src/__tests__/ReactFlightDOMNode-test.js

Lines changed: 36 additions & 3 deletions
@@ -10,11 +10,11 @@

 'use strict';

+import fs from 'fs';
+import os from 'os';
+import path from 'path';
 import {patchSetImmediate} from '../../../../scripts/jest/patchSetImmediate';

-global.ReadableStream =
-  require('web-streams-polyfill/ponyfill/es6').ReadableStream;
-
 let clientExports;
 let webpackMap;
 let webpackModules;
@@ -1136,4 +1136,37 @@ describe('ReactFlightDOMNode', () => {
       'Switched to client rendering because the server rendering errored:\n\nssr-throw',
     );
   });
+
+  // This is a regression test for a specific issue where byte Web Streams are
+  // detaching ArrayBuffers, which caused downstream issues (e.g. "Cannot
+  // perform Construct on a detached ArrayBuffer") for chunks that are using
+  // Node's internal Buffer pool.
+  it('should not corrupt the Node.js Buffer pool by detaching ArrayBuffers when using Web Streams', async () => {
+    // Create a temp file smaller than 4KB to ensure it uses the Buffer pool.
+    const file = path.join(os.tmpdir(), 'test.bin');
+    fs.writeFileSync(file, Buffer.alloc(4095));
+    const fileChunk = fs.readFileSync(file);
+    fs.unlinkSync(file);
+
+    // Verify this chunk uses the Buffer pool (8192 bytes for files < 4KB).
+    expect(fileChunk.buffer.byteLength).toBe(8192);
+
+    const readable = await serverAct(() =>
+      ReactServerDOMServer.renderToReadableStream(fileChunk, webpackMap),
+    );
+
+    // Create a Web Streams WritableStream that tries to use Buffer operations.
+    const writable = new WritableStream({
+      write(chunk) {
+        // Only write one byte to ensure Node.js is not creating a new Buffer
+        // pool. Typically, library code (e.g. a compression middleware) would
+        // call Buffer.from(chunk) or similar, instead of allocating a new
+        // Buffer directly. With that, the test file could only be ~2600 bytes.
+        Buffer.allocUnsafe(1);
+      },
+    });
+
+    // Must not throw an error.
+    await readable.pipeTo(writable);
+  });
 });
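
For comparison with the bare `Buffer.allocUnsafe(1)` in the test above, a hypothetical downstream consumer (names are illustrative, not part of the commit) would more typically copy each chunk with `Buffer.from`, which for small sizes also allocates from the shared pool and is the kind of call that failed once the pool's ArrayBuffer had been detached.

// Hypothetical consumer, roughly what a compression middleware does per chunk.
async function consumeFlightStream(readable) {
  const reader = readable.getReader();
  while (true) {
    const {done, value} = await reader.read();
    if (done) return;
    // For small chunks, Buffer.from copies into a pooled allocation; with a
    // detached pool this construction is what used to throw.
    const buf = Buffer.from(value);
    process.stdout.write(buf);
  }
}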

packages/react-server/src/ReactServerStreamConfigEdge.js

Lines changed: 6 additions & 9 deletions
@@ -37,7 +37,11 @@ export function flushBuffered(destination: Destination) {
   // transform streams. https://github.com/whatwg/streams/issues/960
 }

-const VIEW_SIZE = 2048;
+// Chunks larger than VIEW_SIZE are written directly, without copying into the
+// internal view buffer. This must be at least half of Node's internal Buffer
+// pool size (8192) to avoid corrupting the pool when using
+// renderToReadableStream, which uses a byte stream that detaches ArrayBuffers.
+const VIEW_SIZE = 4096;
 let currentView = null;
 let writtenBytes = 0;

@@ -147,14 +151,7 @@ export function typedArrayToBinaryChunk(
   // If we passed through this straight to enqueue we wouldn't have to convert it but since
   // we need to copy the buffer in that case, we need to convert it to copy it.
   // When we copy it into another array using set() it needs to be a Uint8Array.
-  const buffer = new Uint8Array(
-    content.buffer,
-    content.byteOffset,
-    content.byteLength,
-  );
-  // We clone large chunks so that we can transfer them when we write them.
-  // Others get copied into the target buffer.
-  return content.byteLength > VIEW_SIZE ? buffer.slice() : buffer;
+  return new Uint8Array(content.buffer, content.byteOffset, content.byteLength);
 }

 export function byteLengthOfChunk(chunk: Chunk | PrecomputedChunk): number {
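
The removed branch turned on the difference between a view (which shares the backing ArrayBuffer) and `slice()` (which copies into a fresh one). A small standalone illustration of that difference, not React code:

// Standalone illustration of view vs. copy semantics.
const backing = new Uint8Array([1, 2, 3, 4]).buffer;

// A view shares the backing ArrayBuffer: no bytes are copied, and detaching
// `backing` would invalidate it.
const view = new Uint8Array(backing, 1, 2);

// slice() copies into a fresh ArrayBuffer that can be transferred/detached
// independently of `backing`.
const copy = view.slice();

console.log(view.buffer === backing); // true
console.log(copy.buffer === backing); // false

new Uint8Array(backing)[1] = 42;
console.log(view[0]); // 42 — the view observes the mutation
console.log(copy[0]); // 2  — the copy does not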

packages/react-server/src/ReactServerStreamConfigNode.js

Lines changed: 5 additions & 1 deletion
@@ -38,7 +38,11 @@ export function flushBuffered(destination: Destination) {
   }
 }

-const VIEW_SIZE = 2048;
+// Chunks larger than VIEW_SIZE are written directly, without copying into the
+// internal view buffer. This must be at least half of Node's internal Buffer
+// pool size (8192) to avoid corrupting the pool when using
+// renderToReadableStream, which uses a byte stream that detaches ArrayBuffers.
+const VIEW_SIZE = 4096;
 let currentView = null;
 let writtenBytes = 0;
 let destinationHasCapacity = true;
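
A simplified sketch of the buffering strategy this comment describes; the actual `writeChunk`/`currentView` machinery in React differs in its details (Flow types, completed-chunk queues, backpressure handling), and `destination` here is just anything with a `write(Uint8Array)` method.

// Simplified sketch, not the React implementation.
const VIEW_SIZE = 4096;
let currentView = null; // dedicated buffer, allocated outside the shared Buffer pool
let writtenBytes = 0;

function writeChunk(destination, chunk) {
  if (chunk.byteLength > VIEW_SIZE) {
    // Chunks this large never come from the 8192-byte pool, so they can be
    // handed to the stream (and detached by it) directly. Flush the pending
    // view first to preserve byte order.
    if (writtenBytes > 0) {
      destination.write(currentView.subarray(0, writtenBytes));
      currentView = null;
      writtenBytes = 0;
    }
    destination.write(chunk);
    return;
  }
  // Smaller chunks may be views of the pool, so they are copied into the
  // dedicated view and the pool itself is never exposed to the stream.
  if (currentView === null) {
    currentView = new Uint8Array(VIEW_SIZE);
  }
  if (writtenBytes + chunk.byteLength > VIEW_SIZE) {
    destination.write(currentView.subarray(0, writtenBytes));
    currentView = new Uint8Array(VIEW_SIZE);
    writtenBytes = 0;
  }
  currentView.set(chunk, writtenBytes);
  writtenBytes += chunk.byteLength;
}

function completeWriting(destination) {
  // Flush whatever is left in the view at the end of the stream.
  if (currentView !== null && writtenBytes > 0) {
    destination.write(currentView.subarray(0, writtenBytes));
  }
  currentView = null;
  writtenBytes = 0;
}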
