Description
What follows is only suggestive: I didn't look far enough into it to be sure there's a real issue, but it doesn't look good. I also wasn't able to reproduce the issue in Chrome, so maybe V8 is smarter about this than SpiderMonkey.
Inspired by https://schiener.io/2024-05-29/react-query-leaks, I was curious whether I could get the console to hold onto a bunch of memory. I didn't see an issue with normal browsing, which makes sense because that post is mostly about closures holding onto large request bodies, and our request bodies are tiny.
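To make the failure mode concrete, here is a minimal illustration of the leak pattern from that post (not the console's code; all names here are made up): a long-lived cache entry holds a retry callback, and the callback's closure pins the whole request body.

```typescript
// Illustration only: a retained closure keeps a large request body
// alive. The cache entry holds a retry function, and that function
// closes over `body`, so `body` stays reachable as long as the cache
// entry does.
type CacheEntry = { retry: () => Promise<unknown> }
const cache: CacheEntry[] = []

function queueUpload(body: string) {
  // The arrow function captures `body`; the GC cannot free the string
  // until the cache entry itself becomes unreachable.
  cache.push({ retry: () => fetch('/upload', { method: 'POST', body }) })
}

queueUpload('0'.repeat(10_000_000)) // ~10 MB string now pinned by `cache`
```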
However, image upload sends the entire contents of an image in chunks, so that seemed like a good candidate. Sure enough, taking heap snapshots showed memory increasing and not getting GCed, even after cancelling the upload and navigating around for a bit. All the memory is in "strings", which makes sense because we read each chunk as a base64-encoded string.
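For reference, a `readBlobAsBase64` helper along these lines (a sketch under assumptions; the console's browser implementation likely uses `FileReader` instead of `Buffer`) allocates one fresh base64 string per chunk, which is exactly the kind of string that shows up in the snapshots:

```typescript
// Sketch of a readBlobAsBase64 helper (assumption: not necessarily
// the console's exact implementation). Each call allocates a new
// base64 string roughly 4/3 the size of the chunk it encodes.
async function readBlobAsBase64(blob: Blob): Promise<string> {
  const buf = await blob.arrayBuffer()
  // Buffer is Node-only; a browser build would base64-encode the
  // bytes with FileReader.readAsDataURL or btoa instead
  return Buffer.from(buf).toString('base64')
}
```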
console/app/forms/image-upload.tsx
Lines 390 to 415 in 13ce748
```tsx
const postChunk = async (i: number) => {
  const offset = i * CHUNK_SIZE_BYTES
  const end = Math.min(offset + CHUNK_SIZE_BYTES, imageFile.size)
  const base64EncodedData = await readBlobAsBase64(imageFile.slice(offset, end))
  // Disk space is all zeros by default, so we can skip any chunks that are
  // all zeros. It turns out this happens a lot.
  if (!isAllZeros(base64EncodedData)) {
    await uploadChunk
      .mutateAsync({
        path,
        body: { offset, base64EncodedData },
        // use both the abort signal for the whole upload and a per-request timeout
        signal: anySignal([
          AbortSignal.timeout(30000),
          abortController.current?.signal,
        ]),
      })
      .catch(() => {
        // this needs to throw a regular Error or pRetry gets mad
        throw Error(`Chunk ${i} (offset ${offset}) failed`)
      })
  }
  chunksProcessed++
  setUploadProgress(Math.round((100 * chunksProcessed) / nChunks))
}
```
Using Firefox Nightly, I was able to capture a profile showing a rather huge (likely overstated) amount of memory in use during image upload.

