
TypeError: Invalid state: ReadableStream is locked error from fetch-mock router when combining AbortSignal.timeout and recursive retries #845

laurieboyes opened this issue Oct 3, 2024 · 9 comments

@laurieboyes

laurieboyes commented Oct 3, 2024

Hello old friend 👋 I'm struggling to say for sure whether this is a user error or whether I've stumbled upon an edge case.

I'm writing tests in Jest, mocking globally, on Node 18 with Node's native fetch.

Scenario:

  • First test is for retrying after AbortSignal.timeout expires (no errors when this test runs on its own)
  • The following test makes a second fetch, and this error is logged:
node:internal/webstreams/readablestream:323
       new ERR_INVALID_STATE.TypeError('ReadableStream is locked'));
       ^

TypeError: Invalid state: ReadableStream is locked
   at ReadableStream.cancel (node:internal/webstreams/readablestream:323:9)
   at AbortSignal.abort (/Users/Laurie.Boyes/dev/fetch-mock/packages/core/dist/cjs/Router.js:95:47)
   at AbortSignal.[nodejs.internal.kHybridDispatch] (node:internal/event_target:816:20)
   at AbortSignal.dispatchEvent (node:internal/event_target:751:26)
   at abortSignal (node:internal/abort_controller:374:10)
   at Timeout._onTimeout (node:internal/abort_controller:128:7)
   at listOnTimeout (node:internal/timers:573:17)
   at processTimers (node:internal/timers:514:7) {
 code: 'ERR_INVALID_STATE'
}

Please see the test case in this PR: #844. Let me know if there’s any other info I can provide.
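For anyone reading along without pulling the PR, the scenario is shaped roughly like this (a simplified sketch with made-up names and timings, not the actual test in #844):

```js
// Simplified sketch only – names and timings are illustrative,
// not the actual test from #844.
async function fetchWithRetry(url, attemptsLeft) {
  try {
    // Each attempt gets a fresh timeout signal.
    const res = await fetch(url, { signal: AbortSignal.timeout(50) });
    return await res.json();
  } catch (err) {
    if (attemptsLeft > 0) {
      return fetchWithRetry(url, attemptsLeft - 1);
    }
    throw err;
  }
}

// Test 1 exercises the retry path by letting AbortSignal.timeout expire.
// Test 2 just makes another (mocked) fetch – that's when the
// "ReadableStream is locked" error appears in the logs.
```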

I've been working around this issue by explicitly creating an AbortController, setting a timeout to fire its signal, and manually cancelling the timeout once the promise resolves.
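Roughly like this (a simplified sketch of the workaround, not my exact code):

```js
// Simplified sketch of the workaround: an explicit AbortController instead of
// AbortSignal.timeout, with the timer cleared once the promise settles so the
// abort never fires against a response body that is already being read.
async function fetchWithManualTimeout(url, ms) {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), ms);
  try {
    const res = await fetch(url, { signal: controller.signal });
    return await res.json();
  } finally {
    clearTimeout(timer);
  }
}
```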

@wheresrhys
Owner

Hello 👋

Just getting round to looking at this now.

Which version of Jest are you on? I'm not able to get the test to fail, but I do keep getting flaky abort-related tests in CI; they've been around for ages and I'm not too sure why.

@laurieboyes
Author

laurieboyes commented Oct 21, 2024

Hey, thanks for taking a look! Sorry, I should have said: I haven't managed to create a minimal case where this causes an actual test failure either; I just get the error logged: https://app.circleci.com/pipelines/github/wheresrhys/fetch-mock/1061/workflows/9adf68aa-c9f8-4964-81af-94c738765f31/jobs/11125?invite=true#step-102-1871_50

In my own test suite, for reasons I haven’t yet managed to establish, this error does seem to cause a failure.

I was hoping (and it's a reasonably confident hope) that if we can get to the bottom of where this error is logged from, it will fix the failing test in my own codebase.

@wheresrhys
Owner

Reading here https://developer.mozilla.org/en-US/docs/Web/API/ReadableStream/locked I don't understand much more

But I have a hunch that [your .json()](https://github.com/wheresrhys/fetch-mock/compare/main...laurieboyes:fetch-mock:main#diff-ee4acfc8f5f94532279e0cfad10e16ac93d2db84a6e9c9404510acc9b6cc452eR15) reads and locks the body stream, which then causes the error because the stream is already being read when the abort tries to cancel the body. I suspect the thing to do is to add an extra check for `locked === true` around here, but I feel I should read up a bit more on what the specs say about the behaviour. I think abort should probably just take control of the stream and cancel it no matter what else is going on, but I'm not sure.
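Something along these lines is what I have in mind (illustrative only, not the actual Router.js code):

```js
// Illustrative only – not the real fetch-mock internals. The idea is to skip
// cancelling the mock response body when something (e.g. a .json() call) has
// already locked the stream for reading.
signal.addEventListener('abort', () => {
  const body = response.body;
  if (body && !body.locked) {
    body.cancel();
  }
});
```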

@wheresrhys
Owner

Could you check out #847, run npm build, and then manually copy the files into your node_modules to see if the fix works? (I'd offer to publish a beta, but my build pipeline's handling of beta releases is a bit broken at the moment.)

@laurieboyes
Author

Hm, nope, I get the same error, this time from the call to callLog.response.body.getReader().

@RubenAWL

RubenAWL commented Nov 19, 2024

Could you check out #847, run npm build, and then manually copy the files into your node_modules to see if the fix works? (I'd offer to publish a beta, but my build pipeline's handling of beta releases is a bit broken at the moment.)

These changes actually do fix the issue for me. I'm using:
"vitest": "2.1.5",
"@fetch-mock/vitest": "0.2.4",
"@testing-library/dom": "10.4.0",
"@testing-library/jest-dom": "6.6.3",
"@testing-library/react": "16.0.1",

Could it be that this only occurs for tests which use an AbortController?

@wheresrhys
Owner

@RubenAWL What issue were you seeing exactly? I thought this discussion was only about cases where abort was used. Is there some other issue with readable streams that occurs in a different scenario? If there is, would you mind opening another ticket with a description (and ideally a failing test or other runnable example)?

@RubenAWL

RubenAWL commented Nov 19, 2024

@RubenAWL What issue were you seeing exactly? I thought this discussion was only about cases where abort was used. Is there some other issue with readable streams that occurs in a different scenario? If there is, would you mind opening another ticket with a description (and ideally a failing test or other runnable example)?

Thank you for your quick response. This case is also happening when the abort controller we use aborts a fetch.

Without the suggested fix, I encounter the following error in a lot of our test files. This error looks quite similar to the error mentioned above, only the backtrace is a bit different.

⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯ Unhandled Rejection ⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯
TypeError: Invalid state: ReadableStream is locked
 ❯ new NodeError node:internal/errors:405:5
 ❯ ReadableStream.cancel node:internal/webstreams/readablestream:331:9
 ❯ AbortSignal.abort node_modules/fetch-mock/dist/esm/Router.js:98:30
 ❯ AbortSignal.callTheUserObjectsOperation node_modules/jsdom/lib/jsdom/living/generated/EventListener.js:26:30
 ❯ innerInvokeEventListeners node_modules/jsdom/lib/jsdom/living/events/EventTarget-impl.js:350:25
 ❯ invokeEventListeners node_modules/jsdom/lib/jsdom/living/events/EventTarget-impl.js:286:3
 ❯ AbortSignalImpl._dispatch node_modules/jsdom/lib/jsdom/living/events/EventTarget-impl.js:233:9
 ❯ fireAnEvent node_modules/jsdom/lib/jsdom/living/helpers/events.js:18:36
 ❯ AbortSignalImpl._signalAbort node_modules/jsdom/lib/jsdom/living/aborting/AbortSignal-impl.js:65:5
 ❯ AbortControllerImpl.abort node_modules/jsdom/lib/jsdom/living/aborting/AbortController-impl.js:11:17

⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯
Serialized Error: { code: 'ERR_INVALID_STATE' }
This error originated in "src/x/y/z.spec.js" test file. It doesn't mean the error was thrown inside the file itself, but while it was running.
The latest test that might've caused the error is "Inserts copied values as a new .... (POST)". It might mean one of the following:
- The error was thrown, while Vitest was running this test.
- If the error occurred after the test had been completed, this was the last documented test before it was thrown. 

Our project uses React components which have a useAbortController hook. This hook always aborts when a component is unmounted; by then the fetch might have already finished, or it might still be in flight and need to be aborted.
We believe this issue comes up whenever the fetch has already succeeded (mocked in the unit tests), a body reader is already attached, and abort is then called due to unmount in the test cleanup. As this happens on every cleanup for tests that use fetch-mock, we have about 577 unhandled errors.
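The pattern is roughly this (a simplified sketch, not our exact hook):

```js
// Simplified sketch of the pattern – not our exact useAbortController hook.
import { useEffect, useRef } from 'react';

function useAbortController() {
  const controllerRef = useRef(null);
  if (controllerRef.current === null) {
    controllerRef.current = new AbortController();
  }
  useEffect(() => {
    const controller = controllerRef.current;
    // Abort on unmount. In unit tests this runs during cleanup, typically
    // after the mocked fetch has already resolved and its body been read.
    return () => controller.abort();
  }, []);
  return controllerRef.current;
}
```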

For context, we are migrating an existing application (incl. tests) from "fetch-mock": "11.1.5" with Jest to Vitest with @fetch-mock/vitest v0.2.6 (fetch-mock 12.2.0), and we did not have these issues in the tests before.

wheresrhys added the bug label Dec 2, 2024