[Regression in 14.1.0 - Windows] `stdout` is sometimes empty #33166
Comments
cc: @nodejs/streams @ronag |
Can you confirm this is happening only on Windows? This is wildly unexpected. |
I can't reproduce on OSX. @nicolo-ribaudo any chance you could bisect? I don't have a Windows machine at the moment. |
Yes, I can only reproduce this on Windows (not the V8 errors, only the empty `stdout`). |
@ronag I’m not sure… Stack traces on Windows are basically garbage unless you take some extra steps (that I don’t know how to take), and given that the error message isn’t particularly helpful here either – there are over 2000 instances of /cc @nodejs/v8 @nodejs/platform-windows |
Update: this might not be a Node.js bug. Today I made two local clones of the Babel repository: one ~6 hours ago, where I can consistently reproduce the errors, and one ~2 hours ago, where I cannot reproduce them. This makes me think that maybe it's a bug in a dependency that didn't support Node.js 14.1.0 on Windows and was recently updated to fix it. Even if there was a bug in a dependency, I don't think it should trigger the V8 internal error, but I cannot reproduce that one anyway 🤷 I'll try to re-run the build on Travis CI a bunch of times and see if I can reproduce it there. |
Can you diff the dependencies? |
Yeah, I'm trying. I'm on a Windows VM and I haven't used Windows for a while, so it might take some time 😛 Also, another thing I noticed: in the folder where I can (consistently!) reproduce the problem, I can only reproduce it when using Node.js 14.1.0. If I use a build from |
Yea, a bisect would be very useful. |
I gave up for today, I will continue tomorrow. |
Not sure if it's related, I got … Update: I've reduced the CI exit test case to some of my source files plus Babel + 2 Babel plugins. Update 2: With more debug logging I found the exit happens before the … |
Yeah, Jest is failing often on Node 14.1.0 on Windows with empty stdout in one test or another on almost every test run. The V8 thing is actually in the latest build on master: https://github.com/facebook/jest/runs/634830924 |
OK, I have no idea where to start bisecting. I can still consistently reproduce the failures with Node.js 14.1.0 installed via the installer, but not if I build 9ffd8e7. |
@nodejs/releasers can you check if there could be any differences between what is available via the installer and what was tagged? |
I think there's no way to be sure it's the promoted one, but I can confirm that the latest release build was done with commit 9ffd8e7 |
@jasnell I know you have a beefy windows machine, could you check this out and run a bisect?
@SimenB This is worrisome. Are you using any wasm inside Jest? |
I'm building the release commit on my machine (Win10, VS2019) |
@mcollina nope, the only experimental stuff we ship is the VM ESM APIs, but that code is guarded behind a check that … EDIT: That said, some dependency might? Seems unlikely, though - Node 14.0 behaved fine, and both macOS and Linux on 14.1 behave fine |
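For illustration only, a guard for the experimental vm ESM API is usually a feature-detection along these lines. This is a hypothetical sketch, not necessarily the check Jest actually uses:

```js
// Hypothetical sketch of guarding experimental VM ESM usage behind a check.
// vm.SourceTextModule is only exposed when Node.js is started with
// --experimental-vm-modules, so its absence routes to the stable code path.
const vm = require('vm');

if (typeof vm.SourceTextModule === 'function') {
  // Experimental ESM path: vm.SourceTextModule can be constructed here.
} else {
  // Stable fallback path (plain CommonJS evaluation via vm.Script, etc.).
}
```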
Well, we'd need a simpler and more reliable way to reproduce this. |
The V8 stacktrace you posted shows a failure within some code in the wasm namespace in V8. Hopefully @targos can shed some light on this. |
Commits between 14.0.0 and 14.1.0: v14.0.0...v14.1.0 |
I'm now trying to remove as much code as possible from my local failing clone of Babel to reduce the number of possible error causes. |
This seems not to be Babel-specific: https://github.com/pinojs/pino/runs/642073186. Again, this is consistently flaky :(. @targos did you run the tests in Linux WSL or using the normal bin? I have a hunch that GitHub CI is using WSL. |
I used the normal bin in PowerShell |
Would you mind trying out whether it works in WSL? |
@mcollina what exactly do you want me to try? |
The revert will not be backported to v13, right? Only applicable to v14? The only code I've got that uses it is on an unpushed branch on my own machine, so I ask more to manage my own expectations rather than for fear of breakage |
Did this only ever reproduce with builds directly from our Windows release build servers? |
Also locally: #33166 (comment) |
This reverts commit 74c393d. Fixes: #33166 PR-URL: #33364 Reviewed-By: Ruben Bridgewater <ruben@bridgewater.de> Reviewed-By: Benjamin Gruenbaum <benjamingr@gmail.com> Reviewed-By: Beth Griggs <Bethany.Griggs@uk.ibm.com> Reviewed-By: Colin Ihrig <cjihrig@gmail.com> Reviewed-By: James M Snell <jasnell@gmail.com> Reviewed-By: Michael Dawson <michael_dawson@ca.ibm.com>
It would be nice if someone who was able to reproduce this could test with 14.3.0. |
I cannot reproduce the problem with Node v14.3.0. It's fully solved for me. |
I broke my laptop so I cannot test it locally, but as soon as Travis supports Node.js 14.3.0 I can test it on our CI. |
Could #31860 be reopened? Or is the un-reverting tracked somewhere else? |
Can confirm, built 14.3.0 locally, the issue no longer reproduces. |
This still happens using node 14.4.0, fwiw. See https://github.com/facebook/jest/pull/10188/checks?check_run_id=802977045 |
Maybe @orangemocha can help here. |
Looping in @bzoz @joaocgreis @jaimecbernardo |
Sorry, I cannot reproduce with 14.4.0. |
notable change: - break: move `getPackageTgzName` to `@dr-js/node` as `toPackageTgzName` in `module/Software/npm` - break: promote `common/terminalColor`, `node/npm/path`, and `parsePackageNameAndVersion` from `node/npm/npxLazy` to `@dr-js/node` - break: use `getGitBranch|getGitCommitHash` from `@dr-js/node` - break: use `createTest` to init test in `common/test` - break: longer default test timeout - break: use `nodejs@14` instead of `nodejs@13` for CI test - break: use `dr-dev -E` instead of `cross-env` - break: update to `eslint-config-almost-standard-v14` - break: update to `eslint-config-almost-standard-jsx-v8` - break: use `getSourceJsFileListFromPathList` instead of `getScriptFileListFromPathList` from `node/filePreset` - break: use `collectSourceJsRouteMap` instead of `collectSourceRouteMap` from `node/export/parsePreset` - break: pass `pathAutoLicenseFile` instead of `pathLicenseFile` to `initOutput()` from `output` - add: `@dr-js/node` as dependency - add: `@dr-js/dev-eslint` config - add: `clearOutput` to `output` - add: `wrapTestScriptStringToHTML` to `puppeteer` - add: cleaner `isTest` script - add: verify rule - add: `source/node/preset` for keep preset path naming rule in one place - add: support `testUrl` for `testWithPuppeteer` - fix: default `pathInfoFilter` in `collectSourceJsRouteMap` - fix: code lint - fix: build script clean `README.md` file - bin: add: `exec` and `exec-load` mode - ci: mark `windows-latest + 14.x` as unstable. check: nodejs/help#2660 and nodejs/node#33166 - ci: test on all 3 platform - ci: update GitHub CI: https://github.blog/changelog/2020-04-15-github-actions-sets-the-ci-environment-variable-to-true/ - ci: update `INIT#.github#workflows#ci-test.yml` - script sort - package update
…tion"" This reverts commit 2d5d773. See: nodejs#32985 See: nodejs#33364 See: nodejs#33166 Fixes: nodejs#31860
notable change: - break: use `webpack@5` - break: use `@dr-js/*@0.4.0-dev.*` - ci: unmark `windows-latest + 14.x` as unstable. issue may be fixed, check: nodejs/help#2660 and nodejs/node#33166 - script sort - package update
…tion"" This reverts commit 2d5d773. See: nodejs#32985 See: nodejs#33364 See: nodejs#33166 Fixes: nodejs#31860
…tion"" This reverts commit 2d5d773. See: nodejs#32985 See: nodejs#33364 See: nodejs#33166 Fixes: nodejs#31860
It seems like this is fixed now. Please reopen if I'm mistaken. |
Bug description
After the Node.js 14.1.0 release, Babel's CI tests on Windows started failing. The failures are all related to `@babel/cli` and `@babel/node`: we run those CLI programs, capture their output (from `stdout`) and compare it with the expected output. Sometimes, the generated `stdout` is empty: you can see an example here or here. In this CI log you can also see some error messages coming from V8 internals in the `@babel/node` tests, but I don't know if it's the same problem.
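For reference, the test pattern described above looks roughly like the following. This is a minimal sketch, not Babel's actual test code; the CLI path and the expected output are hypothetical placeholders.

```js
// Minimal sketch of the pattern: run a CLI as a child process, capture its
// stdout, and compare it with the expected fixture output.
// `./fixtures/some-cli.js` and `expected` are hypothetical placeholders.
const { execFile } = require('child_process');

execFile(process.execPath, ['./fixtures/some-cli.js', '--some-option'], (error, stdout, stderr) => {
  if (error) throw error;
  const expected = 'expected output\n';
  if (stdout !== expected) {
    // With the regression described here, `stdout` is sometimes an empty
    // string even though the child process exited with code 0.
    console.error('stdout mismatch:', JSON.stringify(stdout));
    process.exitCode = 1;
  }
});
```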
How often does it reproduce? Is there a required condition?
My guess is that every test has about a 1% chance of failing. However, we have ~100 tests for those two packages, so something fails more often than not.
What steps will reproduce the bug?
I couldn't find a small and isolated reproduction example. I'll keep trying to create one, but here is what I have for now.
Also, building Babel on Windows is painful and I couldn't run the full test suite. However, I managed to reproduce the bug.
I'm running these commands using PowerShell, Node.js 14.1.0, and Yarn 1.22.
If you don't see the last command failing, try running it two or three times. The `--runInBand` option isn't necessary to reproduce the problem, but it disables Jest's workers, so it removes one possible cause.

What is the expected behavior?
Tests should pass
What do you see instead?
An example of output failing locally is this:
Additional information
- You can see here how we capture `stdout` for `@babel/cli` tests. For `@babel/node` it's the same.
- @JLHwung suggested that this might be related to stream: don't wait for close on legacy streams #33058 or stream: don't emit end after close #33076 (see the sketch after this list).
- The bug is not present when using Node.js 14.0.
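Not part of the original report, but a hedged sketch of the general failure mode under discussion: when `stdout` is a pipe (as it is under a test runner), writes are buffered and flushed asynchronously, so output that has not been flushed by the time the process tears down can be lost, leaving the parent with an empty `stdout`.

```js
// Hypothetical illustration, not the confirmed cause of this issue:
// when stdout is a pipe, write() may buffer the chunk and flush it later.
const payload = 'x'.repeat(1024 * 1024); // large enough that it may not flush synchronously

const flushed = process.stdout.write(payload, () => {
  // The callback fires once the chunk has actually been handed off,
  // so exiting from here cannot drop it.
  process.exit(0);
});

if (!flushed) {
  // The chunk is still buffered. Forcing an exit at this point
  // (e.g. process.exit() or a crash) can truncate or drop it, which would
  // look exactly like the empty stdout seen in these CI runs.
  console.error('stdout not flushed yet; waiting for the write callback');
}
```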