child-process: data loss with piped stdout #7184

Closed
petrosagg opened this issue Jun 7, 2016 · 10 comments
Labels
  • child_process: Issues and PRs related to the child_process subsystem.
  • duplicate: Issues and PRs that are duplicates of other issues or PRs.
  • stream: Issues and PRs related to the stream subsystem.

Comments

@petrosagg

  • Version: v6.2.1
  • Platform: Linux rachmaninoff 4.5.3-1-ARCH #1 SMP PREEMPT Sat May 7 20:43:57 CEST 2016 x86_64 GNU/Linux
  • Subsystem: child-process

When the stdout of a spawned process is piped into a writable stream that doesn't read fast enough and the spawned process exits, node puts the stdout stream into flowing mode and data is lost.

I have the following minimal testcase. It pipes a simple child process into a PassThrough stream, and only after the process has exited is the PassThrough piped to node's stdout.

  • Expected output: All numbers from 1 to 36000
  • Actual output: On my machine it stops at around 32000

const spawn = require('child_process').spawn;
const stream = require('stream');

// PassThrough stream that buffers whatever is piped into it.
const through = new stream.PassThrough();

// Child process that prints the numbers 1..36000 and exits quickly.
const p = spawn('seq', [ '36000' ]);

p.on('exit', function(code) {
    setImmediate(function() {
        // Only start draining the PassThrough after the child has exited.
        through.pipe(process.stdout);
    });
});

p.stdout.pipe(through);
@addaleax added the duplicate and stream labels Jun 7, 2016
petrosagg added a commit to balena-io-experimental/node that referenced this issue Jun 7, 2016
This commit prevents child process stdio streams from being
automatically flushed on child process exit/close if a piped consumer
has been attached at the time of exit.

Without this, child process stdio data can be lost if the process
exits quickly and the piped consumer hasn't had the chance to read the
remaining data.

Fixes: nodejs#7184
Signed-off-by: Petros Angelatos <petrosagg@gmail.com>
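
A rough sketch of the approach described in this commit message (illustrative only, not the actual patch; it assumes a flushStdio-style helper like the one in lib/internal/child_process.js and peeks at the internal _readableState, whose fields vary between node versions):

function flushStdio(subprocess) {
  const stdio = subprocess.stdio;
  if (stdio == null) return;

  for (let i = 0; i < stdio.length; i++) {
    const stream = stdio[i];
    if (!stream || !stream.readable)
      continue;
    // Assumption: _readableState.pipesCount counts attached pipe
    // destinations. If a piped consumer is attached, skip the automatic
    // resume() so the consumer can read the remaining data at its own pace.
    if (stream._readableState.pipesCount > 0)
      continue;
    stream.resume();
  }
}
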
@evanlucas
Contributor

Can you try using the close event instead of exit?
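
For reference, a minimal sketch of that suggestion applied to the testcase above (only the event registration changes; it is not claimed to fix the issue here):

const spawn = require('child_process').spawn;
const stream = require('stream');

const through = new stream.PassThrough();
const p = spawn('seq', [ '36000' ]);

// 'close' fires after the process has exited and its stdio streams
// have closed, unlike 'exit'.
p.on('close', function(code) {
    setImmediate(function() {
        through.pipe(process.stdout);
    });
});

p.stdout.pipe(through);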

@addaleax
Member

addaleax commented Jun 7, 2016

Sounds a lot like a duplicate of #7159, which has a fix pending in #7160

@petrosagg
Author

petrosagg commented Jun 7, 2016

@evanlucas the close event is never emitted.

@addaleax I think it's a different issue. I just pushed a PR and a testcase that fix the issue for me

@addaleax
Member

addaleax commented Jun 7, 2016

@petrosagg Yeah, I see that, and I also understand why your PR fixes the issue, but I still think the underlying issue stems from the streams implementation, not from child processes. You can read through the description of #7160 if you want to understand why the resume() call is causing problems.
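
As a standalone illustration of the hazard (not the child_process code path itself, just the documented behaviour of resume() on a stream that nothing is consuming):

const { PassThrough } = require('stream');

const src = new PassThrough();
src.write('some buffered data\n');
src.end();

// With no 'data' listener and no pipe destination attached, resume()
// switches src into flowing mode and the buffered chunk is simply
// discarded.
src.resume();

src.on('end', function() {
  console.log('ended without anyone reading the data');
});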

@petrosagg
Author

@addaleax I just tested #7160 with my testcase and it indeed fixes the issue. Does .resume() respect piped streams? The docs say the readable stream will start emitting 'data' events when .resume() is called, so I assumed that data can still potentially be lost.

@addaleax
Member

addaleax commented Jun 7, 2016

@petrosagg .pipe() essentially works by adding data listeners to the source stream, so I think that should be fine… or am I misunderstanding what you’re saying?
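
A quick way to see this (a standalone sketch, not the child_process code path):

const { PassThrough } = require('stream');

const src = new PassThrough();
const dst = new PassThrough();

console.log(src.listenerCount('data'));  // 0: nothing is consuming src yet

src.pipe(dst);

console.log(src.listenerCount('data'));  // 1: pipe() attached its own 'data' handler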

@petrosagg
Author

petrosagg commented Jun 7, 2016

@addaleax I'm trying to understand how back-pressure works when one calls resume(). From what you're saying it sounds like, if you have a.pipe(b) and then call a.resume() and b can't accept more data into its buffers, then the data handler that pipe() installed will immediately re-pause a. Is this correct?
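
A simplified sketch of that mechanism (roughly the shape of what pipe() wires up internally; the real stream internals handle many more cases):

const { PassThrough } = require('stream');

const a = new PassThrough();
// Tiny highWaterMark so back-pressure kicks in almost immediately.
const b = new PassThrough({ highWaterMark: 1 });

// Roughly the shape of the 'data' handler that a.pipe(b) installs on a:
a.on('data', function(chunk) {
  if (!b.write(chunk)) {
    // b reports that its buffer is over the highWaterMark:
    // stop reading from a until b signals it has drained.
    a.pause();
    b.once('drain', function() {
      a.resume();
    });
  }
});

a.write('first chunk\n');
a.write('second chunk\n');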

@addaleax
Member

addaleax commented Jun 7, 2016

@petrosagg Well, I think that’s what’s supposed to happen, and #7160 would ensure that behaviour.

@addaleax added the child_process label Jun 7, 2016
@addaleax
Member

addaleax commented Jun 7, 2016

Closing as a duplicate but feel free to keep asking if there’s anything unclear :)

@addaleax addaleax closed this as completed Jun 7, 2016
@petrosagg
Author

@addaleax I was studying your testcase. Everything makes sense now. Thanks for the explanation :)
