Stream not emitting data event after pipe+unpipe #1041

Closed
dmail opened this issue Mar 3, 2015 · 18 comments
Labels
doc Issues and PRs related to the documentations. stream Issues and PRs related to the stream subsystem.

Comments

@dmail

dmail commented Mar 3, 2015

The following should log 'ok' to the console but nothing is logged.

var stream = require('stream');
var pass = new stream.PassThrough();
var writable = new stream.Writable();

pass.pipe(writable); pass.unpipe(writable);
pass.on('data', function(chunk){ console.log(chunk.toString()); });
pass.write('ok');

Of course, commenting out the line pass.pipe(writable); pass.unpipe(writable); fixes the problem.

Can you explain why?

@micnic
Contributor

micnic commented Mar 3, 2015

You should get the error "chunk is not defined" on line 6; once that is fixed, it should print "ok" to stdout.

@micnic micnic closed this as completed Mar 3, 2015
@dmail
Author

dmail commented Mar 3, 2015

I'm sorry @micnic, I forgot the chunk argument when I wrote this snippet; I've modified the line.
When I execute this script nothing is logged to the console, but when I comment out the pipe/unpipe line, 'ok' is logged.

var stream = require('stream');
var pass = new stream.PassThrough();
var writable = new stream.Writable();

// When the line below is commented, ok is logged
//pass.pipe(writable); pass.unpipe(writable); 
pass.on('data', function(chunk){ console.log(chunk.toString()); });
pass.write('ok');

@micnic micnic reopened this Mar 3, 2015
@micnic
Contributor

micnic commented Mar 3, 2015

pass gets paused and it needs to be resumed with pass.resume() before writing. I'm not sure, but this could be a bug: in Node 0.10 it works as you expect, while in 0.12 and io.js it needs to be resumed.

@dmail
Author

dmail commented Mar 3, 2015

You're right!

With 0.10 there is no need to call resume().
With 0.12 I have to call resume() to get the 'ok' log.

var stream = require('stream');
var pass = new stream.PassThrough();
var writable = new stream.Writable();

pass.pipe(writable); pass.unpipe(writable); 
pass.resume();
pass.on('data', function(chunk){ console.log(chunk.toString()); });
pass.write('ok');

Thank you very much :)

@chrisdickinson
Contributor

.unpipe sets state.flowing = false, which is the same as explicitly .pause()'ing the stream. Since it's in an explicitly paused state, .on('data') does not implicitly resume the stream. This seems like a regression. cc @iojs/streams

@micnic micnic added confirmed-bug Issues with confirmed bugs. stream Issues and PRs related to the stream subsystem. labels Mar 5, 2015
@micnic
Contributor

micnic commented Mar 10, 2015

ping @iojs/streams

@sonewman
Contributor

@chrisdickinson This "feature" seems to have been introduced here: 444bbd4

streams: Support objects other than Buffers

...
Fixed a bug with unpipe where the pipe would break because
the flowing state was not reset to false.

I believe this was during the streams2 era. So in that case I think we weren't expecting to get any data events.

We'll have to see what the implications are to streams with only a readable listener if we remove this.

@Trott
Member

Trott commented Feb 4, 2016

@sonewman @chrisdickinson @nodejs/streams I don't suppose there's anything new to add to this issue at this time, is there?

@iliakan

iliakan commented Feb 9, 2016

That's a bug indeed, reproduced by the following:

var fs = require('fs');

var src = fs.createReadStream(__filename);
var dst = fs.createWriteStream('/tmp/test');

src.pipe(dst);
src.unpipe(dst);
src.on('data', function() {
  console.log("Never happens");
});

As @chrisdickinson wrote above, unpipe sets the stream to "explicitly paused" state, then on('data') does not resume it.

P.S. And yes, extra resume call at the end makes it flow again.

@Fishrock123 Fishrock123 added doc Issues and PRs related to the documentations. and removed confirmed-bug Issues with confirmed bugs. labels Dec 8, 2016
@Fishrock123
Contributor

This sounds like it may just be a docs issue?

@iliakan

iliakan commented Dec 9, 2016

Yes, it's a docs issue; there's another one I remember that explicitly cites the docs. Maybe close this one.

@iliakan

iliakan commented Dec 9, 2016

At least, it looks like the code won't be fixed, so it becomes a docs issue ;)

@jasnell
Member

jasnell commented May 30, 2017

ping @nodejs/streams

@mcollina
Member

It is a docs issue; specifically, we need to update https://nodejs.org/api/stream.html#stream_three_states so that it clearly states things.

mcollina added a commit to mcollina/node that referenced this issue May 31, 2017
Clarifies the behavior of streams when _readableState.flowing is
false. resume() must be called explicitly for the 'data' event to
be emitted again.

Fixes: nodejs#1041
@mcollina
Member

mcollina commented Jun 5, 2017

Fixed in 0b432e0.

@mcollina mcollina closed this as completed Jun 5, 2017
mcollina added a commit that referenced this issue Jun 5, 2017
Clarifies the behavior of streams when _readableState.flowing is
false. resume() must be called explicitly for the 'data' event to
be emitted again.

Fixes: #1041
PR-URL: #13329
Reviewed-By: James M Snell <jasnell@gmail.com>
Reviewed-By: Rich Trott <rtrott@gmail.com>
jasnell pushed a commit that referenced this issue Jun 5, 2017
MylesBorins pushed a commit that referenced this issue Jul 17, 2017
@mpotra
Contributor

mpotra commented Oct 15, 2017

Actually, this still breaks any data event listeners added before any pipe() / unpipe(), which I believe is a serious behavior flaw introduced in Readable by 444bbd4.

Consider the following:

const stream = require('stream');

const readable = new stream.PassThrough(); // A Readable stream
const writable = new stream.Writable({     // A Writable stream
  write(chunk, encoding, callback) { callback(); }
});

readable.on('data', function onData(chunk) {
  // This will only execute until unpipe();
});

readable.pipe(writable);
readable.unpipe(writable);

Given it unexpectedly pauses the stream, even while there are data listeners attached, this cannot be just a docs issue.

In scenarios where two independent algorithms consume the stream - the first using data listeners and the second using pipe() - it is infeasible for the first algorithm to monitor any subsequent pipe/unpipe (the events are emitted on the writable anyway).


For the scenario I mentioned here, I've already created a quick fix (and associated test), but I'm waiting for your thoughts before turning it into a PR.

The fix simply checks if there are any remaining data listeners upon cleanup() after calling unpipe(); if there are, it keeps the stream in flowing mode.


A similar argument could be made for attaching data listeners after unpipe(), but that involves a slightly different fix, which requires only setting state.flowing to null instead of false (here and here) in the unpipe() function.
That would ensure the stream is in paused mode after unpipe() when there are no data listeners, but that it resumes flowing whenever a new consumer is attached.
I'd still need to run tests for this, and it might also mean reverting the docs update.

@mcollina
Member

Given it unexpectedly pauses the stream, even while there are data listeners attached, this cannot be just a docs issue.

It is a docs issue relative to the current behavior: streams currently behave this way, and they are expected to keep behaving this way, as the commit you are referencing is 5 years old. You are proposing a change to this behavior, which would be semver-major, and I expect it to be non-trivial, considering possible breakages in userland.
If you want to send a PR, please do!

@mpotra
Contributor

mpotra commented Oct 19, 2017

@mcollina many thanks for clarifying why it has been treated as a docs issue.
However, I still believe the current behavior should be reverted, since it breaks interoperability between the two modes of consuming a stream. I agree this might require a bit more work and more tests, and I don't expect it to land anytime soon.
I'll send over a PR once I get to properly refactor some bits, and hopefully that will start a discussion about adjusting the behavior, at least in some future version.

10 participants