Use event-stream #110
I wrote event-stream before @substack convinced everyone that utility libraries are worse than using finer-grained modules. I suggest using
@dominictarr good advice. BTW, how does one indicate in map-stream that one has read enough of the input? E.g. imagine I wanted to implement a map that does stuff with the first 50 rows and then does not want any more input. Does one throw an exception? (I could ask this more generally about
Ah, you call
@dominictarr that's what I thought, but I couldn't spot
Hmm, oh, I think 0.10 streams have
@dominictarr sorry for the lag here - this has been super useful and we are most definitely going to use those libraries (btw: huge props for your work here!). Just wanted to confirm: is unpipe the equivalent of destroy? The basic issue for us here is: we want to allow operations on streaming data. Imagine I want to do the following:
Now it is quite likely we may only need to read the first few MBs (or less) of this big file to get the 10 items that head needs. So the head operation (the last item in the chain), once it has got 10 items, needs to tell the previous item in the pipeline to stop sending data, and that in turn needs to tell the csv parser, and that in turn tells the request to terminate. How can we do that? Previously you called
Note: we probably also need to use "Object Mode" on our streams: http://nodejs.org/api/stream.html#stream_object_mode
Oh, sorry - unpipe is not the same as destroy. Node streams do not support this use-case very well, unfortunately. With node streams you need to hook this up manually, by doing something like:

```js
//end or destroy or something?
dest.on('end', function () {
  source.end() //or destroy or something.
})
```

Although I wish there didn't need to be so many types of streams, you can do some other neat stuff: for example, you can define a transform stream out of other transform streams like this:

```js
//combine 3 transformations
var combinedTransformation = pull(decode, map, encode)
//then pipe through that transformation.
pull(source, combinedTransformation, sink)
```

You can't really do this in a simple way with node streams... there is https://github.com/dominictarr/stream-combiner/ and https://github.com/naomik/bun. Hopefully the future platform that comes along and makes node.js look like ruby will have streams that work well as both data streams and object streams.
Most core stream2 streams will still have destroy; otherwise they wouldn't be backwards compatible.
switch to using @dominictarr's https://github.com/dominictarr/event-stream much more heavily