
unhandled error FILE_ENDED from PullStream.js #48

Closed
ianbale opened this issue Oct 6, 2017 · 3 comments
ianbale commented Oct 6, 2017

My app has an SFTP service which receives a ZIP file. Upon completion of the upload I fire off a task which unzips the file. Last night the upload was aborted part way through. This resulted in the node process terminating due to an unhandled error.

I am going to look at the SFTP handler to see if I can catch the aborted upload at that point and avoid any attempt at doing the unzip. However, it seems that unzipper should not cause a server crash due to a truncated zip file.

Having had a quick look, it seems to me that PullStream.js is detecting the unexpected end of file and emitting an error which is not handled in the calling code. My error dump is as follows:

server stopping { Error: Unhandled "error" event. (FILE_ENDED)
    at PullStream.emit (events.js:185:19)
    at PullStream.onerror (_stream_readable.js:652:12)
    at emitOne (events.js:115:13)
    at PullStream.emit (events.js:210:7)
    at PullStream.pull (/Users/ianbale/Source Code/noderail-data-feed-manager/node_modules/unzipper/lib/PullStream.js:66:14)
    at emitOne (events.js:115:13)
    at PullStream.emit (events.js:210:7)
    at PullStream.<anonymous> (/Users/ianbale/Source Code/noderail-data-feed-manager/node_modules/unzipper/lib/PullStream.js:19:10)
    at emitNone (events.js:110:20)
    at PullStream.emit (events.js:207:7)
    at finishMaybe (_stream_writable.js:587:14)
    at endWritable (_stream_writable.js:595:3)
    at PullStream.Writable.end (_stream_writable.js:546:5)
    at ReadStream.onend (_stream_readable.js:584:10)
    at Object.onceWrapper (events.js:314:30)
    at emitNone (events.js:110:20)
    at ReadStream.emit (events.js:207:7)
  context: 'FILE_ENDED' }
server stopping TypeError: cb is not a function
    at afterWrite (_stream_writable.js:438:3)
    at onwrite (_stream_writable.js:429:7)
    at /Users/ianbale/Source Code/noderail-data-feed-manager/node_modules/unzipper/lib/PullStream.js:59:60
    at afterWrite (_stream_writable.js:438:3)
    at /Users/ianbale/Source Code/noderail-data-feed-manager/node_modules/async-listener/glue.js:188:31
    at _combinedTickCallback (internal/process/next_tick.js:144:20)
    at Immediate._tickCallback (internal/process/next_tick.js:180:9)
    at Immediate._onImmediate (/Users/ianbale/Source Code/noderail-data-feed-manager/node_modules/async-listener/glue.js:188:31)
    at runCallback (timers.js:781:20)
    at tryOnImmediate (timers.js:743:5)
    at processImmediate [as _immediateCallback] (timers.js:714:5)
DataManagerFeedProducer closed...
kafka dataFeedClient closed...
LogsProducer closed...
kafka LogsClient closed...
opsProducer closed...
kafka opsClient closed...
server stopping 1

@ZJONSSON ZJONSSON self-assigned this Oct 7, 2017
ZJONSSON (Owner) commented

Thanks - errors in streams are always a bit annoying, since they don't propagate like promises. You need to attach an error handler to the stream to prevent the process from terminating.

Was this a valid zip file? If so, we would need to investigate further. Technically, FILE_ENDED means the stream wanted to pull more data but reached the end of the file.
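
For example, a minimal sketch of attaching 'error' handlers (the file paths here are hypothetical) so that a truncated zip fails the unzip task instead of crashing the process:

    const fs = require('fs');
    const unzipper = require('unzipper');

    fs.createReadStream('/tmp/upload.zip')                      // hypothetical path to the uploaded zip
      .on('error', (err) => console.error('read error:', err))
      .pipe(unzipper.Extract({ path: '/tmp/extracted' }))       // hypothetical output directory
      .on('error', (err) => console.error('unzip error:', err)) // this is where FILE_ENDED surfaces
      .on('close', () => console.log('unzip finished'));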

camlegleiter commented

I'm doing something similar and see the same error when trying to extract from a ZIP containing large (> 4 GB) files; testing against smaller files works fine. I'm able to unzip these larger ZIPs without error using other tools, so I don't think it's a corruption issue. Perhaps it's something related to ZIP64. Does node-unzipper support ZIP64?

HttpError: FILE_ENDED
    at Parse.<anonymous> (.../controller.js:185:21)
    at emitOne (events.js:96:13)
    at Parse.emit (events.js:188:7)
    at Parse.pull (.../node_modules/unzipper/lib/PullStream.js:66:14)
    at emitOne (events.js:96:13)
    at Parse.emit (events.js:188:7)
    at Parse.<anonymous> (.../node_modules/unzipper/lib/PullStream.js:19:10)
    at emitNone (events.js:91:20)
    at Parse.emit (events.js:185:7)
    at finishMaybe (_stream_writable.js:515:14)
    at afterWrite (_stream_writable.js:388:3)
    at onwrite (_stream_writable.js:378:7)
    at Parse.WritableState.onwrite [as cb] (_stream_writable.js:90:5)
    at .../node_modules/unzipper/lib/PullStream.js:59:60
    at afterWrite (_stream_writable.js:387:3)
    at onwrite (_stream_writable.js:378:7)
    at WritableState.onwrite (_stream_writable.js:90:5)
    at afterTransform (_stream_transform.js:79:3)
    at TransformState.afterTransform (_stream_transform.js:54:12)
    at PassThrough._transform (_stream_passthrough.js:21:3)
    at PassThrough.Transform._read (_stream_transform.js:167:10)
    at PassThrough.Readable.read (_stream_readable.js:348:10)

drmrbrewer commented Oct 21, 2020

@ZJONSSON I'm getting quite a few FILE_ENDED errors like this (fetching a big-ish file from a remote URL). It only seems to happen when the node on which the instance is running is also running lots of other resource-intensive processes; once those processes stop, node-unzipper runs again without error. Of course, one solution is just to throw more resources (CPU, RAM) at it, but I was wondering if there is any way of tuning node-unzipper to be more resilient under stress?

EDIT: an update in case it helps others... I eventually worked out that the problem is caused by a long-running processing operation on the unzipped file (which takes longer when the node is busy with other work) while a data pipe is still held open all the way back to the remote source... perhaps some sort of timeout occurs and breaks the pipe before the data has been fully processed. The solution is to break and restart the pipe after the remote fetch, i.e. pipe the remote data (still zipped) into a local file and then stream from the local file into unzipper.Parse() instead, as in the sketch below... that way it doesn't matter if the downstream processing takes a long time. It now works just fine.
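
A rough sketch of that two-stage approach (the URL, file paths, and helper names here are assumptions for illustration; any fetch mechanism works in place of https.get):

    const fs = require('fs');
    const path = require('path');
    const https = require('https');
    const unzipper = require('unzipper');

    // Stage 1: pipe the remote data (still zipped) into a local file, so the
    // remote connection is closed before any slow processing starts.
    function download(url, zipPath) {
      return new Promise((resolve, reject) => {
        https.get(url, (res) => {
          res.pipe(fs.createWriteStream(zipPath))
            .on('finish', resolve)
            .on('error', reject);
        }).on('error', reject);
      });
    }

    // Stage 2: stream the local file into unzipper.Parse(); a timeout on the
    // remote source can no longer break the pipe mid-extraction.
    async function fetchAndUnzip(url, zipPath, outDir) {
      await download(url, zipPath);
      await fs.createReadStream(zipPath)
        .pipe(unzipper.Parse())
        .on('entry', (entry) => {
          if (entry.type === 'File') {
            // entry paths are flattened here for simplicity
            entry.pipe(fs.createWriteStream(path.join(outDir, path.basename(entry.path))));
          } else {
            entry.autodrain(); // skip directory entries
          }
        })
        .promise();
    }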
