stdout maxBuffer length exceeded #66
Comments
Can you provide more details? Your full s3p command-line with all the options would help a great deal tracking this down. Also, if you can, share your …
same problem here
Can you provide more details? Your full s3p command-line with all the options would help a great deal tracking this down.
A little bit of research: it seems to come from Node's exec function, which I use to copy larger files via AWS's CLI. The default maxBuffer is 1 megabyte, and it surprises me that that isn't enough. Does this happen every time? It sounds more like there was an error that wasn't properly reported. If it does happen every time, I can try raising the maxBuffer setting, and then you can try it and let me know if that helped.
I ran into this as well. Command:
The files to copy are about 10 GiB each (slightly over 200 files). This issue did not happen when I synced the bucket from one region to another. Edit: thank you so much for this tool; cross-region sync was massively sped up by s3p!
Error:
class: class RangeError
stack:
RangeError [ERR_CHILD_PROCESS_STDIO_MAXBUFFER]: stdout maxBuffer length exceeded
at new NodeError (node:internal/errors:372:5)
at Socket.onChildStdout (node:child_process:461:14)
at Socket.emit (node:events:527:28)
at Socket.emit (node:domain:475:12)
at addChunk (node:internal/streams/readable:315:12)
at readableAddChunk (node:internal/streams/readable:285:11)
at Socket.Readable.push (node:internal/streams/readable:228:10)
at Pipe.onStreamRead (node:internal/stream_base_commons:190:23)