fcntl: too many open files #52
Hey @tommyblue, thanks for filing an issue.
In addition to that, could you share the output of
Increasing the limit to a high value should work.
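For context, here is a minimal Go sketch (my own illustration, not code from the thread) of inspecting the limit in question; on Unix-like systems it is the same value that `ulimit -n` reports:

```go
package main

import (
	"fmt"
	"syscall"
)

func main() {
	// Print the process's soft and hard limits on open
	// file descriptors (RLIMIT_NOFILE).
	var rlim syscall.Rlimit
	if err := syscall.Getrlimit(syscall.RLIMIT_NOFILE, &rlim); err != nil {
		panic(err)
	}
	fmt.Printf("soft limit: %d, hard limit: %d\n", rlim.Cur, rlim.Max)
}
```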
Your diagnosis and proposed solution are certainly correct, but I think the behaviour of s5cmd in this situation is misleading. Moreover, the error message is tagged "VERBOSE", so it's probably hidden without the verbose flag. Some possible solutions:
@tommyblue You're right. The first one is definitely needed: if there is a non-retriable error, it needs to be shown without a verbose flag. The others are good suggestions as well. Let's keep the issue open until we have a fix. Thanks again.
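To illustrate the proposed behaviour, here is a hypothetical Go sketch: non-retriable errors are always printed, while retriable ones stay behind the verbose flag. The `isRetriable` and `logError` names, and the choice to treat EMFILE as non-retriable, are assumptions for illustration, not s5cmd's actual API.

```go
package main

import (
	"errors"
	"fmt"
	"os"
	"syscall"
)

// isRetriable reports whether an error is worth retrying.
// Treating EMFILE ("too many open files") as non-retriable
// is an assumption made for this illustration.
func isRetriable(err error) bool {
	return !errors.Is(err, syscall.EMFILE)
}

// logError always surfaces non-retriable errors; retriable
// ones stay hidden unless the verbose flag is set.
func logError(err error, verbose bool) {
	if isRetriable(err) && !verbose {
		return
	}
	fmt.Fprintln(os.Stderr, "ERROR:", err)
}

func main() {
	// Printed even though verbose is false, because EMFILE
	// is classified as non-retriable here.
	logError(syscall.EMFILE, false)
}
```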
It tries to increase the soft limit on open files to avoid hitting the OS limit. It also catches "too many open files" errors and prints a warning that includes a suggestion about the -numworkers argument, then exits immediately. Fixes #52
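A rough Go sketch of the approach this fix describes; the function names are hypothetical and s5cmd's actual implementation may differ:

```go
package main

import (
	"errors"
	"fmt"
	"os"
	"syscall"
)

// raiseOpenFileLimit lifts the soft RLIMIT_NOFILE up to the hard
// limit so the process is less likely to run out of descriptors.
// (On macOS the kernel may cap the effective value below the
// reported hard limit.)
func raiseOpenFileLimit() error {
	var rlim syscall.Rlimit
	if err := syscall.Getrlimit(syscall.RLIMIT_NOFILE, &rlim); err != nil {
		return err
	}
	rlim.Cur = rlim.Max
	return syscall.Setrlimit(syscall.RLIMIT_NOFILE, &rlim)
}

// onError mirrors the second half of the fix: when an operation
// still fails with EMFILE ("too many open files"), print a hint
// about -numworkers and exit immediately instead of retrying.
func onError(err error) {
	if errors.Is(err, syscall.EMFILE) {
		fmt.Fprintln(os.Stderr, "ERROR: too many open files; try reducing -numworkers")
		os.Exit(1)
	}
}

func main() {
	if err := raiseOpenFileLimit(); err != nil {
		fmt.Fprintln(os.Stderr, "could not raise open-file limit:", err)
	}
}
```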
I'm on macOS 10.15.3 and I'm trying to upload a folder to S3 that contains 2616 subfolders with 1 to 10 files each.
With `s5cmd -stats -r 0 -vv cp -n --parents <src> <dest>` I immediately see the `fcntl: too many open files` error, but the uploads seem to proceed; by the end, though, generally fewer than 100 files have actually been uploaded.
Stats output:
If I run the same command with the `-numworkers 16` option, the copy ends without errors and all files are correctly uploaded to S3.
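That workaround makes sense: each worker may hold a file open while it uploads, so bounding the number of workers bounds the number of simultaneously open descriptors. A generic illustration of the idea (not s5cmd's internals):

```go
package main

import (
	"fmt"
	"sync"
)

// processAll runs at most numWorkers uploads concurrently, so at
// most numWorkers files are open at any one time.
func processAll(paths []string, numWorkers int) {
	sem := make(chan struct{}, numWorkers) // counting semaphore
	var wg sync.WaitGroup
	for _, p := range paths {
		wg.Add(1)
		sem <- struct{}{} // blocks while numWorkers tasks are in flight
		go func(path string) {
			defer wg.Done()
			defer func() { <-sem }()
			fmt.Println("uploading", path) // stand-in for the real upload
		}(p)
	}
	wg.Wait()
}

func main() {
	processAll([]string{"a", "b", "c"}, 16)
}
```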