EntityTooLarge: fail to cp file with 6.3GB from one bucket to another bucket #29
Comments
I have also just encountered this issue. Has something changed?
Thanks for the report. We're using the official AWS SDK. Copying objects (from S3 to S3) larger than 5 GB is not supported by a single CopyObject call: https://docs.aws.amazon.com/AmazonS3/latest/API/API_CopyObject.html
We're going to use the multipart Copy API to support large file transfers between S3 buckets. Please note that uploading a large file to S3 works as expected.
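For context, a multipart copy works by calling CreateMultipartUpload, then issuing one UploadPartCopy per byte range of the source object, and finishing with CompleteMultipartUpload. Below is a minimal sketch of the range-splitting step; `partRanges` is a hypothetical helper (not part of s5cmd or the AWS SDK) that produces the "bytes=start-end" strings expected by UploadPartCopy's CopySourceRange parameter:

```go
package main

import "fmt"

// partRanges splits an object of totalSize bytes into HTTP byte ranges of at
// most partSize bytes each, formatted as "bytes=start-end" (inclusive ends),
// the form UploadPartCopy's CopySourceRange parameter expects.
// Hypothetical helper for illustration only.
func partRanges(totalSize, partSize int64) []string {
	var ranges []string
	for start := int64(0); start < totalSize; start += partSize {
		end := start + partSize - 1
		if end > totalSize-1 {
			end = totalSize - 1
		}
		ranges = append(ranges, fmt.Sprintf("bytes=%d-%d", start, end))
	}
	return ranges
}

func main() {
	// Example: an 11-byte object split into 5-byte parts.
	for i, r := range partRanges(11, 5) {
		fmt.Printf("part %d: %s\n", i+1, r)
	}
}
```

Each resulting range would become one UploadPartCopy request; S3 requires every part except the last to be at least 5 MiB, so a real implementation would pick partSize accordingly.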
The aws/aws-sdk-go#2653 PR is required to address this issue.
Just checking if there is any news on this issue. I could make use of this feature in my workflow. I'm not sure whether the PR @igungor mentioned was merged, or what's blocking that merge.
Is there any update on the status of this issue? |
A similar problem occurs with this case: s3cmd sync source/* dest/. The sync fails silently on the first run after copying some objects.
This actually brings us to the question: how do we exclude files from cp/sync? Some other Go libraries support this.
Any updates? |
Ran into this today. Any updates on this, or is there a workaround?
This is the error in v2.2.2: "InvalidRequest: The specified copy source is larger than the maximum allowable size for a copy source: 5368709120, status code: 400"
Reported error when attempting to cp a 6.3 GB file from one bucket to another:
EntityTooLarge: Your proposed upload exceeds the maximum allowed object size