s3 upload - uploads a file with 0 bytes length #1713
@syberkitten (original issue):

The response is fine, but the actual file in the bucket is 0 bytes long. It's an application/octet-stream uploaded with a Node.js createReadStream instance. I'm currently using putObject, which has other problems, such as timing out when uploading more than 500 KB and returning no Location property in the response (so I have to build the location myself).

- region: eu-central-1
- bucket: EU (Frankfurt)

---
@syberkitten For what it's worth, I'm able to upload objects using node streams with version 2.112.0 of the SDK and 8.4.0 of node.js. I'm also using a bucket in eu-central-1. Here's the code I'm using. I included code to turn on request id logging for any S3 operation as well (it can be difficult to find request ids for operations that didn't error when using s3.upload otherwise).

```js
const fs = require('fs');
const path = require('path');
const S3 = require('aws-sdk/clients/s3');

const s3 = new S3({
  region: 'eu-central-1'
});

// Add custom request handlers to log out request ids for any S3 client
S3.prototype.customizeRequests(function(request) {
  function logRequestIds(response) {
    const operation = response.request.operation;
    const requestId = response.requestId;
    const requestId2 = response.extendedRequestId;
    console.log(`${operation}, requestId: ${requestId}, requestId2: ${requestId2}`);
  }
  request.on('extractData', logRequestIds);
  request.on('extractError', logRequestIds);
});

// 3 MB text file
const filePath = path.join(process.cwd(), 'test.txt');

// upload file
s3.upload({
  Bucket: 'BUCKET',
  Key: 'test.txt',
  Body: fs.createReadStream(filePath)
}).promise().then(function(uploadData) {
  return s3.headObject({
    Bucket: 'BUCKET',
    Key: 'test.txt'
  }).promise();
}).then(function(headData) {
  console.log(headData);
  /*
  { AcceptRanges: 'bytes',
    LastModified: 2017-09-11T19:25:00.000Z,
    ContentLength: 3145728,
    ETag: '"8f9abcae94ab69f39fc6935b89dabefc"',
    ContentType: 'application/octet-stream',
    Metadata: {} }
  */
}).catch(function(err) {
  console.error(err);
});
```

---
Thanks for the lead; I integrated the headObject call, so here is the response for the upload:

Using Node version 7.10.0. This is a sample of the code; I'm using TypeScript, so it's a blend.

---
Upgraded to Node 8.4.0 stable; still the same result. The only problem with putObject for me is the timeout it seems to have. Getting this issue:

---
I have a version somewhere that uses multipart upload.
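
For reference, a minimal sketch of what a low-level multipart upload can look like with the v2 SDK (the function name and part size are illustrative, not the commenter's code; every part except the last must be at least 5 MB):

```js
const fs = require('fs');
const S3 = require('aws-sdk/clients/s3');

const s3 = new S3({ region: 'eu-central-1' });

// Minimal low-level multipart flow: create, upload parts, complete.
// Reads the whole file into memory for brevity; a real version would stream.
async function multipartUpload(filePath, Bucket, Key, partSize = 5 * 1024 * 1024) {
  const { UploadId } = await s3.createMultipartUpload({ Bucket, Key }).promise();
  const buffer = fs.readFileSync(filePath);
  const parts = [];
  for (let offset = 0, n = 1; offset < buffer.length; offset += partSize, n++) {
    const { ETag } = await s3.uploadPart({
      Bucket, Key, UploadId,
      PartNumber: n,
      Body: buffer.slice(offset, offset + partSize)
    }).promise();
    parts.push({ ETag, PartNumber: n });
  }
  return s3.completeMultipartUpload({
    Bucket, Key, UploadId,
    MultipartUpload: { Parts: parts }
  }).promise();
}
```

---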
I second this. It happens to us occasionally: I simply pass the ReadStream to putObject, the response is fine, the file is there at our end (I can check the size, etc.), but a successful upload to S3 often results in a zero-byte object.

---
@vnenkpet I was able to upload files of up to 1 GB in size, with upload times of 1.5 hours. No need to reinvent the wheel; we should use battle-tested software here. Here's a code sample:
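
For reference, s3.upload is the SDK's managed uploader and already performs multipart uploads under the hood for large bodies; part size and concurrency can be tuned through its options argument. A minimal sketch (bucket, key, and sizes are illustrative):

```js
const fs = require('fs');
const S3 = require('aws-sdk/clients/s3');

const s3 = new S3({ region: 'eu-central-1' });

s3.upload(
  { Bucket: 'BUCKET', Key: 'big-file.bin', Body: fs.createReadStream('big-file.bin') },
  { partSize: 10 * 1024 * 1024, queueSize: 4 } // 10 MB parts, 4 uploaded concurrently
).promise()
  .then(data => console.log('uploaded to', data.Location))
  .catch(console.error);
```

---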
I'm running into this issue as well. Has any headway been made towards resolving it? I'm using aws-sdk 2.176.0. I'd love not to have to switch to a third-party library.

---
I was having a similar problem when passing an http.IncomingMessage stream to s3.upload. If I tried to measure the size of the file while uploading, I would always get a 0-byte object. However, if I paused the stream before the upload, it all worked as intended. I'm not sure this is exactly your problem, but hopefully it helps. I think, in general, it might be a problem with adding event listeners to a stream before passing it to S3.
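
A minimal sketch of that workaround, assuming an Express-style handler where req is the incoming http.IncomingMessage (the route, bucket, and key are illustrative):

```js
const express = require('express');
const S3 = require('aws-sdk/clients/s3');

const app = express();
const s3 = new S3({ region: 'eu-central-1' });

app.post('/upload', (req, res) => {
  // Pause the incoming stream so no 'data' events are consumed
  // (e.g. by size-measuring listeners) before s3.upload attaches its own.
  req.pause();
  s3.upload({ Bucket: 'BUCKET', Key: 'upload.bin', Body: req })
    .promise()
    .then(data => res.json({ location: data.Location }))
    .catch(err => res.status(500).send(err.message));
});

app.listen(3000);
```

---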
It would be nice to add this (req.pause();) to the docs, to save other devs from wasting hours or days. Kudos to @notoriaga for the fix.

---
@vnenkpet Have you solved the issue?

---
I have this same issue. The response is fine, but a successful upload to S3 produces a 0-byte file, and the behaviour is very inconsistent.

---
I use the PassThrough solution; that works just fine.
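
A minimal sketch of that approach (paths and names are illustrative): the source is piped into a stream.PassThrough, and the PassThrough, rather than the original stream, is handed to s3.upload, so the uploader gets a stream whose data nothing else has consumed.

```js
const fs = require('fs');
const { PassThrough } = require('stream');
const S3 = require('aws-sdk/clients/s3');

const s3 = new S3({ region: 'eu-central-1' });

function uploadViaPassThrough(filePath, bucket, key) {
  const pass = new PassThrough();
  // Other listeners can be attached to the source stream without
  // stealing data from the uploader, which only ever sees `pass`.
  fs.createReadStream(filePath).pipe(pass);
  return s3.upload({ Bucket: bucket, Key: key, Body: pass }).promise();
}

uploadViaPassThrough('test.txt', 'BUCKET', 'test.txt')
  .then(data => console.log(data.Location))
  .catch(console.error);
```

---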
This does not fix the issue; I still get 0-byte uploads with the file-stream pause. Even if it did, it would still be a hack; the actual library needs fixing. Reliability is key.

---
It is an Angular service worker issue: when enabled, it causes 0 KB uploads in Safari. People using Firebase uploads are also having the same issue when the Angular service worker is enabled and the browser is Safari. The workaround seems to be to bypass the service worker when making the S3 call; there is a bypass feature described here: https://angular.io/guide/service-worker-devops. I tried implementing the bypass in the AWS call, but I'm still getting 0 KB uploads, i.e. the call is not being bypassed. Any suggestion on how I can implement the bypass feature for AWS S3 upload calls?
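
For context, a hedged sketch of what injecting Angular's ngsw-bypass header into a single request can look like with the v2 SDK's 'build' event (bucket, key, and body are illustrative); as the next comment notes, this hook works for plain requests such as putObject but not for the managed s3.upload:

```js
const S3 = require('aws-sdk/clients/s3');
const s3 = new S3({ region: 'eu-central-1' });

// Build the request without sending it, then add the header the
// Angular service worker recognizes as a bypass signal.
const req = s3.putObject({ Bucket: 'BUCKET', Key: 'test.txt', Body: 'hello' });
req.on('build', function () {
  req.httpRequest.headers['ngsw-bypass'] = 'true';
});
req.send(function (err, data) {
  if (err) console.error(err);
  else console.log(data);
});
```

---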
OK, found the solution. As per my previous comment, the Angular service worker is causing the issue. There is an option to bypass the service worker by adding an 'ngsw-bypass' header. However, I did not find a way to add custom headers to the SDK's managed uploads (the 'build' method only applies to putObject). Therefore the only workaround is to manually add the following code, after the build, to the ngsw-worker.js file. In the onFetch(event) function, add the following after the const declarations:

```js
if (req.method === "PUT" && requestUrlObj.origin.indexOf('amazonaws.com') > -1) { return; }
```

Alternatively, you can instead skip the service worker for PUT requests altogether. For my project there is no need for PUT methods to go through the service worker, hence I am using the latter approach, which should also avoid other issues such as video/audio not playing properly in Safari. If anyone knows how to automate this during the build, please let me know, but at least there is a solution now.

---
Hello people, I have a similar issue when trying to post a file to a JIRA instance. Here is my code:

```js
var options = {
request(options, function (error, response) {
```
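
The snippet above is only a fragment, but for illustration, a minimal sketch of posting an attachment to a JIRA issue with the request library might look like the following (host, issue key, and credentials are placeholders; the X-Atlassian-Token header is required by JIRA's attachment endpoint):

```js
const fs = require('fs');
const request = require('request'); // classic callback-style HTTP client

// All values below are placeholders; adjust host, issue key, and credentials.
const options = {
  method: 'POST',
  url: 'https://jira.example.com/rest/api/2/issue/TEST-1/attachments',
  headers: {
    'X-Atlassian-Token': 'no-check' // JIRA rejects attachment uploads without this
  },
  auth: { user: 'USERNAME', pass: 'PASSWORD' },
  formData: {
    file: fs.createReadStream('local_file.txt') // field must be named "file"
  }
};

request(options, function (error, response) {
  if (error) return console.error(error);
  console.log('status:', response.statusCode);
});
```

---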
Greetings! We're closing this issue because it has been open a long time and hasn't been updated in a while, so it may not be getting the attention it deserves. We encourage you to check whether this is still an issue in the latest release, and if you find that it is, please feel free to comment or open a new issue.

---
In case somebody is looking for a Python solution: if you read the file object to measure its size, you need to seek back to the start before uploading, or boto3 will upload 0 bytes.

```python
import boto3

boto3_client = boto3.client('s3')

data = open('local_file.txt', 'rb')
size = len(data.read())  # reading moves the file pointer to EOF
data.seek(0)             # rewind, otherwise upload_fileobj uploads 0 bytes
# bucket_name and destination_key are defined elsewhere
boto3_client.upload_fileobj(data, bucket_name, destination_key)
```