
chunked file on web api #454

Open
akshaybogar1984 opened this issue Jul 10, 2024 · 35 comments

Comments

@akshaybogar1984

Hi, I am having an issue: I am unable to post the document to my web API. Please help me.
I can see the payload, but where do I get the chunked file?

payload:
{"name":"10_MB.MP4","mimeType":"video/mp4","size":10485760,"lastModified":1720538165694}

endpoint: https://localhost:7177/api/WeatherForecast/Upload1?uploadType=uploadx


@kukhariev
Owner

The backend must respond to the initial POST request with the chunk upload URL in the Location header.
ngx-uploadx will then send the file chunks to that URL, as sketched below.

https://github.com/kukhariev/node-uploadx/blob/master/proto.md#requests-overview
https://developers.google.com/drive/api/guides/manage-uploads#resumable
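
For illustration, a minimal sketch of this handshake (hypothetical host and upload ID; see the proto.md link above for the exact headers):

POST /upload?uploadType=uploadx HTTP/1.1
Content-Type: application/json

{"name":"10_MB.MP4","mimeType":"video/mp4","size":10485760}

HTTP/1.1 201 Created
Location: https://example.com/upload/abc123

PUT https://example.com/upload/abc123 HTTP/1.1
Content-Range: bytes 0-5242879/10485760

HTTP/1.1 308 Resume Incomplete
Range: bytes=0-5242879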

@akshaybogar1984
Author

My requirement is to upload a large file (up to 20 GB) in chunks to blob storage.
It can be either direct to blob storage or via a web API using C#.
Could you please share the C# code for the web server?

@akshaybogar1984
Author

Could you help me get this working on my machine?

@kukhariev
Owner

You can add uploaderClass: Tus to the options:

import { Component } from '@angular/core';
import { RouterOutlet } from '@angular/router';
import { Tus, UploadState, UploadxDirective, UploadxOptions } from 'ngx-uploadx';

@Component({
  selector: 'app-upload-component',
  standalone: true,
  imports: [RouterOutlet, UploadxDirective],
  template: `
    <input type="file" [uploadx]="options" (state)="onUpload($event)" />
  `
})
export class AppUploadComponent {
  options: UploadxOptions = {
    endpoint: '[URL]',
    uploaderClass: Tus
  };
  onUpload(state: UploadState) {
    console.log(state);
  }
}

and use https://github.com/tusdotnet/tusdotnet on the server.

@akshaybogar1984
Author

Thanks. But I am using Angular on the front end and a C# web API on the backend. So I would like to know: when Angular pushes chunks to the server, how does the web API receive those chunks in order to upload them to blob storage?

@akshaybogar1984
Author

[screenshot]

Could you please let us know where to set the Location header in Angular to call the service?

@kukhariev
Owner

Sorry, I have almost no knowledge of either C# or blob storage, and I don't have access to Azure :(

While I'm on vacation, I'll try to work towards adding blob storage support.

Is there currently a working server, or any description of how it works?

For example:

[HttpPost]
public async Task<IActionResult> Upload()
{
  // validation
  // generate uid from JSON payload metadata
  // construct the UploadChunk URL from uid
  // store metadata
  // create a new blob
  // send UploadChunk URL in Location header
  // return 201 Created
}

[HttpPut("{uid}")]
public async Task<IActionResult> UploadChunk(string uid)
{
  // validation
  // append chunk to Blob Storage
  // update metadata
  // send the Range header and status 308, or 200 if it's the last chunk of the file
}

@akshaybogar1984
Author

I have a server (C# web API) but am unable to upload to it using ngx-uploadx.

@kukhariev
Owner

@akshaybogar1984,
Here's a working example with direct upload from the browser to blob storage. URL generation and blob creation in getFileUrl can be moved to the server.

import { Uploader } from 'ngx-uploadx';

const env = {
  sasToken:
    'sv=2023-08-03&ss=bfqt&srt=sco&se=2030-12-12T08%3A11%3A41Z&sp=rwdlacup&sig=%2F9bU8%2FkdnDe2bVMRSVZWGJc5QlR58nvCEeCEfOXzjA0%3D',
  containerURL: 'http://127.0.0.1:10000/devstoreaccount1/container1'
};

/**
 *  Azure Blob Storage support
 * @example
 *   options: UploadxOptions = {
 *     allowedTypes: 'image/*,video/*',
 *     chunkSize: 4 * 1024 * 1024,
 *     endpoint: `[containerURL]`,
 *     uploaderClass: BlobUploader
 *   };
 */
export class BlobUploader extends Uploader {
  override async getFileUrl(): Promise<string> {
    const headers = {
      'x-ms-version': '2022-11-02',
      'x-ms-date': getISODate(),
      'x-ms-blob-type': 'AppendBlob'
    };
    const url = `${env.containerURL}/${this.file.name}?${env.sasToken}`;
    await this.request({ method: 'PUT', url, headers });
    return url;
  }

  override async sendFileContent(): Promise<number | undefined> {
    const { body, start, end } = this.getChunk();
    const url = `${this.url}&comp=appendblock`;
    const headers = {
      'x-ms-version': '2022-11-02',
      'x-ms-date': getISODate(),
      'x-ms-blob-condition-appendpos': start,
      'x-ms-blob-condition-maxsize': this.size
    };
    await this.request({ method: 'PUT', body, headers, url });
    return this.responseStatus > 201 ? start : end;
  }

  override abort(): void {} // Azurite does not support blob upload interrupts ?!

  override async getOffset(): Promise<number | undefined> {
    const headers = {
      'x-ms-version': '2022-11-02',
      'x-ms-date': getISODate()
    };
    await this.request({ method: 'HEAD', headers, url: this.url });
    if (this.responseStatus === 200) {
      return Number(this.responseHeaders['content-length']) || 0;
    }
    this.url = '';
    return this.offset || 0;
  }
}
function getISODate() {
  return new Date().toISOString();
}

@ARosentiehl24

ARosentiehl24 commented Aug 7, 2024

Hi @akshaybogar1984, were you able to get it working with a Web API in C#? I am facing the same problem as you. My stack is the same (Angular and .NET), and I have hybrid storage: I am not only going to upload files to Blob Storage, I can also use local storage. That's why @kukhariev's approach won't work for me.

@ARosentiehl24

ARosentiehl24 commented Aug 7, 2024

I've found something; I'm doing this:

[HttpPost]
public async Task<IActionResult> Upload()
{
  // validation
  // generate uid from JSON payload metadata
  // construct the UploadChunk URL from uid
  // store metadata
  // create a new blob
  // send UploadChunk URL in Location header
  // return 201 Created
}

[HttpPut("{uid}")]
public async Task<IActionResult> UploadChunk(string uid)
{
  // validation
  // append chunk to Blob Storage
  // update metadata
  // send the Range header and status 308, or 200 if it's the last chunk of the file
}

And it looks something like this:

[HttpPost]
[AllowAnonymous]
public async Task<IActionResult> Upload([FromBody] UploadMetadata metadata)
{
    if (metadata == null)
    {
        return BadRequest("Invalid metadata.");
    }

    var uid = Guid.NewGuid().ToString();

    var uploadChunkUrl = Url.Action(nameof(UploadChunk), new { uid });

    await metadataService.StoreMetadataAsync(uid, metadata);

    Response.Headers["location".ToLower()] = "https://localhost:7081" + uploadChunkUrl;

    return StatusCode((int)HttpStatusCode.Created);
}

[HttpPut("{uid}")]
[AllowAnonymous]
public async Task<IActionResult> UploadChunk(string uid)
{
    if (string.IsNullOrEmpty(uid) || Request.Form.Files.Count == 0)
    {
        return BadRequest("Invalid request.");
    }

    var file = Request.Form.Files[0];

    var metadata = await metadataService.GetMetadataAsync(uid);
    if (metadata == null)
    {
        return NotFound("Metadata not found.");
    }

    metadata.CurrentSize += file.Length;
    await metadataService.UpdateMetadataAsync(uid, metadata);

    if (metadata.CurrentSize >= metadata.TotalSize)
    {
        return Ok("Upload complete.");
    }
    else
    {
        Response.Headers["Range"] = $"bytes=0-{metadata.CurrentSize - 1}";
        return StatusCode(308, "Upload in progress.");
    }
}

But for some reason I'm always getting this issue:

[screenshot]

[screenshot]

which is weird, because I am sending the URL through the header:

[screenshot]

@kukhariev
Owner

kukhariev commented Aug 7, 2024

@ARosentiehl24,
these response headers are missing Access-Control-Expose-Headers: Location, Range:

Response.Headers["Location".ToLower()] = "https://localhost:7081" + uploadChunkUrl;
Response.Headers["Access-Control-Expose-Headers".ToLower()] = "Location,Range";

return StatusCode((int)HttpStatusCode.Created);

Also, var file = Request.Form.Files[0]; is incorrect; you should read the chunk from the Request.Body stream instead.
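
For illustration, here is a minimal sketch of an UploadChunk action that reads the raw body and appends it to an append blob. It assumes the Azure.Storage.Blobs package; connectionString, containerName, and the metadataService are hypothetical stand-ins for your own configuration and the service from the example above.

using Azure.Storage.Blobs.Specialized; // AppendBlobClient

[HttpPut("{uid}")]
public async Task<IActionResult> UploadChunk(string uid)
{
    // ngx-uploadx sends each chunk as the raw request body, not as multipart form data.
    var blob = new AppendBlobClient(connectionString, containerName, uid); // hypothetical config values
    await blob.CreateIfNotExistsAsync();

    // AppendBlockAsync needs a stream with a known length, so buffer the chunk first.
    // Append blocks also have a per-block size limit, so keep the client chunk size small.
    using var buffer = new MemoryStream();
    await Request.Body.CopyToAsync(buffer);
    buffer.Position = 0;
    await blob.AppendBlockAsync(buffer);

    var metadata = await metadataService.GetMetadataAsync(uid); // hypothetical service from above
    metadata.CurrentSize += buffer.Length;
    await metadataService.UpdateMetadataAsync(uid, metadata);

    Response.Headers["Access-Control-Expose-Headers"] = "Range";
    if (metadata.CurrentSize >= metadata.TotalSize)
    {
        return Ok();
    }
    Response.Headers["Range"] = $"bytes=0-{metadata.CurrentSize - 1}";
    return StatusCode(308);
}

Buffering to memory is only for the sketch; for very large chunks you would stream to a temporary file instead.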

@ARosentiehl24

ARosentiehl24 commented Aug 8, 2024

That's right, now it's working as expected. I'll leave here the way it can be implemented; this is just an example.

[screenshot]

[screenshot]

Just a little question: is Response.Headers["Range"] = $"bytes=0-{metadata.CurrentSize - 1}"; implemented correctly?

@kukhariev
Owner

Looks good, but it's missing Response.Headers["Access-Control-Expose-Headers".ToLower()] = "Range";

@ARosentiehl24

Sure, thanks! Works like a charm.

@akshaybogar1984
Author

Thanks @ARosentiehl24, let me give it a try today.

@akshaybogar1984
Author

Can we upload a 10 GB file to blob storage?

@akshaybogar1984
Author

@ARosentiehl24 Please share how your server uploads to the blob. Thanks.

@ARosentiehl24

Hello @akshaybogar1984, sure, my approach is the following.

I'm using the BlockBlobClient class:

[screenshot]

You need to store the block IDs somewhere during the upload process; in my case I'm using SQL Server:

[screenshot]

I'm storing the size because comparing it against the total size is how I know when the upload is complete.

And at the end of the upload process, you can do something like this:

[screenshot]

You must send the list of block IDs to make the commit, and with that you can upload heavy files block by block, as in the sketch below.
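
Here is a minimal sketch of that flow using Azure.Storage.Blobs' BlockBlobClient; the blockIdStore calls are hypothetical stand-ins for the SQL Server persistence described above.

using System.Text;
using Azure.Storage.Blobs.Specialized; // BlockBlobClient

// Stage one chunk as an uncommitted block; called once per chunk PUT.
public async Task StageChunkAsync(BlockBlobClient blob, string uid, long start, Stream chunk)
{
    // Block IDs must be Base64-encoded and of equal length within a blob,
    // so encode the zero-padded chunk offset.
    var blockId = Convert.ToBase64String(Encoding.UTF8.GetBytes(start.ToString("D15")));
    await blob.StageBlockAsync(blockId, chunk);
    await blockIdStore.StoreBlockIdAsync(uid, blockId); // hypothetical SQL Server persistence
}

// After the last chunk, commit the ordered block list to assemble the blob.
public async Task CommitAsync(BlockBlobClient blob, string uid)
{
    var blockIds = await blockIdStore.GetBlockIdsAsync(uid); // hypothetical, ordered by offset
    await blob.CommitBlockListAsync(blockIds);
}

Until CommitBlockListAsync runs, the staged blocks remain uncommitted, which is why the stored size comparison is needed to decide when to commit.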

Hope it helps.

Greetings.

@akshaybogar1984
Author

@ARosentiehl24 thanks a lot. What is the maximum size you are trying to upload? In my case it's 10 GB to 20 GB.

@akshaybogar1984
Author

akshaybogar1984 commented Aug 13, 2024

@kukhariev can we have a blockid and blocklist approach, so that I can upload 10 GB faster?

PUT https://myaccount.blob.core.windows.net/mycontainer/myblob?comp=block&blockid=AAAAAA%3D%3D HTTP/1.1  
  
Request Headers:  
x-ms-version: 2018-03-28  
x-ms-date: Sat, 31 Mar 2018 14:37:35 GMT    
Authorization: SharedKey myaccount:J4ma1VuFnlJ7yfk/Gu1GxzbfdJloYmBPWlfhZ/xn7GI=  
Content-Length: 0
x-ms-copy-source: https://myaccount.blob.core.windows.net/mycontainer/myblob
x-ms-source-range: bytes=0-499
=======================================================

PUT https://myaccount.blob.core.windows.net/mycontainer/myblob?comp=blocklist HTTP/1.1  
  
Request Headers:  
x-ms-date: Wed, 31 Aug 2011 00:17:43 GMT  
x-ms-version: 2011-08-18  
Content-Type: text/plain; charset=UTF-8  
Authorization: SharedKey myaccount:DJ5QZSVONZ64vAhnN/wxcU+Pt5HQSLAiLITlAU76Lx8=  
Content-Length: 133  
  
Request Body:  
<?xml version="1.0" encoding="utf-8"?>  
<BlockList>  
  <Latest>AAAAAA==</Latest>  
  <Latest>AQAAAA==</Latest>  
  <Latest>AZAAAA==</Latest>  
</BlockList>

@kukhariev kukhariev reopened this Aug 13, 2024
@kukhariev
Owner

@akshaybogar1984,
are you asking about modifying the example in
#454 (comment) to use the blocklist approach?

@kukhariev
Owner

Working example; it keeps the list of blocks in memory:

import { Uploader } from 'ngx-uploadx';

/**
 *  Azure Blob Storage support
 * @example
 *   options: UploadxOptions = {
 *     allowedTypes: 'image/*,video/*',
 *     chunkSize: 100 * 1024 * 1024,
 *     endpoint: `[signedURL]`,
 *     uploaderClass: BlobUploader
 *   };
 */
export class BlobUploader extends Uploader {
  blockList: string[] = [];
  override async getFileUrl(): Promise<string> {
    const oUrl = new URL(this.endpoint);
    oUrl.pathname = [oUrl.pathname, this.file.name].join('/');
    const url = oUrl.toString();
    return url;
  }

  override async sendFileContent(): Promise<number | undefined> {
    const { body, start, end } = this.getChunk();
    const bid = this.uploadId + String(start).padStart(15, '0');
    const blockId = btoa(bid);
    const blockUrl = this.url + `&comp=block&blockid=${encodeURIComponent(blockId)}`;
    await this.request({ method: 'PUT', headers: commonHeaders(), url: blockUrl, body });
    this.blockList.push(blockId);
    if (end === this.size) {
      await this.finish();
    }
    return this.responseStatus > 201 ? start : end;
  }

  async finish() {
    const blocks = this.blockList.map(blockId => '<Latest>' + blockId + '</Latest>').join();
    const body = `<?xml version="1.0" encoding="utf-8"?><BlockList>${blocks}</BlockList>`;
    const url = this.url + `&comp=blocklist`;
    const headers = { ...commonHeaders(), 'Content-Type': 'text/xml; charset=UTF-8' };
    await this.request({ method: 'PUT', headers, url, body });
  }

  override abort(): void {} // FIXME: Azurite does not support blob upload interrupts?!

  override async getOffset(): Promise<number | undefined> {
    const url = this.url + `&comp=blocklist&blocklisttype=all`;
    const headers = commonHeaders();
    try {
      await this.request({ headers, url });
      console.log(this.response);
      // TODO: parse blocklist
    } catch {}
    return this.offset || 0;
  }
}

function commonHeaders(apiVersion = '2022-11-02') {
  return {
    'x-ms-version': apiVersion,
    'x-ms-date': new Date().toISOString()
  };
}

@akshaybogar1984
Author

akshaybogar1984 commented Aug 13, 2024

@kukhariev what is the maximum chunkSize, and can we have a concurrent-thread concept (concurrency: 2, // maximum number of parallel transfer workers)?

chunkSize: 100 * 1024 * 1024,

Also, from chunk-T6SYERLG.js?v=748897e8:69:

   GET https://***.blob.core.windows.net/**/MDR%20679949-2024-06-21%2009-05.xml?sv=****&comp=blocklist&blocklisttype=all

exception:

BlobNotFound
The specified blob does not exist.
RequestId:29724631-701e-0042-20a2-edaa7c000000
Time:2024-08-13T17:00:09.1040664Z

override async getOffset(): Promise<number | undefined> {
  const url = this.url + `&comp=blocklist&blocklisttype=all`;
  const headers = commonHeaders();
  try {
    await this.request({ headers, url });
    console.log(this.response);
    // TODO: parse blocklist
  } catch {}
  return this.offset || 0;
}

@kukhariev
Owner

kukhariev commented Aug 13, 2024

The dynamic chunk size is unlimited by default; use the maxChunkSize option to limit it.
concurrency is the maximum number of simultaneously uploaded files, not chunks.

BlobNotFound is fine. It is part of the upload resume logic.

import { Uploader } from 'ngx-uploadx';

/**
 *  Azure Blob Storage support
 * @example
 *   options: UploadxOptions = {
 *     allowedTypes: 'image/*,video/*',
 *     maxChunkSize: 512 * 1024 * 1024,
 *     endpoint: `[signedURL]`,
 *     uploaderClass: BlobUploader
 *   };
 */
export class BlobUploader extends Uploader {
  blockList: string[] = [];
  override async getFileUrl(): Promise<string> {
    const oUrl = new URL(this.endpoint);
    oUrl.pathname = [oUrl.pathname, this.file.name].join('/');
    const url = oUrl.toString();
    return url;
  }

  override async sendFileContent(): Promise<number | undefined> {
    const { body, start, end } = this.getChunk();
    const blockId = btoa(this.uploadId + String(start).padStart(15, '0'));
    const url = this.url + `&comp=block&blockid=${encodeURIComponent(blockId)}`;
    const headers = commonHeaders();
    await this.request({ method: 'PUT', headers, url, body });
    this.blockList.push(blockId);
    if (end === this.size) {
      await this.finish();
    }
    return this.responseStatus > 201 ? start : end;
  }

  async finish() {
    const blocks = this.blockList.map(blockId => '<Latest>' + blockId + '</Latest>').join();
    const body = `<?xml version="1.0" encoding="utf-8"?><BlockList>${blocks}</BlockList>`;
    const url = this.url + `&comp=blocklist`;
    const headers = { ...commonHeaders(), 'Content-Type': 'text/xml; charset=UTF-8' };
    await this.request({ method: 'PUT', headers, url, body });
    return this.size;
  }

  override abort(): void {} // FIXME: Azurite does not support blob upload interrupts?!

  override async getOffset(): Promise<number | undefined> {
    const url = this.url + `&comp=blocklist&blocklisttype=all`;
    const headers = commonHeaders();
    try {
      await this.request({ headers, url });
      const parser = new DOMParser();
      const xmlDoc = parser.parseFromString(this.response, 'text/xml');
      const blocks = xmlDoc
        .getElementsByTagName('UncommittedBlocks')[0]
        .getElementsByTagName('Block');
      const sizes = Array.from(blocks).map(
        el => +(el.getElementsByTagName('Size')[0]?.textContent ?? '0')
      );
      return sizes.reduce((acc, v) => acc + v, 0);
    } catch {}
    return this.offset || 0;
  }
}

function commonHeaders(apiVersion = '2022-11-02') {
  return {
    'x-ms-version': apiVersion,
    'x-ms-date': new Date().toISOString()
  };
}

@akshaybogar1984
Author

@kukhariev I have tried files between 10 MB and 500 MB; the request gets stuck and doesn't process at all.

@kukhariev
Owner

@ARosentiehl24,
you can clone https://github.com/kukhariev/ngx-uploadx/tree/blob-exp,
edit the signed URL of the container:

sasUrl:
'http://127.0.0.1:10000/devstoreaccount1/container1?sv=2018-03-28&spr=https%2Chttp&st=2024-07-25T10%3A05%3A20Z&se=2028-01-26T10%3A05%3A00Z&sr=c&sp=racwdl&sig=0SPSW3G1X5qrdGrIIfaCj%2ByLrRl6g5EfjdG7aaQK7rA%3D'

npm run serve:dev

and navigate to http://localhost:4200/service-code-way

@kukhariev
Owner

Closing due to inactivity. Feel free to re-open if your issue isn't resolved.

@AlbertoRosentiehlTR

AlbertoRosentiehlTR commented Oct 28, 2024

Hello @kukhariev, me again. I'm trying to use the BlobUploader, testing with a 1 GB file, and I got this error at the end:

[screenshot]

Do you have any idea?

I'm using the code you posted above (the BlobUploader example with maxChunkSize).

Another thing is that when it starts, for some reason it tries to GET the file that is supposed to be uploaded:

[screenshot]

And the cancel operation fails too (well, the DELETE call):

[screenshot]

And finally, when the process fails for a file, that file can't be uploaded again for some reason; it stays somewhere, because it shows this. It's the same 1 GB file, but there's no way the file can be uploaded that fast, lol:

[screenshot]

@kukhariev kukhariev reopened this Oct 28, 2024
@kukhariev
Copy link
Owner

kukhariev commented Oct 28, 2024

> Do you have any idea?

Check if the body is correct XML:

  async finish() {
    const blocks = this.blockList.map(blockId => '<Latest>' + blockId + '</Latest>').join();
    const body = `<?xml version="1.0" encoding="utf-8"?><BlockList>${blocks}</BlockList>`;
    console.log(body);

> Another thing is that when it starts, for some reason it tries to GET the file that is supposed to be uploaded

It is fine. It is part of the upload resume logic.

> And the cancel operation fails too (well, the DELETE call)

Maybe your shared key doesn't contain the 'Delete' permission.

> And finally, when the process fails for a file, that file can't be uploaded again for some reason

It's okay. It's a resumable upload.

P.S. If you don't need to support resuming an interrupted upload:

  override async getOffset(): Promise<number | undefined> {
    return this.offset || 0;
  }

@ARosentiehl24

ARosentiehl24 commented Oct 28, 2024

This is the body:

[screenshot]

Do you see something wrong?

finish <?xml version="1.0" encoding="utf-8"?><BlockList><Latest>MTRiODU4NmQ1MDA3ZjA0YTAwMDAwMDEwNDg1NzYwMA==</Latest>,<Latest>MTRiODU4NmQ1MDA3ZjA0YTAwMDAwMDEwNTkwNjE3Ng==</Latest>,<Latest>MTRiODU4NmQ1MDA3ZjA0YTAwMDAwMDEwODAwMzMyOA==</Latest>,<Latest>MTRiODU4NmQ1MDA3ZjA0YTAwMDAwMDExMjE5NzYzMg==</Latest>,<Latest>MTRiODU4NmQ1MDA3ZjA0YTAwMDAwMDEyMDU4NjI0MA==</Latest>,<Latest>MTRiODU4NmQ1MDA3ZjA0YTAwMDAwMDEzNzM2MzQ1Ng==</Latest>,<Latest>MTRiODU4NmQ1MDA3ZjA0YTAwMDAwMDE3MDkxNzg4OA==</Latest>,<Latest>MTRiODU4NmQ1MDA3ZjA0YTAwMDAwMDI3NTc3NTQ4OA==</Latest>,<Latest>MTRiODU4NmQ1MDA3ZjA0YTAwMDAwMDM4MDYzMzA4OA==</Latest>,<Latest>MTRiODU4NmQ1MDA3ZjA0YTAwMDAwMDQ4NTQ5MDY4OA==</Latest>,<Latest>MTRiODU4NmQ1MDA3ZjA0YTAwMDAwMDU5MDM0ODI4OA==</Latest>,<Latest>MTRiODU4NmQ1MDA3ZjA0YTAwMDAwMDY5NTIwNTg4OA==</Latest>,<Latest>MTRiODU4NmQ1MDA3ZjA0YTAwMDAwMDgwMDA2MzQ4OA==</Latest>,<Latest>MTRiODU4NmQ1MDA3ZjA0YTAwMDAwMDkwNDkyMTA4OA==</Latest>,<Latest>MTRiODU4NmQ1MDA3ZjA0YTAwMDAwMTAwOTc3ODY4OA==</Latest>,<Latest>MTRiODU4NmQ1MDA3ZjA0YTAwMDAwMTExNDYzNjI4OA==</Latest></BlockList>

I have added the Delete permission and it's working, thanks. As for the rest of the topics, I'm OK with that, but I'd like to keep the resumable logic; it's just that I don't know where that file is located then...

I mean, the file seems to be created, but when?

[screenshot]

Shouldn't it call the finish action? I'm not sure, but it seems the commit step is missing for these resumable uploads.

@kukhariev
Owner

> Do you see something wrong?

Possible commas; try join(''):

const blocks = this.blockList.map(blockId => '<Latest>' + blockId + '</Latest>').join('');

@ARosentiehl24

ARosentiehl24 commented Oct 28, 2024

Thanks, it's working now. But any ideas about the file that failed? Because I can't upload the file again. How can I get resumable uploading to work, i.e. to complete the 1 GB file that failed first?

@kukhariev
Owner

Possible improvement:

override async getOffset(): Promise<number | undefined> {
    const url = this.url + `&comp=blocklist&blocklisttype=all`;
    const headers = commonHeaders();
    try {
      await this.request({ headers, url });
      const parser = new DOMParser();
      const xmlDoc = parser.parseFromString(this.response, 'text/xml');
      const blocks = xmlDoc
        .getElementsByTagName('UncommittedBlocks')[0]
        .getElementsByTagName('Block');
      const sizes = Array.from(blocks).map(
        el => +(el.getElementsByTagName('Size')[0]?.textContent ?? '0')
      );
      const offset = sizes.reduce((acc, v) => acc + v, 0);
      if (offset === this.size) {
        await this.finish();
      }
      return offset;
    } catch {}
    return this.offset || 0;
  }

@ARosentiehl24

I'll try it and let you know if I have any other issues, thanks!
