chuked file on web api #454
Comments
The backend must respond to the POST request with the chunk upload URL, as described in the requests overview: https://github.com/kukhariev/node-uploadx/blob/master/proto.md#requests-overview
My requirement is to upload a large file (up to 20 GB) in chunks to blob storage.
Could you help me get this working on my machine?
You can add:

```ts
import { Component } from '@angular/core';
import { RouterOutlet } from '@angular/router';
import { Tus, UploadState, UploadxDirective, UploadxOptions } from 'ngx-uploadx';

@Component({
  selector: 'app-upload-component',
  standalone: true,
  imports: [RouterOutlet, UploadxDirective],
  template: `
    <input type="file" [uploadx]="options" (state)="onUpload($event)" />
  `
})
export class AppUploadComponent {
  options: UploadxOptions = {
    endpoint: '[URL]',
    uploaderClass: Tus
  };

  onUpload(state: UploadState) {
    console.log(state);
  }
}
```
Thanks. But I am using Angular on the front end and a C# Web API on the back end. So I would like to know: when Angular pushes chunks to the server, how would the Web API receive those chunks in order to upload them to blob storage?
Sorry, I have almost no knowledge of either C# or blob storage and don't have access to Azure :( While I'm on vacation, I'm going to try to work towards adding blob storage support. Is there currently a working server? For example:

```csharp
[HttpPost]
public async Task<IActionResult> Upload()
{
    // validation
    // generate uid from JSON payload metadata
    // construct the UploadChunk URL from the uid
    // store metadata
    // create a new blob
    // send the UploadChunk URL in the Location header
    // return 201 Created
}

[HttpPut("{uid}")]
public async Task<IActionResult> UploadChunk(string uid)
{
    // validation
    // append chunk to Blob Storage
    // update metadata
    // send the Range header and status 308, or 200 if it's the last chunk of the file
}
```
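The 308/Range handshake sketched above implies simple offset arithmetic on the client side: the server reports the last stored byte, and the next chunk starts right after it. A minimal sketch of that arithmetic, assuming the uploadx-style `Range: bytes=0-<last byte>` header; the helper name `nextOffsetFromRange` is illustrative, not part of ngx-uploadx:

```typescript
// Hypothetical helper: derive the next chunk offset from an uploadx-style
// "Range: bytes=0-<last byte>" response header (308 Resume Incomplete).
function nextOffsetFromRange(rangeHeader: string | null): number {
  if (!rangeHeader) return 0; // no bytes stored yet
  const match = /bytes=0-(\d+)/.exec(rangeHeader);
  // The header reports the last stored byte, so the next chunk starts after it.
  return match ? Number(match[1]) + 1 : 0;
}

// Example: the server has persisted the first 4 MiB of the file.
console.log(nextOffsetFromRange('bytes=0-4194303')); // 4194304
```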
I have a server (Web API, C#), but I am unable to upload using ngx-uploadx.
@akshaybogar1984,

```ts
import { Uploader } from 'ngx-uploadx';

const env = {
  sasToken:
    'sv=2023-08-03&ss=bfqt&srt=sco&se=2030-12-12T08%3A11%3A41Z&sp=rwdlacup&sig=%2F9bU8%2FkdnDe2bVMRSVZWGJc5QlR58nvCEeCEfOXzjA0%3D',
  containerURL: 'http://127.0.0.1:10000/devstoreaccount1/container1'
};

/**
 * Azure Blob Storage support
 * @example
 * options: UploadxOptions = {
 *   allowedTypes: 'image/*,video/*',
 *   chunkSize: 4 * 1024 * 1024,
 *   endpoint: `[containerURL]`,
 *   uploaderClass: BlobUploader
 * };
 */
export class BlobUploader extends Uploader {
  override async getFileUrl(): Promise<string> {
    const headers = {
      'x-ms-version': '2022-11-02',
      'x-ms-date': getISODate(),
      'x-ms-blob-type': 'AppendBlob'
    };
    const url = `${env.containerURL}/${this.file.name}?${env.sasToken}`;
    await this.request({ method: 'PUT', url, headers });
    return url;
  }

  override async sendFileContent(): Promise<number | undefined> {
    const { body, start, end } = this.getChunk();
    const url = `${this.url}&comp=appendblock`;
    const headers = {
      'x-ms-version': '2022-11-02',
      'x-ms-date': getISODate(),
      'x-ms-blob-condition-appendpos': start,
      'x-ms-blob-condition-maxsize': this.size
    };
    await this.request({ method: 'PUT', body, headers, url });
    return this.responseStatus > 201 ? start : end;
  }

  override abort(): void {} // Azurite does not support blob upload interrupts?!

  override async getOffset(): Promise<number | undefined> {
    const headers = {
      'x-ms-version': '2022-11-02',
      'x-ms-date': getISODate()
    };
    await this.request({ method: 'HEAD', headers, url: this.url });
    if (this.responseStatus === 200) {
      return Number(this.responseHeaders['content-length']) || 0;
    }
    this.url = '';
    return this.offset || 0;
  }
}

function getISODate() {
  return new Date().toISOString();
}
```
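A side note on the conditional headers used in the uploader above (my note, not from the thread): `x-ms-blob-condition-appendpos` makes Azure reject an append whose offset does not match the blob's current length, which is what makes chunk retries safe. A minimal sketch of building those headers; the function name `appendHeaders` is a hypothetical stand-in:

```typescript
// Hypothetical sketch: conditional headers for an Append Block request.
// Azure fails the append with 412 if the blob's current length differs from
// `appendpos`, so a duplicated or out-of-order chunk cannot corrupt the blob.
function appendHeaders(start: number, fileSize: number): Record<string, string> {
  return {
    'x-ms-version': '2022-11-02',
    'x-ms-date': new Date().toISOString(),
    'x-ms-blob-condition-appendpos': String(start), // expected current blob length
    'x-ms-blob-condition-maxsize': String(fileSize) // cap on the final blob size
  };
}

const h = appendHeaders(4194304, 10485760);
console.log(h['x-ms-blob-condition-appendpos']); // "4194304"
```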
Hi @akshaybogar1984, were you able to get it working with a Web API in C#? I am facing the same problem as you. My stack is the same (Angular and .NET), and I have hybrid storage, so I will not only upload files to Blob Storage; I can also use local storage. That's why the @kukhariev approach won't work for me.
I've found something. I'm doing this with `[HttpPut("{uid}")]`, and it looks something like this. But for some reason I'm always getting this issue, which is weird because I am sending the URL through the header.
@ARosentiehl24,

```csharp
Response.Headers["Location".ToLower()] = "https://localhost:7081" + uploadChunkUrl;
Response.Headers["Access-Control-Expose-Headers".ToLower()] = "Location,Range";
return StatusCode((int)HttpStatusCode.Created);
```

Also
Looks good, but missing
Sure, thanks! Works like a charm.
Thanks @ARosentiehl24, let me give it a try today.
Can we upload a 10 GB file to blob storage?
@ARosentiehl24 Please share how your server uploads to blob storage. Thanks.
Hello @akshaybogar1984, sure. My approach is the following: I'm using the BlockBlobClient class. You need to store the block IDs somewhere during the upload process; in my case I'm using SQL Server. I'm also storing the size, because that's how I know when the upload is complete, by comparing some values. At the end of the upload process you can do something like this: you must send the list of the IDs to make the commit, and with that you can upload heavy files in blocks. Hope it helps. Greetings.
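The commit step described above boils down to sending an XML list of block IDs. A minimal client-side sketch of building that body; the helper name `buildBlockListBody` is hypothetical, but the body shape follows Azure's Put Block List operation:

```typescript
// Hypothetical sketch of the Put Block List request body: every uploaded
// block ID must be committed, each wrapped in a <Latest> element.
function buildBlockListBody(blockIds: string[]): string {
  // join('') — a bare join() would insert commas and produce invalid XML
  const blocks = blockIds.map(id => `<Latest>${id}</Latest>`).join('');
  return `<?xml version="1.0" encoding="utf-8"?><BlockList>${blocks}</BlockList>`;
}

const body = buildBlockListBody(['QUJDMDAx', 'QUJDMDAy']);
console.log(body.includes('</Latest><Latest>')); // true: no separators between entries
```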
@ARosentiehl24 thanks a lot. What is the maximum size you are trying to upload? In my case it is 10 GB to 20 GB.
@kukhariev can we have the blockid and blocklist approach, so that I can upload 10 GB in a faster way?
@akshaybogar1984, here is a working example; it keeps the list of blocks in memory:

```ts
import { Uploader } from 'ngx-uploadx';

/**
 * Azure Blob Storage support
 * @example
 * options: UploadxOptions = {
 *   allowedTypes: 'image/*,video/*',
 *   chunkSize: 100 * 1024 * 1024,
 *   endpoint: `[signedURL]`,
 *   uploaderClass: BlobUploader
 * };
 */
export class BlobUploader extends Uploader {
  blockList: string[] = [];

  override async getFileUrl(): Promise<string> {
    const oUrl = new URL(this.endpoint);
    oUrl.pathname = [oUrl.pathname, this.file.name].join('/');
    return oUrl.toString();
  }

  override async sendFileContent(): Promise<number | undefined> {
    const { body, start, end } = this.getChunk();
    const bid = this.uploadId + String(start).padStart(15, '0');
    const blockId = btoa(bid);
    const blockUrl = this.url + `&comp=block&blockid=${encodeURIComponent(blockId)}`;
    await this.request({ method: 'PUT', headers: commonHeaders(), url: blockUrl, body });
    this.blockList.push(blockId);
    if (end === this.size) {
      await this.finish();
    }
    return this.responseStatus > 201 ? start : end;
  }

  async finish() {
    const blocks = this.blockList.map(blockId => '<Latest>' + blockId + '</Latest>').join();
    const body = `<?xml version="1.0" encoding="utf-8"?><BlockList>${blocks}</BlockList>`;
    const url = this.url + `&comp=blocklist`;
    const headers = { ...commonHeaders(), 'Content-Type': 'text/xml; charset=UTF-8' };
    await this.request({ method: 'PUT', headers, url, body });
  }

  override abort(): void {} // FIXME: Azurite does not support blob upload interrupts?!

  override async getOffset(): Promise<number | undefined> {
    const url = this.url + `&comp=blocklist&blocklisttype=all`;
    const headers = commonHeaders();
    try {
      await this.request({ headers, url });
      console.log(this.response);
      // TODO: parse blocklist
    } catch {}
    return this.offset || 0;
  }
}

function commonHeaders(apiVersion = '2022-11-02') {
  return {
    'x-ms-version': apiVersion,
    'x-ms-date': new Date().toISOString()
  };
}
```
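One detail worth calling out in the `sendFileContent` above (my note, not from the thread): Azure requires every block ID in a blob to have the same length before base64 encoding, which is why the chunk's start offset is zero-padded to a fixed width. A standalone sketch of that ID scheme, with a hypothetical `makeBlockId` helper:

```typescript
// Hypothetical sketch: fixed-width block IDs. Azure rejects block IDs of
// differing (pre-encoding) lengths within one blob, so the chunk start
// offset is left-padded to a constant width before base64 encoding.
function makeBlockId(uploadId: string, start: number): string {
  return btoa(uploadId + String(start).padStart(15, '0'));
}

const first = makeBlockId('upload1', 0);
const later = makeBlockId('upload1', 104857600);
console.log(first.length === later.length); // true: same width for every chunk
```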
@kukhariev what is the maximum chunk size? Can we use the concurrency option (`concurrency: 2`, the maximum number of parallel transfer workers) together with `chunkSize: 100 * 1024 * 1024`? Also, I get an exception from `getOffset()`.
The dynamic chunk size is unlimited by default; use `maxChunkSize`:

```ts
import { Uploader } from 'ngx-uploadx';

/**
 * Azure Blob Storage support
 * @example
 * options: UploadxOptions = {
 *   allowedTypes: 'image/*,video/*',
 *   maxChunkSize: 512 * 1024 * 1024,
 *   endpoint: `[signedURL]`,
 *   uploaderClass: BlobUploader
 * };
 */
export class BlobUploader extends Uploader {
  blockList: string[] = [];

  override async getFileUrl(): Promise<string> {
    const oUrl = new URL(this.endpoint);
    oUrl.pathname = [oUrl.pathname, this.file.name].join('/');
    return oUrl.toString();
  }

  override async sendFileContent(): Promise<number | undefined> {
    const { body, start, end } = this.getChunk();
    const blockId = btoa(this.uploadId + String(start).padStart(15, '0'));
    const url = this.url + `&comp=block&blockid=${encodeURIComponent(blockId)}`;
    const headers = commonHeaders();
    await this.request({ method: 'PUT', headers, url, body });
    this.blockList.push(blockId);
    if (end === this.size) {
      await this.finish();
    }
    return this.responseStatus > 201 ? start : end;
  }

  async finish() {
    const blocks = this.blockList.map(blockId => '<Latest>' + blockId + '</Latest>').join();
    const body = `<?xml version="1.0" encoding="utf-8"?><BlockList>${blocks}</BlockList>`;
    const url = this.url + `&comp=blocklist`;
    const headers = { ...commonHeaders(), 'Content-Type': 'text/xml; charset=UTF-8' };
    await this.request({ method: 'PUT', headers, url, body });
    return this.size;
  }

  override abort(): void {} // FIXME: Azurite does not support blob upload interrupts?!

  override async getOffset(): Promise<number | undefined> {
    const url = this.url + `&comp=blocklist&blocklisttype=all`;
    const headers = commonHeaders();
    try {
      await this.request({ headers, url });
      const parser = new DOMParser();
      const xmlDoc = parser.parseFromString(this.response, 'text/xml');
      const blocks = xmlDoc
        .getElementsByTagName('UncommittedBlocks')[0]
        .getElementsByTagName('Block');
      const sizes = Array.from(blocks).map(
        el => +(el.getElementsByTagName('Size')[0]?.textContent ?? '0')
      );
      return sizes.reduce((acc, v) => acc + v, 0);
    } catch {}
    return this.offset || 0;
  }
}

function commonHeaders(apiVersion = '2022-11-02') {
  return {
    'x-ms-version': apiVersion,
    'x-ms-date': new Date().toISOString()
  };
}
```
@kukhariev I have tried files between 10 MB and 500 MB. The request gets stuck and doesn't process at all.
@ARosentiehl24, run `npm run serve:dev` and navigate to http://localhost:4200/service-code-way
Closing due to inactivity. Feel free to re-open if your issue isn't resolved. |
Hello @kukhariev, me again. I'm trying to use the BlobUploader, testing with a 1 GB file, and I got this error at the end. Do you have any idea? I'm using the code you put here.
Another thing: when it starts, for some reason it tries to get the file that is supposed to be uploaded, and the cancel operation fails too (the delete call). And finally, when the process fails for a file, that file can't be uploaded again; it stays somewhere, because it shows this: it's the same 1 GB file, but there's no way the file can be uploaded that fast lol.
Check if the body is correct XML:

```ts
async finish() {
  const blocks = this.blockList.map(blockId => '<Latest>' + blockId + '</Latest>').join();
  const body = `<?xml version="1.0" encoding="utf-8"?><BlockList>${blocks}</BlockList>`;
  console.log(body);
```

That is fine. It is part of the upload resume logic.
Maybe your sharedKey doesn't contain the 'Delete' permission.
It's okay, it's a resumable upload. P.S. If you don't need to support resuming an interrupted upload:

```ts
override async getOffset(): Promise<number | undefined> {
  return this.offset || 0;
}
```
This is the body. Do you see something wrong?
I have added the Delete permission and it's working, thanks. As for the rest of the topics, I'm OK with that, but I'd like to keep the resumable logic. It's just that I don't know where that file is located then... I mean, the file seems to be created, but when? Shouldn't it call the finish action? I'm not sure, but it seems the commit step is missing for these resumable uploads.
Possibly commas; try:

```ts
const blocks = this.blockList.map(blockId => '<Latest>' + blockId + '</Latest>').join('');
```
Thanks, it's working now. But any ideas about the file that failed? I can't upload that file again. How can I get resumable uploading to work, i.e. complete the 1 GB file that failed first?
A possible improvement:

```ts
override async getOffset(): Promise<number | undefined> {
  const url = this.url + `&comp=blocklist&blocklisttype=all`;
  const headers = commonHeaders();
  try {
    await this.request({ headers, url });
    const parser = new DOMParser();
    const xmlDoc = parser.parseFromString(this.response, 'text/xml');
    const blocks = xmlDoc
      .getElementsByTagName('UncommittedBlocks')[0]
      .getElementsByTagName('Block');
    const sizes = Array.from(blocks).map(
      el => +(el.getElementsByTagName('Size')[0]?.textContent ?? '0')
    );
    const offset = sizes.reduce((acc, v) => acc + v, 0);
    if (offset === this.size) {
      await this.finish();
    }
    return offset;
  } catch {}
  return this.offset || 0;
}
```
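The resume logic above relies on the browser's `DOMParser` to total the uncommitted block sizes. The same arithmetic can be sketched in an environment-independent way; the helper `sumUncommittedSizes` is hypothetical, and the naive regex parse is for illustration only:

```typescript
// Hypothetical sketch: sum the <Size> values of uncommitted blocks from a
// Get Block List response to find the byte offset to resume from.
// A regex stands in for DOMParser so the logic runs outside the browser.
function sumUncommittedSizes(xml: string): number {
  const section = /<UncommittedBlocks>([\s\S]*?)<\/UncommittedBlocks>/.exec(xml);
  if (!section) return 0; // nothing uploaded yet
  const sizes = [...section[1].matchAll(/<Size>(\d+)<\/Size>/g)];
  return sizes.reduce((acc, m) => acc + Number(m[1]), 0);
}

const sample =
  '<BlockList><UncommittedBlocks>' +
  '<Block><Name>QQ==</Name><Size>104857600</Size></Block>' +
  '<Block><Name>Qg==</Name><Size>52428800</Size></Block>' +
  '</UncommittedBlocks></BlockList>';
console.log(sumUncommittedSizes(sample)); // 157286400
```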
I'll try it and let you know if I have any other issues, thanks!
Hi, I am getting this issue: I am unable to post the document to the Web API. Please help me.
I can see the payload, but where can I get the chunked file?
payload:
`{"name":"10_MB.MP4","mimeType":"video/mp4","size":10485760,"lastModified":1720538165694}`
endpoint: `https://localhost:7177/api/WeatherForecast/Upload1?uploadType=uploadx`