feat: parallel chunkedUpload in web and node #1146
```diff
@@ -67,6 +67,7 @@ function getUserAgent() {

 class Client {
     static CHUNK_SIZE = 1024 * 1024 * 5;
+    static MAX_CONCURRENCY = 6;

     config = {
         endpoint: '{{ spec.endpoint }}',
```
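The new MAX_CONCURRENCY constant only matters if chunk uploads are actually issued with bounded parallelism. A minimal sketch of one way such a cap can be enforced (the `runLimited` helper is illustrative, not the SDK's actual implementation):

```javascript
// Run async task factories with at most `limit` tasks in flight.
// Results are returned in the original order; any rejection fails the batch.
async function runLimited(taskFactories, limit) {
    const results = new Array(taskFactories.length);
    let next = 0;

    async function worker() {
        // Each worker pulls the next unstarted task until none remain.
        while (next < taskFactories.length) {
            const index = next++;
            results[index] = await taskFactories[index]();
        }
    }

    const workerCount = Math.min(limit, taskFactories.length);
    await Promise.all(Array.from({ length: workerCount }, () => worker()));
    return results;
}
```

Because JavaScript only yields between `await` points, the `next++` index grab is race-free without any locking.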
```diff
@@ -211,38 +212,104 @@ class Client {
         return await this.call(method, url, headers, originalPayload);
     }

-        let start = 0;
-        let response = null;
+        const totalChunks = Math.ceil(file.size / Client.CHUNK_SIZE);

-        while (start < file.size) {
-            let end = start + Client.CHUNK_SIZE; // Prepare end for the next chunk
-            if (end >= file.size) {
-                end = file.size; // Adjust for the last chunk to include the last byte
-            }
+        const firstChunkStart = 0;
+        const firstChunkEnd = Math.min(Client.CHUNK_SIZE, file.size);
+        const firstChunk = file.slice(firstChunkStart, firstChunkEnd);

-            headers['content-range'] = `bytes ${start}-${end-1}/${file.size}`;
-            const chunk = file.slice(start, end);
+        const firstChunkHeaders = { ...headers };
+        firstChunkHeaders['content-range'] = `bytes ${firstChunkStart}-${firstChunkEnd - 1}/${file.size}`;

+        const firstPayload = { ...originalPayload };
+        firstPayload[fileParam] = new File([firstChunk], file.name);

+        const firstResponse = await this.call(method, url, firstChunkHeaders, firstPayload);

+        if (!firstResponse?.$id) {
+            throw new Error('First chunk upload failed - no ID returned');
+        }
```
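After the first chunk establishes the upload, each remaining chunk needs its own content-range header following the same `bytes start-(end-1)/total` format used above. A sketch of the boundary arithmetic (`chunkRanges` is a hypothetical helper, not part of the diff):

```javascript
// Compute half-open [start, end) byte ranges and Content-Range values
// for a payload of `totalSize` bytes split into `chunkSize` pieces.
function chunkRanges(totalSize, chunkSize) {
    const ranges = [];
    for (let start = 0; start < totalSize; start += chunkSize) {
        const end = Math.min(start + chunkSize, totalSize);
        ranges.push({
            start,
            end,
            // Content-Range uses inclusive byte positions, hence end - 1.
            contentRange: `bytes ${start}-${end - 1}/${totalSize}`,
        });
    }
    return ranges;
}
```

Note the last range is clamped to the file size, mirroring the `Math.min` in the diff, so the final chunk may be shorter than `CHUNK_SIZE`.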
Copilot AI (Aug 10, 2025): The error object is being directly concatenated to a string, which may not provide meaningful error information. Consider using `f.error.message || f.error.toString()` for better error reporting.

Suggested change:
```suggestion
const errorMessages = failures.map(f => `Chunk ${f.chunkInfo.index}: ${f.error?.message || f.error?.toString()}`);
```
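The distinction this review comment raises is easiest to see with concrete values: an Error instance interpolates usefully into a template literal, but a non-Error rejection value collapses to `[object Object]`. A small demo (the `failures` array is made up for illustration, and `JSON.stringify` is used as one possible fallback variant):

```javascript
const failures = [
    { chunkInfo: { index: 2 }, error: new Error('timeout') },
    { chunkInfo: { index: 3 }, error: { code: 500 } }, // non-Error rejection
];

// Direct interpolation: Error renders as "Error: timeout", but the
// plain object becomes "[object Object]", hiding the detail.
const naive = failures.map(f => `Chunk ${f.chunkInfo.index}: ${f.error}`);

// Prefer .message, then fall back to serializing the raw value.
const safer = failures.map(
    f => `Chunk ${f.chunkInfo.index}: ${f.error?.message || JSON.stringify(f.error)}`
);
```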
```diff
@@ -296,6 +296,7 @@ class {{spec.title | caseUcfirst}}Exception extends Error {
  */
 class Client {
     static CHUNK_SIZE = 1024 * 1024 * 5;
+    static MAX_CONCURRENCY = 6;
```
Suggested change:
```suggestion
    static CHUNK_SIZE = 1024 * 1024 * 5;

    /**
     * The maximum number of concurrent requests allowed.
     *
     * The default value of 6 is chosen as a balance between maximizing throughput
     * and minimizing server load or rate-limiting issues. Increasing this value
     * may improve performance for high-bandwidth clients or servers, but can
     * also lead to higher resource usage and potential throttling by the server.
     * Decreasing it can reduce load but may slow down operations that require
     * multiple concurrent requests (such as multipart uploads).
     */
```
Copilot AI (Aug 10, 2025): The error message 'First chunk upload failed - no ID returned' could be more helpful by including the actual response or HTTP status code to aid debugging.

Suggested change:
```suggestion
throw new Error(
    `First chunk upload failed - no ID returned. Response: ${JSON.stringify(firstResponse)}`
);
```
Copilot AI: The MAX_CONCURRENCY constant lacks documentation explaining why 6 was chosen as the default value and how it affects performance or server load.