Upload Large File bug #762
in components\files\action-bar.js, OnFilesChange reads the entire file first and then does things with it. This causes huge memory consumption when uploading a large file: in Chrome it crashes the tab, and in Firefox it almost crashes my whole system. Any suggestions? Thanks.

Comments
Weird thing is, the online IPFS release using the same version doesn't have the crash problem, although it still consumes a lot of memory.
I tested with a 40MB file and a 700MB file: 40MB works fine and 700MB crashes. The difference might come down to the amount of total memory.
Hello @gitGksgk! The current version of WebUI is going to be replaced soon. Could you try and see if the error persists on the revamp branch? We know there are some issues with large files.
Sure, many thanks for your reply. Unfortunately the error persists. I suspect it's because […]
It turns out that it's probably because of some kind of memory leak. In Chrome dev tools, a lot of 1MB JSArrayBufferData allocations are observed while uploading. The last traceable call is in node_modules\pull-file-reader\index.js. It seems to be designed to send while reading the uploaded file, but the buffers persist, so memory consumption grows larger and larger until the tab crashes.
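For context on that trace, here is a minimal sketch of how a DOM File is typically consumed through pull-file-reader as a pull-stream source. The chunkSize option and the drain sink are illustrative assumptions, not WebUI's actual wiring:

```js
// Sketch: consuming a DOM File via pull-file-reader.
// pull-file-reader turns a File into a pull-stream source of Buffer chunks;
// the leak described above means those chunks stay referenced instead of
// being freed once they are sent.
const pull = require('pull-stream')
const fileReader = require('pull-file-reader')

function logFileChunks (file) { // `file` is a DOM File from an <input>
  pull(
    fileReader(file, { chunkSize: 1024 * 1024 }), // assumed option; matches the 1MB buffers seen in dev tools
    pull.drain(
      (chunk) => console.log('read chunk of', chunk.length, 'bytes'),
      (err) => console.log('done', err || '')
    )
  )
}
```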
@gitGksgk I was searching and this seems to be a known issue. Not sure if it's the same one, but I'll take a look at WebUI's code to see if something might be missing. Although, the most probable thing is that this is the same issue as ipfs-inactive/js-ipfs-http-client#654.
Hi @hacdias, thanks for your reply. Yeah, I've come across that issue before. I don't know if this project uses some part of the js-ipfs-api component implicitly, but I can't find http-api in this project, which is what that issue modified to fix its uploading problem.
@gitGksgk we're using js-ipfs-api.
Hi @hacdias, thanks for your reply. Further investigation shows that in src\bundles\files.js the call

```js
const res = await ipfs.add(file.content, {
  pin: false,
  // eslint-disable-next-line
  progress: (bytes) => {
    sent = sent + bytes - alreadySent
    alreadySent = bytes
    updateProgress(sent / totalSize * 100)
  }
})
```

reads the whole file into a buffer first, and only then sends it to IPFS over a stream. Thus the progress callback won't fire until the file has been read completely. This is the same way the old version does it.
@gitGksgk basically you're saying that if we removed the progress, it would not consume so much memory?
Nope, I am saying that the memory consumption happens before the progress starts. It's file.content that causes the consumption. In src\bundles\files.js there's a function named filesToStreams, defined in src\lib\files.js; however, it doesn't convert files to streams. Rather than splitting a large file into small pieces, it just splits a possibly-multiple-file upload into single files...
file.content is a function that reads the whole file content into memory. The read buffer is not piped to the sending code. That's why it consumes so much memory.
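To make the contrast concrete, here is a hypothetical sketch (all names invented) of reading a File wholesale, as described above, versus slicing it into chunks so only one piece is resident at a time:

```js
// (a) Buffered: the entire File is materialized in memory before anything is sent.
function readWholeFile (file) {
  return new Promise((resolve, reject) => {
    const reader = new FileReader()
    reader.onload = () => resolve(new Uint8Array(reader.result)) // one buffer the size of the file
    reader.onerror = () => reject(reader.error)
    reader.readAsArrayBuffer(file)
  })
}

// (b) Streamed: only one fixed-size slice is in memory at a time.
async function * readFileInChunks (file, chunkSize = 1024 * 1024) {
  for (let offset = 0; offset < file.size; offset += chunkSize) {
    const chunk = await readWholeFile(file.slice(offset, offset + chunkSize))
    yield chunk // caller sends this chunk, then it can be garbage-collected
  }
}
```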
@gitGksgk so you're saying the issue is on […]?
@hacdias My understanding is that this additional iterating over […] is not needed. Instead of multiple calls to:

```js
const res = await ipfs.add(file.content, {
```

you should pass streams in a single call:

```js
const res = await ipfs.add(streams, {
```

A similar use case in Companion uses a single call: https://github.com/ipfs-shipyard/ipfs-companion/blob/b149f99425067affec4eede838b51f598e5fc373/add-on/src/popup/quick-upload.js#L84

I think multiple calls to ipfs.add […]. In WebUI's case you probably want to unwrap files after upload, so after a single ipfs.add […].

Hope this helps :)
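A minimal sketch of that single-call shape, assuming `streams` is an array of `{ path, content }` entries and reusing the progress wiring from earlier (illustrative, not the actual fix):

```js
// One ipfs.add call for the whole batch instead of one call per file.
// `streams` is assumed to look like [{ path: 'file.txt', content: <stream> }, ...]
const res = await ipfs.add(streams, {
  pin: false,
  progress: (bytes) => updateProgress(bytes / totalSize * 100)
})
// `res` is an array with one entry per file (plus any wrapping directories),
// so the UI can pick out just the entries it wants to display.
```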
@lidel but then, wouldn't we lose directory structure? Directories inside directories?
@hacdias I think you should be able to pass […]
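For example (paths invented), nesting would survive a single call because each entry carries its full relative path:

```js
// Directory structure is encoded in the path field of each entry,
// so directories inside directories come back as their own hashes.
const entries = [
  { path: 'album/2018/a.jpg', content: streamA },
  { path: 'album/2018/b.jpg', content: streamB },
  { path: 'album/notes.txt', content: streamC }
]
const res = await ipfs.add(entries, { pin: false })
// res contains entries for 'album', 'album/2018', and each file.
```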
@lidel Yes, there are multiple function calls, but changing it to […]. Besides, when uploading a 1.6GB file the crash doesn't happen immediately after the file is read into the buffer, though memory usage grows very high. The crash happens in […].
@gitGksgk could you make a simple script, with only the required modules, that makes it crash, please?
@lidel I'll take a look at simplifying the upload too!
@hacdias I'm glad to, but I don't quite understand. The module has a lot of dependencies. How can i use a simple script to reproduce the crash? |
@gitGksgk actually it might not be needed. I was taking a look at pull-file-reader, and it seems to read the whole file and then split it into chunks. But then, on js-ipfs-api, there are also some related issues: ipfs-inactive/js-ipfs-http-client#654 /cc @alanshaw
At the time that module was written I don't think there was a way to stream chunks from FileReader - I don't know if there is a way now. It would be great to get it upgraded to do real streaming. Similarly for the upload bug - if someone could look into it and submit a fix that would be amazing. I would merge that so fast ;)
Quick summary for drive-by visitors: @hacdias fixed the "multiple calls to ipfs.add" part. Let's keep this open until https://github.com/ipfs/js-ipfs-api/issues/842 is addressed upstream.
@alanshaw I don't know if it helps, but there's a filereader-stream module. I've tested it: the reading process doesn't seem to be memory hungry (though not successful enough, of course, otherwise a PR would have been sent...), while the […]
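In that spirit, a sketch of what reading through filereader-stream looks like (the chunkSize option and the counters are assumptions):

```js
// filereader-stream exposes a DOM File as a standard readable stream,
// emitting chunks as they are read instead of buffering the whole file.
const fileReaderStream = require('filereader-stream')

function streamFile (file) { // `file` is a DOM File; counters are illustrative
  let total = 0
  fileReaderStream(file, { chunkSize: 1024 * 1024 }) // chunk size is an assumed option
    .on('data', (chunk) => { total += chunk.length })
    .on('error', (err) => console.error('read failed', err))
    .on('end', () => console.log('streamed', total, 'of', file.size, 'bytes'))
}
```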
@gitGksgk we've found out that the issue is not with […]
Note to self: this needs revisiting after we switch webui to the latest js-ipfs-http-client (with async iterators API)
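For reference, a sketch of the same upload against the async-iterators client; `ipfs.addAll` and the `(bytes, path)` progress signature follow the newer js-ipfs-http-client, so treat the exact shapes as assumptions:

```js
// With the async-iterator API, results stream back one entry at a time and
// the client consumes file content incrementally instead of buffering it all.
for await (const added of ipfs.addAll(entries, {
  pin: false,
  progress: (bytes, path) => updateProgress(bytes, path)
})) {
  console.log('added', added.path, added.cid.toString())
}
```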
Related to #1534
I believe this was fixed in #1534 |