Uploading large files crashes the browser #96
Happened to me as well. RAM usage climbs steeply and then the browser crashes.
While this issue is being fixed, are there any alternatives for uploading files larger than 35 MB?
If you're running IPython locally, or it's on a server where you have some other kind of access (ssh, ftp, nfs, Windows file sharing), you can just copy the files you want to the directory where the server is running. If your only access to the server is through IPython, I don't think there's any alternative.
Bumping to 5.0. The actual for loop that iterates over the file bytes and encodes them as base64 is what's crashing.
@Carreau It seems something like resumable.js would help with this. It allows for chunked and resumable uploads, so it should handle large files fine. This would require modifications on the server to store the chunks in a temporary directory and reassemble the file at the end.
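The server-side half of that idea (store chunks in a temp directory, reassemble at the end) could be sketched roughly as below. This is a minimal illustration, not code from any actual PR; the function names `save_chunk` and `reassemble` and the zero-padded chunk filenames are assumptions made up for this example.

```python
import os

def save_chunk(tmp_dir, upload_id, chunk_index, data):
    # Each upload gets its own temporary subdirectory, keyed by a
    # hypothetical upload id sent by the client with every chunk.
    chunk_dir = os.path.join(tmp_dir, upload_id)
    os.makedirs(chunk_dir, exist_ok=True)
    # Zero-padded names so lexicographic sort matches chunk order.
    with open(os.path.join(chunk_dir, "%08d" % chunk_index), "wb") as f:
        f.write(data)

def reassemble(tmp_dir, upload_id, target_path):
    # Concatenate the chunks in order to produce the final file.
    chunk_dir = os.path.join(tmp_dir, upload_id)
    with open(target_path, "wb") as out:
        for name in sorted(os.listdir(chunk_dir)):
            with open(os.path.join(chunk_dir, name), "rb") as f:
                out.write(f.read())
```

A real implementation would also need to clean up the temp directory and handle retried or out-of-order chunks, which is exactly what resumable.js's protocol is designed around.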
Also pinging @blink1073 on this |
We could return a particular error status code if the file is too large, and then use the HTML File API directly as a fallback. |
Well, FileReader, to be more precise. |
Nevermind, I see that you're already using the
Mitigated for 4.1 by #623, which disables upload of large files. Leaving open for 5.0, when we can actually implement chunked upload. |
We have solved the problem of loading large files by chunking. We also added a progress bar to indicate the upload progress. In our experiment we tried files with a few Gs with no problem. I would be happy to contribute the code. Please let me know what's the process. Thank you! |
Hi @daniel1124, this was given a shot in #536. Currently, IIRC, the server side does not support receiving files in chunks, so that would need to be improved. From the UI side of things, I guess the best path forward may be to implement that directly in JupyterLab, which will end up replacing the current notebook, but if you want to implement that in the classic notebook as well, that would be appreciated. One would have to look at how to do chunked file upload with tornado (likely just peeking at the …). Does that make sense?
In my opinion this should be done in the Notebook first. The bulk of the work will be in the server and the low-level service calls on the client. A PR will need to be made against this repository either way. A bootstrap progress bar in a bootstrap modal could be used for the Notebook UI, and we can reuse the new server API and much of the front-end code for the JupyterLab implementation.
@blink1073 yeah, that's exactly what we did. We modified notebook-4.3.1, mostly in the notebook/static/tree/js/notebooklist.js file, and we added a largefilemanager.py under notebook/service/contents/. Do you think I should follow the instructions here to contribute? https://github.com/jupyter/notebook/blob/master/CONTRIBUTING.rst
@Carreau @blink1073 the way we make it work is that if it is the first chunk, we set the write mode to "w"; for following chunks we set the write mode to "a". All my changes are confined to the notebook package.
That sounds great, @daniel1124, and yes, that is the preferred process document. |
@Carreau @blink1073 I have finished the code and testing, etc. Right now I'm waiting on our company's (BlackRock) legal/compliance team to approve the pull request. We will be contributing through the BlackRock-engineering account https://github.com/blackrock. This is my first time making a pull request. Just to understand the process, does the following sound right:
@daniel1124 - that process sounds right. Give or take a possible 3.5 - respond to comments from reviewers ;-) |
Great, looking forward to it @daniel1124! |
Thanks @daniel1124. If you or your company need to chat privately for clarification, feel free to reach out to any of us privately or contact the Jupyter steering council. As for your pull request, you can't break anything, so feel free to send even partial work; it is always possible to update it after the fact.
@Carreau @blink1073 @takluyver I have added unit tests and cleaned up the code. It's ready for review now. |
@Carreau @blink1073 @takluyver just made another commit reflecting all the feedback. It's ready for review. |
Fix for uploading large files crashing the browser (issue #96)
Thanks for all your work. I'm looking forward to uploading large files.
Apologies, I was offline for a week or so and I am currently catching up with things. We'll have a look. Right now the notebook codebase is frozen until we publish 5.0 (we are at rc2, so things should be quick), so we'll get that in for 5.1.
The PR for this (#2162) was actually merged for 5.0, despite what we said here. |
Yes there is: you can use FileZilla. You just need to get the SSH keys from your sysadmin. Jupyter added it for more user convenience.
getting the same issue |
If you try to upload a file that is larger than ~35 MB, the browser tab will crash about 5 seconds after clicking "Upload".
If we could warn the user about large files or implement a proper file uploader, the user experience would be better.