Large notebook fails to save #650
Comments
Hi @glider-gun. Thank you for the detailed issue report; the details are really helpful to our developers. I believe that the scenario and error message you are seeing is hitting tornado's default `max_body_size` limit.

@glider-gun I don't know if using Python 3 and a more recent version of IPython would have the same limitation. If you are able to test easily, please do. If not, no worries.

@minrk @Carreau Is there a way to work around the default `max_body_size` limit by chunking the body (https://github.com/tornadoweb/tornado/blob/eaf34865a63460cdd64abd1ae2c8835b174c6e93/tornado/http1connection.py#L346) or by setting a different body-size limit (https://github.com/tornadoweb/tornado/blob/eaf34865a63460cdd64abd1ae2c8835b174c6e93/tornado/http1connection.py#L324)?
We might want to look at this problem while working on #536
Thank you for the quick reply! Environment for Python 2.7.10 with IPython 4.0.0:

OS, browser, and all library versions are the same as in my first comment.

Environment for Python 3.4.3 with IPython 4.0.0:

(the user name in this quote was replaced by hand)
@glider-gun Thanks for the additional info. For now, I recommend keeping an eye on #536, as suggested by @Carreau. In the interim, I wonder if saving more frequently would be a reasonable workaround to the limitation.
I see, thank you.
It looks like tornado imposes a maximum size of 100 MB for HTTP requests by default, and I don't think we currently override that anywhere. In the long run, the fix will be to maintain notebook models on the server, so we don't have to send the whole notebook over HTTP at once. But we should probably increase that limit as an interim measure.
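For context, a sketch of where that limit lives: tornado's `HTTPServer` accepts `max_body_size` and `max_buffer_size` keyword arguments, so an application embedding tornado can raise the cap when it constructs the server. The application and port below are placeholders, not the notebook server's actual setup:

```python
import tornado.ioloop
import tornado.web
from tornado.httpserver import HTTPServer

# Placeholder app; the notebook server builds its own tornado application.
app = tornado.web.Application([])

# Raise tornado's default ~100 MB request-body cap to 512 MiB.
# max_buffer_size bounds how much of a body tornado will buffer in memory.
server = HTTPServer(
    app,
    max_body_size=512 * 1024 * 1024,
    max_buffer_size=512 * 1024 * 1024,
)
server.listen(8888)
tornado.ioloop.IOLoop.current().start()
```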
Any solution? I have a large notebook that I want to save.
@davidcortesortuno and I are also having this problem with HoloViews HoloMaps, where it's quite easy to go over 100 MB.
We temporarily fixed this by modifying the
Having the same problem here...
Shall we bump the limit up for 5.0? Does anyone have a guide as to what a sensible limit might be?
You should be able to set a larger `max_body_size`.

I don't have a good notebook to test with, but the rationale:
@gnestor That didn't help me; I'm still getting the error. Oh, also, I'm doing this over HTTPS, if that makes a difference.
@SamuelMarks Can you try upgrading to notebook 5.0.0rc2?
@gnestor Weird, I can't get it to work at all now. I even tried in a new virtualenv:

But I'm still getting:
Edit: wait, am I meant to use
Did you try just
@gnestor Okay, got it to work with the latest. Same
@takluyver I think these default limits should suffice for now. Let's close this. For reference, if any users are encountering this issue (not being able to save a notebook due to file size), you can increase the limit by editing these lines: https://github.com/jupyter/notebook/blob/master/notebook/notebookapp.py#L237-L238
For anyone else finding this before #3829 is actually merged, the only solution in this thread that currently works in notebook 5.6.0 is to modify the limits in `notebookapp.py` directly. Trying to pass the arguments to `jupyter notebook` when starting it doesn't work, nor does editing the config file.
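Once #3829 lands, the intent is that these limits become configurable traitlets. Assuming the traitlet names from that PR, a config-file entry would look something like this sketch:

```python
# ~/.jupyter/jupyter_notebook_config.py
# Traitlet names are assumed from #3829; sizes are in bytes.
c = get_config()

c.NotebookApp.max_body_size = 1024 * 1024 * 1024     # allow 1 GiB request bodies
c.NotebookApp.max_buffer_size = 1024 * 1024 * 1024   # allow 1 GiB in-memory buffering
```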
@gnestor I've encountered a similar bug. Initially, I got an error. After much debugging, I believe that this is being triggered by the large number of images that I am saving. Once the number of images in the notebook goes above a certain threshold, everything shuts down without warning. Any ideas?
@kevinlu1211 Yeah, I basically have the same issue, but I think it's just due to the browser running out of memory, as Chrome only allocates up to 1.8 GB of memory per tab by default. Watch the memory usage when it runs; if it dies after growing to about that size, that's probably your problem. Fortunately, you can adjust this as described here, which has so far fixed my issue, though I suspect I will hit it again if the tab reaches ~3.5 GB.
@j-andrews7 I don't think it was my browser reaching the memory limit, but I applied the fix regardless, and it still didn't work. Any other ideas?
@kevinlu1211 nope, sorry mate. Maybe try a different browser?
You could maybe try reducing the resolution of the images?
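As one sketch of that idea, assuming matplotlib's inline backend is what's producing the images, lowering the print DPI shrinks each image embedded in the `.ipynb` (the dpi value here is an arbitrary example, not a recommendation from the thread):

```python
# Run in a notebook cell: reduce the resolution of inline matplotlib figures.
# print_figure_kwargs is forwarded to matplotlib's print_figure, which
# accepts a dpi argument.
%config InlineBackend.print_figure_kwargs = {'dpi': 72}

import matplotlib.pyplot as plt

plt.plot(range(10))
plt.show()  # this figure is embedded in the .ipynb at the reduced dpi
```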
When I was using IPython Notebook to analyze our experiment data, I noticed I could not save the notebook.
The console (from which I started `ipython notebook`) stated an error, so I guess this problem comes from the notebook size.
I was using "bokeh" library to plot my data, and the notebook file was about 100 MB on disk.
To reproduce, I prepared a new notebook and made many plots to produce a notebook with a large file size.
This does a 30001-point plot repeatedly (e.g. 100 plots in the screenshot above). When I repeatedly saved while increasing the number of plots, again above about 100 MB I could not save the notebook (with the same console message).
In a little more detail, I could save the notebook up to 88 plots, when the notebook file size was 104,756,892 bytes (99.904 MB), and I could not save it with 89 plots. Each additional plot increased the file size by about 1.1 MB.
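A minimal sketch of that reproduction (a reconstruction under assumptions, not the author's exact code; it assumes the `bokeh.plotting` notebook API of that era):

```python
# Run in a notebook: each inline bokeh plot embeds roughly 1.1 MB of data,
# so on the order of 90-100 repetitions pushes the .ipynb past ~100 MB.
import numpy as np
from bokeh.plotting import figure, output_notebook, show

output_notebook()

x = np.linspace(0, 1, 30001)              # a 30001-point line, as in the report
for i in range(100):
    p = figure(title="plot %d" % i)
    p.line(x, np.sin(2 * np.pi * (i + 1) * x))
    show(p)
```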
I searched the issue list, but could not find anything about this.
Is this limit intentional? Is there some workaround for this problem (without removing cells from the notebook)?
My environment is:
OS: Mac (OSX, 10.9.5 Mavericks)
Browser: Safari 9.0 (9537.86.1.56.2)
matplotlib python library ver. 1.4.3
numpy python library ver. 1.9.2
bokeh python library ver. 0.10.0