VS Code crashes when running a cell that produces a lot of output #11031
Thanks for filing this issue and I'm sorry you are running into this.
@DonJayamanne
Thanks for your patience, and I'm sorry about this issue; hopefully we can get to the bottom of this. I'm assuming you are referring to the cell with the code.
I have a suspicion the output from the second cell is very large, and that could be chewing up resources. What I'd like to figure out is how large that output actually is.
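One way to check that suspicion (a sketch; `demo.ipynb` is an illustrative filename) is to measure the serialized size of each cell's stored outputs, since a `.ipynb` file is plain JSON:

```python
import json

# A .ipynb file is JSON; each code cell stores its outputs inline.
# Summing the serialized size per cell shows which one is oversized.
with open("demo.ipynb") as f:
    nb = json.load(f)

for i, cell in enumerate(nb.get("cells", [])):
    size = sum(len(json.dumps(out)) for out in cell.get("outputs", []))
    print(f"cell {i}: ~{size / 1024:.1f} KB of serialized output")
```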
Thanks for all of the information. Please could you:
The zip folder should contain everything you need. The small demo notebook is around 100 KB, and there's a text file with all my packages; you can run pip install with that file and it will install everything for you. Also, I am using venv, the Python version is 3.10.2, and the TensorFlow version is the latest. If you can't reproduce it, I can take a video with my phone and show you step by step what is happening. @DonJayamanne Let me know if you need anything else and I would be more than happy to assist.
@mariosconsta Can you try the following:
Basically I'd like to get the two notebooks.
Sorry for the late reply, I will test this out ASAP and let you know.
So, with VS Code closed, I opened Jupyter Notebook and ran the cell that outputs the model summary. The cell ran instantly without trouble. I opened VS Code to try it again, and the same thing happened. I will take a video with my phone as it's happening and upload it here as well.
@DonJayamanne There are also two screenshots showing how VS Code looks after it recovers, along with the error that sometimes pops up. The error doesn't always pop up. https://drive.google.com/file/d/1GzOu8ZdX9CTZeN7guRcpZpYaWco5B445/view?usp=sharing
Will get a custom build of the extension shortly, so you can test with that.
Please could you:
Please let me know how this goes.
I will test it out first thing in the morning!
@DonJayamanne Runs instantly, as shown in the screenshot below. Like you said, there's no output, but it does run like a charm and uses zero RAM. Is there a way to get output from this version, or should I continue using the version I am currently on? P.S. I tried printing it as well to see if I get output, but I didn't. The original code was just
Should I try switching to the Jupyter pre-release version on VS Code? (Not Insiders, the normal one.)
EDIT: Using the pre-release version of Jupyter did not fix the problem. VS Code (the main version, not the one you sent) still uses 100% of my memory, and after that I get that visual bug. Like I said, this does not happen on the Insiders version with the file you shared, and it does not happen on standalone Jupyter Notebook either. The cell might say 0.5s, but it wasn't 0.5s; it took more than 15s to complete.
EDIT: I am willing to do an AnyDesk session with you if you want to see it "live", in case the recordings I sent you were not enough. In conclusion, the version you sent works like a charm and the main version does not. This text is all over the place, and I am sorry for that.
@mariosconsta Please could you try installing this VSIX as well and let me know whether this still crashes.
Thanks. So far what we have going is working; I'm still trying to gather some more information before I start working on this issue.
Just like the last VSIX, it works like a charm. The cell runs instantly without issues.
Also, like before, there's no output. Could this be the issue? The output? I did some testing and I was able to output large text without issue using a for loop. You can check the recording here:
Internal note: Blocked on upstream issue microsoft/vscode#146768
@mariosconsta I'm sorry, but I don't follow you.
Currently the extension host seems to crash due to the size of the output generated.
Hi, has this problem been solved? @DonJayamanne I have the same problem right now. I have 24 GB of RAM and have recorded a video while the problem is happening. The weird thing is that when I try to open the same notebook in Google Colab, everything is perfect: no memory consumption, and I can even view all images very quickly by dragging the scroll bar very fast, with no freezes and no stuttering. I think the problem might be all about loading all data into memory (or memory management), not reading from the storage device (HDD/SSD). The question is how Google Colab can achieve such good performance. Versions:
Thanks. vs_code_memory.mp4
@livan3li Also, please could you disable all extensions, including Jupyter, and see if you run into this issue.
OK. I've disabled all extensions related to Jupyter; after this, the notebook opened as a plain text file. Here is the Drive link to the notebook I used in the video.
What I meant was: disable all extensions (including Jupyter), but still open it as a notebook.
@rebornix Aren't you already looking into some perf improvements for opening notebooks with a large number of outputs?
Sorry for that. I've done exactly what you said: I disabled all extensions (even Python) and tried to open the notebook, but the same thing happened again. VS Code consumes all available memory.
You can check the subprocesses under the main 'vscode' process to see which one consumes all that memory; a rough sketch for doing this check is below. I found that, on my end, the Python subprocess under 'vscode' consumes loads of memory.
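A programmatic way to do that check (a sketch; process names vary by OS and the match below is deliberately loose):

```python
import psutil

# Walk any process whose name contains "code" (VS Code shows up as "Code",
# "code", "Code Helper", etc. depending on the OS) and report the resident
# memory of each of its subprocesses, to spot which child is ballooning.
for proc in psutil.process_iter(["pid", "name"]):
    name = (proc.info["name"] or "").lower()
    if "code" in name:
        for child in proc.children(recursive=True):
            try:
                rss_mb = child.memory_info().rss / 1024**2
                print(f"{child.pid:>7}  {child.name():<32} {rss_mb:9.1f} MB")
            except (psutil.NoSuchProcess, psutil.AccessDenied):
                pass  # the child exited or we lack permission; skip it
```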
Since I disabled all extensions, Python doesn't do that for me. But now, under VS Code, it's VS Code itself that consumes all the memory.
Have you reported the bug where it needs to be, or do I need to do that?
@tyePhDCandy
This will confirm that the freeze is due to the loading of the notebook.
I don't have any Jupyter Notebook app installed other than VS Code.
vs_code_bug.mp4
Speed up transfer of outputs to notebook webviews. For microsoft/vscode-jupyter#11031. The VS Buffer for the output items has a short byte length but a very large backing buffer (a `Uint8Array` which is actually a `Buffer`). When sending outputs to the webview, we ended up transferring the huge backing buffer instead of the much smaller actual data. We can fix this by creating a `Uint8Array` copy of the data and then transferring it to the webview.
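The actual fix lives in VS Code's TypeScript internals, but the pathology it describes (a tiny view pinning a huge backing buffer) can be sketched in Python with NumPy; everything below is illustrative, not the real code:

```python
import numpy as np

# A tiny slice of a large array is only a view: it keeps the entire
# ~100 MB backing buffer alive, and a naive serialization would drag the
# whole thing along, much like the small VSBuffer over a huge Buffer above.
big = np.zeros(100_000_000, dtype=np.uint8)  # ~100 MB backing buffer
view = big[:64]                              # 64-byte view, same buffer
print(view.base is big)                      # True: backing buffer retained

# Copying just the bytes we need detaches them from the big buffer, which
# is the essence of the fix: transfer a compact copy, not the view.
compact = view.copy()
print(compact.base is None)                  # True: independent 64-byte buffer
```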
Also experiencing this issue; super frustrating. Context: the notebook essentially freezes when I print too much to a console or when trying to display a large pandas DataFrame (by simply typing "df" in a cell and running it). I have 104 GB of RAM and the DataFrames are only consuming about 7 GB.
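As a stopgap for the DataFrame case (a sketch using standard pandas options, not a fix for the underlying bug), pandas can be told to render only a truncated preview, so evaluating `df` in a cell emits a small repr instead of megabytes of output:

```python
import pandas as pd

# Cap how many rows/columns the notebook repr renders, so evaluating `df`
# in a cell emits a short truncated summary rather than the full table.
pd.set_option("display.max_rows", 50)
pd.set_option("display.max_columns", 20)

df = pd.DataFrame({"x": range(1_000_000)})
df  # in a notebook this renders a truncated view; print(df.head()) works anywhere
```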
This issue should now be fixed in the latest Insiders and will be available in next week's Stable release. We will revisit this issue once you all get the latest fixes, and reopen if necessary.
Tested successfully with a dummy controller that produces 10,000,000 lines of textual output. That might have done it, but having author verification or crisp verification steps would help.
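For anyone wanting to verify, a minimal stress cell in the same spirit (the exact line count is arbitrary) could be:

```python
# Emits ten million short lines, stressing the notebook output pipeline.
# Before the fix this kind of cell reliably ballooned memory in the
# renderer; after the fix it should stream without exhausting RAM.
for i in range(10_000_000):
    print(f"line {i}")
```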
Hi everyone! I've run into the white walls of doom just after a recent VS Code update, on my Mac M1 Pro. Due to compliance reasons I'm unable to provide code examples, but some of the issues others have reported seem to keep popping up:
I hope this information is at least a little bit useful!
I have a repro here: microsoft/vscode#182636 @jrieken
I'm currently having this issue with version 1.81.0 😭😞🤬
I am trying to print a model summary in TensorFlow, and I think the model is large and it's crashing my notebook. The model is ResNet101.
The whole computer comes to a halt, memory usage goes up to 99%, and VS Code crashes. I have 16 GB of RAM, so I didn't think printing something large would actually eat all of it. Also, because the kernel crashes, all the variables are lost, like `history = model.fit()`, which I need to fine-tune the model afterwards. Moreover, I need to print the base_model summary in order to choose which layer to fine-tune from.
Is there a way to print the summary in another way and save the entire notebook with the variables, so I can continue working? I have checkpoints for model weights, but I need to keep track of past epochs through history to resume training afterwards.
I will try using `.py` files, but I want to know if there is a way to solve this problem for Jupyter.
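One possible workaround (a sketch assuming a standard Keras setup; not an official fix from this thread): route the summary to a file via `print_fn` so the notebook never renders the huge output, and log per-epoch metrics to disk so a kernel crash doesn't lose the history:

```python
import tensorflow as tf

# Illustrative: the thread mentions ResNet101; weights=None skips the download.
base_model = tf.keras.applications.ResNet101(weights=None)

# Route the summary to a file instead of cell output. `print_fn` is called
# once per line, so nothing large is ever sent to the notebook renderer.
with open("model_summary.txt", "w") as f:
    base_model.summary(print_fn=lambda line: f.write(line + "\n"))

# Persist per-epoch metrics as training runs, so a kernel crash doesn't
# lose `history`: CSVLogger appends one row per epoch to a CSV file.
csv_logger = tf.keras.callbacks.CSVLogger("training_log.csv", append=True)
# history = model.fit(train_ds, epochs=10, callbacks=[csv_logger])  # hypothetical usage
```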