Doesn't display memory value on my jupyter notebook #17

Closed
ajay2611 opened this issue Apr 12, 2019 · 16 comments

@ajay2611

I've got a multi-user environment running Jupyter Notebook on a server. This extension is not showing the memory value used by the notebook. I've attached a screenshot.

Can you help me figure out what I'm missing?

[Screenshot: Screen Shot 2019-04-12 at 1 21 25 PM]

@cosmosart

You probably need "psutil".

Try this:
pip install psutil
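
If you're on a multi-user server, make sure psutil ends up in the same environment the notebook server runs from. A quick sanity check (just a suggestion, assuming a python3-based server) would be something like:

# adjust "python3" to whatever interpreter the notebook server actually uses
python3 -m pip install psutil
python3 -c "import psutil; print(psutil.virtual_memory())"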

@blaza1 commented May 16, 2019

I've installed psutil but I'm still getting this error. I'm running a Jupyter notebook with an image spawned by dockerspawner.

@SanderLam commented Jul 29, 2019

I had the same issue after the total memory allowance for a Jupyter Notebook was changed.

Because the server had to be restarted after updating the memory allowance, my local packages were no longer installed.

What worked for me was re-running the following commands after changing the memory allowance:

pip install jupyter_contrib_nbextensions && jupyter contrib nbextension install
pip install nbresuse
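
Depending on the setup, you may also need to (re-)enable the extensions explicitly; older nbresuse versions documented roughly these steps:

# --sys-prefix targets the active environment; adjust to your setup
jupyter serverextension enable --py nbresuse --sys-prefix
jupyter nbextension install --py nbresuse --sys-prefix
jupyter nbextension enable --py nbresuse --sys-prefix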

@waterflyer

@SanderLam Hi Sander, I followed your solution but the memory value still isn't displayed correctly. What is the process for changing the memory allowance? I've checked that the available RAM on my system is 11Gi. Could you help me out? Thank you.

@krinsman (Contributor)

Could you look at the output of the JavaScript console? On Chrome the keyboard shortcuts can be found here: https://developers.google.com/web/tools/chrome-devtools/shortcuts

This is hard to debug without knowing whether it's an issue with the notebook server extension (backend, Python) or the nbextension (JavaScript, frontend).

@ThierrySpetebroot

Hello @krinsman, I'm having the same issue (both on a Kubernetes deployment and on a local Anaconda installation on Windows).

From the console I don't see anything abnormal concerning nbresuse:

[screenshot: browser console output]

Any idea of where I could find more information?

@krinsman (Contributor)

Is the GET about:blank net::unknown:URL:SCHEME normal? I'm not great at front-end development.

I guess another thing you could look at would be the requests panel of the developer tools -- ideally nbresuse should be making HTTP requests to <base_url>/metrics, so if those HTTP requests are failing, returning 500, etc., then the server isn't returning the metrics information to the frontend. If everything is working there, though, then it is probably a JavaScript issue.
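
If you have shell access to the server, another rough check is to hit the endpoint directly and look at the raw response (hypothetical host/port; add your token only if the server requires authentication):

# replace the host/port and <your-token> with your own server's values
curl -s "http://localhost:8888/metrics?token=<your-token>"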

To be absolutely clear, when we say "jupyter notebook", do we mean Jupyter Lab or the classic notebook?

Currently the JavaScript for displaying information from NBResuse in Jupyter Lab is in a separate repository (the Jupyter Lab master should already have it for memory usage; for CPU usage, see https://github.com/tslaton/jupyterlab-cpustatus/blob/master/src/index.tsx). The nbextension only works (to the extent it works at all) for the classic notebook.

I think the "Network" tab of Chrome developer tools is what displays HTTP requests, although I'm not 100%. I don't know where to find that information for Firefox but I imagine Google could probably find out quicker than I can (I should probably switch back to Firefox sometime but never get around to it).

I know none of this is really helpful to you though. I apologize for not being able to do better at the moment.

@ThierrySpetebroot

Thank you for your help.

Indeed I am using the classic notebook. Thanks for the clarifications about Jupyter Lab anyway.

The <base_url>/metrics requests are returning 200.
The information about the memory is the following:
[screenshot: memory information from the response]

Concerning the HTTP requests, the only failure is this one (see GET about:blank net::unknown:URL:SCHEME):
[screenshot: failed request details]

Looking at the initiator, I would say it is likely not related to nbresuse (but I'm not sure about that).

@krinsman (Contributor)

OK, then that suggests the problem is somewhere in the JavaScript for the nbextension:
https://github.com/yuvipanda/nbresuse/blob/master/nbresuse/static/main.js

Unfortunately this is probably the one part of the code I know and understand the least, so I'm not sure I will be able to help here.

@ThierrySpetebroot

Thanks for the heads-up, I think I've found why it is not working for me.

The handler for the <base_url>/metrics request requires a JSON response:
https://github.com/yuvipanda/nbresuse/blob/dd2d21c0832cd7d8e41b91b2e8782e8eda11464b/nbresuse/static/main.js#L28

While the request itself executes correctly, the callback is never called. Looking at the response (see the screenshot in my previous message), the payload is NOT in JSON format.

I have no clue what is going on on the backend side, as I am not at all familiar with notebook extensions (perhaps something is using the same URL?).

I've tried this on my vanilla Anaconda installation (Jupyter Notebook version: 6.0.1) and the same happens.

@ThierrySpetebroot

Looking at the response format, it seems to be Prometheus output (I am not familiar with it either).

My reference for the output format was this (the issue and project are otherwise unrelated to this one):
https://github.com/pyouroboros/ouroboros/issues/329#issue-478896239

@ThierrySpetebroot

From the changelog of Jupyter Notebook 6.0: "Add /metrics endpoint for Prometheus Metrics."

All versions of Jupyter Notebook 6.0 and above are likely affected by this issue: Prometheus metrics are exported on the same URL as the one nbresuse uses (i.e., <base_url>/metrics).
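
For illustration, a Prometheus-style /metrics payload is plain text in the exposition format, roughly like the following (the metric names here are made up, not the actual ones the notebook emits):

# HELP example_memory_usage_bytes Memory usage in bytes.
# TYPE example_memory_usage_bytes gauge
example_memory_usage_bytes 123456789.0

Since the nbextension's callback only runs when the body parses as JSON (see the main.js line linked above), a payload like this is silently ignored.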

@ThierrySpetebroot

Suggested solution: use a different URL for exposing and retrieving the metrics.

Affected lines:

@krinsman (Contributor)

@ThierrySpetebroot Good sleuthing and detective work! You've convinced me at least that this is the issue. Can you try making a fork of the repo and changing metrics in those lines to resuse or something like that to confirm that fixes the issue?

With regards to whether it can be merged, that depends on the long-term path that nbresuse takes, which hasn't been decided yet. For example, some people want nbresuse to also report usage with Prometheus (which would seemingly make nbresuse redundant for the notebook?); see this PR: #22, as well as this issue: jupyterlab/jupyterlab#7663

Another potential issue is that right now nbresuse is used as the default for reporting memory usage to JupyterLab, and JupyterLab expects the information to come from the metrics endpoint. So it seems like the quick fix for notebook, changing the name of the endpoint, would break JupyterLab, and since notebook is the legacy solution and JupyterLab is intended as the long-term solution that probably isn't something that we can merge.

However, if we can find a solution that works for both JupyterLab and the classic notebook, e.g. using Prometheus, that would seem to be the ideal scenario.

The status of NBResuse is in flux/limbo right now, since Yuvi no longer has the bandwidth to maintain it and wants to transfer the repository to another organization, but no one has committed to supporting it yet. Since I have write access, I guess I could make and take responsibility for arbitrary decisions about the future of NBResuse, which may or may not come back to haunt the project, although I wouldn't really feel comfortable with that. That is especially true here, where to be honest I know almost nothing about Prometheus except that it reports metrics, so I wouldn't be qualified to review code purporting to make NBResuse work via Prometheus instead of something else.

So in short: for the short term I recommend forking the repo (and making a PR too, so we have a written record of the experiment) to change the name of the endpoint and confirm that the proposed change works. For the long term I have no clue, and I would like to get feedback from other people before proceeding.

@krinsman (Contributor)

Has this been fixed as of NBResuse versions 0.3.3 and 0.3.4 on PyPI?

pip install nbresuse[resources]

Provisionally I am closing this issue now since I can't currently reproduce it -- please let me know if it should be reopened.

@jurgispods

Still seeing this issue on version 0.3.6. Any workarounds?
