Slow completion for large Python packages when using the default pyls #272
Comments
Me too, it usually takes a long time.
Does it take longer for you than in another editor using the same language server (e.g. vim with pyls, VS Code with pyls), or is this the same in all editors?
Hi @krassowski, it takes longer even in VS Code with pyls, but only the first time; if you type it again it is autocompleted immediately, but not in JupyterLab. It is a bit faster in JupyterLab after the first try, but not as fast as in VS Code.
Hi @krassowski, I deployed it on my remote server (Ubuntu 18.08) and the first time it was too slow, about 10+ seconds for a hint. I thought it might be a network problem, but when I deployed it on my PC (Win10) it was still very slow (a few seconds), though better than on my server.
Thank you for your feedback. I guess it might be hardware dependent, but then I did see very slow completion on one laptop which had good specs but was running Windows. Maybe we could make use of caching in some way? Please feel free to suggest your own solution ideas!
@krassowski First of all, great extension! Not perfect yet, but I never thought all these features would be possible in Jupyter notebooks! The slow autocomplete is the only problem I encountered (on Windows). I tried turning the jupyterlab-lsp extension off and as a result the autocomplete was much faster (using the Tab key, of course). When I turn the extension back on, the autocomplete becomes very slow again. Hope there is a solution to this.
I have the same issue sadly, so I take it this is not user related?
Hi, I wish to share my experience here.
Any upgrade to newer versions of these packages, i.e.
I hope this helps further investigation! @krassowski, let me add that this extension is really useful and I am grateful it's available. Honestly, with JupyterLab it's still a bit laggy; I am sure it could get better, and I hope you can continue improving it!
I have the same issue; it seems slower than the default autocomplete...
I have the same problem. Running JupyterLab on Linux here (i7 6700K), and autocomplete is taking 5-10 seconds.
There are several things that we can do to help:
In addition to the list above (now updated to four ideas for tackling this issue), there is an upstream issue for recent pyls versions: palantir/python-language-server#823. It might be that some speed-up will be seen once the upstream issue is resolved. If you are tracking this issue and would like to help but don't feel like learning TypeScript, head over to the
My completion is also pretty slow; on top of that, the dot does not trigger automatic completion. Is there a setting to switch this on? Also, pressing Tab does not work for a dot plus one character, but works for two or more characters:
For matplotlib, I can autocomplete with one character:
@quantum-booty the continuous completion (a.k.a. hinterland) is an opt-in feature and you need to enable it in the Advanced Settings Editor → Code Completion (see the sketch at the end of this comment). I suspect that you do not see auto-completion (i.e. completion just after typing a dot) for
Now to explain things a bit, it is not a surprise that the completions load slower with LSP:
Sorry that the experience is far from perfect! I do my best (with the great help of Nick!) but at the end of the day we are just doing this in our free time, and cannot move faster (although we would very much welcome any help!)
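For reference, a minimal sketch of the user override in the Advanced Settings Editor → Code Completion that turns continuous hinting on (assuming the continuousHinting key used by jupyterlab-lsp around this time; the exact key name may differ between versions):
{
    "continuousHinting": true
}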
Hi @krassowski, from my side I just enjoy it and try to give feedback to make sure things get even better...
Thanks, @sntgluca! Feedback with a reproducible example like the one from @quantum-booty helps to narrow down the root cause of the issues. As for your comment on difficulty upgrading and the parso errors - this is not something to do with this extension but with the pyls language server. They just had a new release hopefully fixing various errors (though there are some known issues with completion caching and diagnostics) - give upgrading a try and if you encounter any problems please do open a separate issue. I cannot help without detailed environment information.
Would anyone here like to try running the jedi cache script and see if it makes a difference? Please adjust
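(The script itself was not preserved in this thread; below is a minimal sketch of what such a cache-warming script might look like, assuming jedi's preload_module API. The package list is illustrative and the actual script shared may differ.)
import jedi

# Pre-parse heavy packages so jedi's cache is warm before the first completion request.
# Adjust the module names to the libraries you actually use.
jedi.preload_module('pandas', 'numpy', 'matplotlib')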
Regarding the icons, I can happily live without them; I'd be fine with text in place of the icons.
Hi @krassowski, in my case I came to the conclusion that LSP autocompletion is strongly bound by I/O and networking (remote NFS + JupyterHub proxy + VPN), and I don't think there's much that can be done to drastically improve that... I am thinking of some potential workarounds which could work for cases like mine:
Looking forward to your opinion.
FYI, additionally I noticed a major difference from your snapshots above: in my case 'parsing' the instructions (as shown by the highlighting) is really laggy and can take more than one second per statement. There is no error indication in the JS console; it is just plain slow. Hope it helps as a data point :)
Could disabling completion on dot improve performance? Maybe add a setting to trigger completion 1, 2 or 3 characters after the dot. Maybe make the dot trigger depend on library size: for a big library like numpy, trigger 1 or 2 characters after the dot; for a small library, trigger completion on the dot.
Hi @bollwyvl, @krassowski,
@krassowski Thanks for the brilliant work! I tried https://github.com/krassowski/jupyterlab-lsp/issues/272#issuecomment-775726255 and it works well for local kernels: faster autocompletion, impressive. However, with remote IPython kernels installed with e.g. remote_ikernel, there is no autocompletion.
@g6ai please open a new bug report and provide all the details, including the debug output from the kernel.
Yeah, once you go remote, the topology will get weird between what we know about, what your kernels know about, and what your language servers know about. We're not really ready for that; partially, see: https://github.com/krassowski/jupyterlab-lsp/issues/184#issuecomment-777682465
How do I turn off completion but keep the rest of jupyterlab-lsp working? |
I would also like to know how to turn off completion but keep the rest of jupyterlab-lsp working.
Hi @krassowski, it turns out pyls is constantly consuming more than 50% of CPU time, trying to index all Python files in my home directory, of which there are plenty since I have many Python files and separate environments, similar to what is described in palantir/python-language-server#421. Do you have any configuration suggestions to limit this behaviour?
@sntgluca does it occur using my pyls fork, or the upstream one? How do you know that it is indexing files? Is it coming from a specific pyls plugin? If this is an issue with pyls itself, then please open a detailed issue upstream, but chances are that it is with a specific pyls plugin - in that case please open it in the appropriate repository.
Dear @krassowski, I wanted to check with you first in case there was an obvious solution. Thanks for following up with me quickly anyway. To answer your questions: I have been using
I can confirm the same behaviour occurs both with your fork
I don't know which pyls plugin is involved or how to check it... sorry.
Thanks! FYI, we will be moving towards the https://github.com/python-lsp/python-lsp-server fork soon (once I get to send the PRs that provide the performance improvements). At the very least, you can check which plugins are installed using
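(The exact command was not preserved above; one way to enumerate installed pyls plugins - an assumption, not necessarily what was being suggested - is to list the entry points they register:)
import pkg_resources

# pyls plugins register themselves under the 'pyls' entry point group,
# so iterating over that group shows every plugin the server will load.
for entry_point in pkg_resources.iter_entry_points('pyls'):
    print(entry_point.name, '->', entry_point.dist)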
https://github.com/krassowski/jupyterlab-lsp/issues/272#issuecomment-774806496 brought my completion time for a Dask Gateway
So now that all of the primary upstream pyls dependencies support
Hi! Just one more note, please. Can you comment on the status of the https://github.com/python-lsp/python-lsp-server integration, and your current suggestions on best practices and configuration? Thanks for all the help!
No, none of my performance improvements have made it into pylsp yet. If anyone feels like picking this up, please do; I am happy to advise but am at capacity right now.
Though as a side note, one can now easily check out the pyright server and compare performance, as it does not use jedi. From what I saw, with pyright we are now hitting the point where it is our rendering and not the server that is the culprit of the delay (especially the icon SVG parsing, which is - ridiculously - done each time rather than only once per icon type).
The big news is that it is now possible to disable completions from a specific source (e.g. kernel or LSP servers) in the just-released 3.7 version.
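For anyone looking for the concrete setting, a minimal sketch of the user override in the Advanced Settings Editor → Code Completion, assuming the disableCompletionsFrom key introduced around that release (the key name and accepted values may differ between versions):
{
    "disableCompletionsFrom": ["LSP"]
}
This would keep kernel completions while suppressing LSP ones, which is roughly what the earlier questions about turning off completion were asking for.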
How do I disable LSP for a specific language and use kernel completions instead?
Hi, I wish to share one additional note which I think could be of interest:
This means, for me, that you're doing a wonderful job, but without integrating the latest features into the downstream packages, we won't get all the benefits. I'd love to be able to help out, but I am not good enough to contribute...
Hi @potoo0, sorry I missed your message earlier. You can disable completion for pylsp only by modifying its settings:
{
    "language_servers": {
        "pylsp": {
            "serverSettings": {
                "pylsp.plugins.jedi_completion.enabled": false,
                "pylsp.plugins.rope_completion.enabled": false
            }
        }
    }
}
@sntgluca it's on my to-do list for later this month.
Quick update: the two pull requests with major performance improvements to python-lsp-server were merged yesterday. With the default settings you should get a substantial speed-up. You can try it out by installing directly from GitHub:
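(The exact command was not preserved here; installing the development version straight from the repository typically looks like the following - an assumption based on standard pip usage, not necessarily the command that was posted:)
pip install git+https://github.com/python-lsp/python-lsp-server.git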
There were also some improvements to the rendering of icons in #625.
For anyone reading this comment in the future, please note that the fixes are available only in my personal
pip uninstall python-language-server
pip install -U jupyterlab jupyterlab-lsp nbclassic python-lsp-server
@krassowski Would it be better if I
Description
Autocompletion - especially for bigger packages like pandas - is slow.
Reproduce
import pandas as pd
pd.<tab>
Is this the usual behavior, or is something configured incorrectly?
Expected behavior
Autocompletion finishes nearly immediately :)
Context
Required: installed server extensions
Required: installed lab extensions
Troubleshoot Output
Command Line Output
Browser Output
No errors
Thanks for the help!