JupyterLab 3 update #400
Comments
You are welcome to open a PR! I am happy to help and review, though I do not have time to work on it myself this month. I expect the upgrade to 3.0 will solve quite a few issues. We use a somewhat more complex build system and have plugins that depend on each other, so it might not be as easy as with other packages; help from someone with experience in these upgrades would be more than appreciated! FYI, I just merged #381 to prevent potential merge conflicts and make sure CI will pass.
Yeah... This sucker is not lightweight, and we're happy for more hands and
fewer forks, while trying to keep shipping updates for users. I have been
trying to do _simple_ things with cm vs jl3rc, and it's already tough.
Please let us know how we can help... If a meeting/workshop makes sense, I
can carve out a day.
Some thoughts:
A big thing in this will be a thorough assessment of what we can share with
the debugger.
Another freaky wrinkle is the now-bitrotten proxy kernel PR... I think we
generally agree we want it, but a lot has happened since I last got to work
on it. The wins here are that it would:
- remove the notebook dependency, clearing the path for jupyter_server
compat
- remove any hard runtime vscode js deps
- remove home-rolled websocket connection management in favor of comms,
which mostly just work (see the sketch after these lists)
- decouple the locality/perf/lifecycle of the tornado server from the
language servers
- make it more "jupyter-like", as in more interactive and inspectable
- provide a path for willing kernels to go hybrid and provide lsp over a
well known target
At the expense of:
- adding a hard (initial) ipython or (eventual) xeus dep
- trading server extension problems for kernel spec ones
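As a purely illustrative sketch of the comms idea (and emphatically not the actual proxy kernel PR): inside an IPython kernel, one could register a comm target and shuttle JSON-RPC between the frontend and a language-server subprocess. The target name `jupyter.lsp` and the `pyls` command are placeholder assumptions.

```python
# Minimal sketch: route LSP traffic over a Jupyter comm from inside an
# IPython kernel. Error handling, capability negotiation, and document
# synchronization are all omitted.
import json
import subprocess
import threading

from IPython import get_ipython

# Start a language server that speaks JSON-RPC over stdio (assumed installed).
server = subprocess.Popen(["pyls"], stdin=subprocess.PIPE, stdout=subprocess.PIPE)


def _write(message: dict) -> None:
    """Send one message to the server with the LSP Content-Length framing."""
    body = json.dumps(message).encode("utf-8")
    server.stdin.write(b"Content-Length: %d\r\n\r\n" % len(body) + body)
    server.stdin.flush()


def _read() -> dict:
    """Read one Content-Length framed message from the server's stdout."""
    headers = {}
    while True:
        line = server.stdout.readline().strip()
        if not line:
            break
        key, _, value = line.partition(b":")
        headers[key.strip().lower()] = value.strip()
    return json.loads(server.stdout.read(int(headers[b"content-length"])))


def _on_comm_opened(comm, open_msg):
    # frontend -> language server
    comm.on_msg(lambda msg: _write(msg["content"]["data"]))

    # language server -> frontend, pumped on a background thread
    def pump():
        while server.poll() is None:
            comm.send(_read())

    threading.Thread(target=pump, daemon=True).start()


get_ipython().kernel.comm_manager.register_target("jupyter.lsp", _on_comm_opened)
```

The frontend would then open a comm against that target and speak LSP over it, instead of over a separately managed websocket.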
Related: having in-browser language servers will further reduce our
reliance on npm/nodejs for some use cases, e.g. the lab settings editor. But
making the ContentsManager look like a filesystem will be... hard.
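To make that last point slightly more concrete, here is a rough, hedged sketch of only the easiest server-side flavor of the problem, using a hypothetical `mirror_to_disk` helper: pull a document through the ContentsManager API and mirror it into a scratch directory so a disk-oriented language server can see it. Sync, renames, untitled documents, and the in-browser case are all glossed over.

```python
# Hedged sketch: expose a ContentsManager-backed document to a language
# server that only understands real files, by mirroring it to a scratch dir.
from pathlib import Path


def mirror_to_disk(contents_manager, api_path: str, scratch_root: Path) -> Path:
    """Fetch a file through the Jupyter ContentsManager and write it to disk."""
    model = contents_manager.get(api_path, content=True, type="file", format="text")
    target = scratch_root / api_path
    target.parent.mkdir(parents=True, exist_ok=True)
    target.write_text(model["content"], encoding="utf-8")
    return target
```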
I don't see any of the above as hard blockers to getting something out,
but it might make sense to consider them in this work, as we'll have a unique
opportunity during this migration... e.g. bye-bye py35!
I've looked a bit at your extensions and saw that the … With the new JupyterLab extension system, the preferred way is to publish packages on PyPI. I guess your "sub-packages" could be packaged only for npm, and the … Another approach could be to have all of your plugins published on PyPI, but it might be a bit more work. IDK if this approach makes sense; I guess it only makes sense if it's useful for users to be able to install those sub-packages separately with pip. What do you think? I can come up with a PR that makes one Python package for …
I do not know how much modularity is currently possible with the mixed approach, see jupyterlab/jupyterlab#9289. The intent of using multiple sub-packages (reflecting the JupyterLab core approach) was two-fold: …
Point (2) never came to be, so we would not lose much if it is not currently possible, and it seems OK to me to compromise in favor of porting the extension faster, as long as it will still be possible to ship individual components separately in the future. Rephrasing: having everything on PyPI in separate granular packages is preferable if it works; if it does not work because of upstream issues, let's have just one PyPI package for now.
I guess jupyterlab/jupyterlab#9289 won't be an issue for you if you don't depend on a third-party labextension (and at first glance I see that you don't?).
This is definitely doable. The approach might be different though: all sub-packages would activate their plugins themselves when JupyterLab 3 starts, instead of having the …
Yes, I think it would be great. @bollwyvl what do you think?
First off: thanks all for this discussion! Representing two user stories:
Would it be possible to have one PyPI distribution with multiple importable names? If Lab 3 is doing something with distribution names, I think everybody's in for a world of hurt. There are 100+ language server features we haven't even started! I don't see us wanting to publish that many packages, even with more robust automation (#183). I don't really care if the package manager has to grab a few extra MBs at install time for this-feature-and-that-feature if it's not getting shipped to the browser once disabled (by configuration, or by downstream).

On the npm side: I think we'll have to keep publishing, but do a better job of clearly defining which APIs are internal and which are external.

Back over to Python: from the packaging-complexity side, let's not use …
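To make the "one PyPI distribution, multiple importable names" idea concrete, here is a hedged setuptools sketch; every name below is a placeholder, not the project's actual layout:

```python
# One distribution on PyPI that installs several importable top-level
# packages. Only the distribution name is registered on PyPI; the module
# names are free to differ and to be more granular.
from setuptools import setup

setup(
    name="jupyter-lsp",              # the single thing users pip install
    version="0.1.0",
    packages=[
        "jupyter_lsp",               # server-side pieces
        "jupyterlab_lsp",            # frontend glue for the labextension
        "jupyterlab_lsp.features",   # per-feature plugins, still one wheel
    ],
)
```

As far as I know, the prebuilt labextension would still be discovered from the files installed under share/jupyter/labextensions, not from the PyPI distribution name, so the distribution name should not be load-bearing here.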
Next steps: …
I guess that we should keep …
IDK actually. If I understand correctly, you are thinking of something like the following?

```bash
# Install everything
pip install jupyter-lsp

# Install only one sub-package
pip install jupyter-lsp:theme-material
pip install jupyter-lsp:lsp-ws-connection
```
Actually no, you won't have to keep publishing the npm packages, unless other npm packages depend on them on the JavaScript side (i.e. actually use the JavaScript code).
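Incidentally, the closest standard pip mechanism to the colon syntax sketched above is probably extras. A hedged sketch, with placeholder feature and package names:

```python
# Hypothetical use of setuptools extras so optional features can be pulled in
# with pip's bracket syntax, e.g. `pip install "jupyter-lsp[theme-material]"`.
# This assumes the optional pieces exist as their own PyPI distributions.
from setuptools import setup

setup(
    name="jupyter-lsp",
    version="0.1.0",
    packages=["jupyter_lsp"],
    extras_require={
        "theme-material": ["jupyterlab-lsp-theme-material"],
        "ws-connection": ["lsp-ws-connection"],
    },
)
```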
To give you an idea of how it would look for one sub-package, I've opened a PR that updates the … Once published, one would only need to run …
One simpler way to get JupyterLab 3 support quickly, without changing the way everything is packaged, would be to update the npm packages correctly, then the main … I guess I prefer this approach for now, as we would not be taking any big packaging decisions yet. I guess I can easily open a PR doing this.
Hopefully somewhat more concretely:
If a new user can:

```bash
pip install jupyterlab-lsp
```

And then:

```bash
pip install some_diagnostics_panel_replacer
```

And get the "right" behavior, then everything is fine (a sketch of what such an add-on's packaging might look like follows after the bullets below).
- If today that _can_ mean 1+1 packages, and tomorrow 1+1+1/feature,
without changing the "fast path" docs, it sounds like 1+1 is easier for us
and users
- I assume, to get our tokens, the other diagnostics developer needs our
fine-grained tokens from the npm packages
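To sketch that add-on story with purely hypothetical names and paths: the replacer could be an ordinary PyPI package that depends on jupyterlab-lsp and ships its own prebuilt labextension for JupyterLab 3 to discover.

```python
# Hypothetical packaging for "some_diagnostics_panel_replacer": depend on the
# base extension and install a prebuilt labextension under the share/ prefix
# that JupyterLab 3 scans. File lists and names are illustrative only.
from setuptools import setup

setup(
    name="some-diagnostics-panel-replacer",
    version="0.1.0",
    packages=["some_diagnostics_panel_replacer"],
    install_requires=["jupyterlab-lsp"],  # pulls in the base feature set
    data_files=[
        (
            "share/jupyter/labextensions/some-diagnostics-panel-replacer",
            # plus the rest of the built static/ assets in a real package
            ["labextension/package.json", "labextension/install.json"],
        ),
    ],
)
```

Whether it can actually replace the built-in diagnostics panel then comes down to the token question in the bullets above.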
If a user MUST (or CANNOT):

```bash
jupyter labextension disable @krassowski/jupyterlab-lsp-diagnostics-panel
```

For any given feature, we might have lost, as:
- requiring it means our metadata is insufficient
- disallowing it means hub admins won't be able to cook up docker images the way they want via, e.g., overrides.json
Motivation
JupyterLab 3 is coming with a new extension system. I've been updating multiple other packages, and I need this extension to be updated as well.
Is anyone working on it already, or should I open a PR myself?