# Multiple sources of LSP messages on frontend and backend #184
Another thing here: a motivating case for this would be including (at least one) webpack-compiled language server by default for the language that we all have to deal with: markdown. It should be wired up to use that for markdown cells... and perhaps optionally in code block comments (though python folk are still rewarded for rst, but that looks like a bear to package). Markdown will really stress the nested language (#191) work as well... for example, de-facto Jupyter markdown includes support for LaTeX math. It would be interesting to see if there is a reasonable-to-embed language server for that sub-dialect... again, many of the options look hard to package, but perhaps one could theoretically compile digestif and lua to WASM (buh).
tcp language server: I tried to get jupyterlab-lsp running with the wolfram language server from @kenkangxgwe. Right now this wolfram language server only supports tcp connections and no stdio connection, although the author seems to be trying to support stdio connections in future releases. So +1 for the tcp language server support.
Thanks for the heads-up on that. The big thing I want tcp for is being able to launch a pygls (kind of a DIY language server toolkit) inside a running ipython kernel to interactively develop new language servers.
It's really not that big of a change; we basically need to:
- subclass the Reader and Writer classes
- pick the appropriate client (presumably there's already one)
- handle any asynchrony
- add a config flag
- add some tests
- release!
I don't imagine we'll test with the Wolfram one, but a number of open
source servers support tcp, for sure. If we did get it working, we could at
least add the detection logic for it!
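To make the "subclass the Reader and Writer classes" step a bit more concrete, here is a minimal sketch of a TCP transport using plain `asyncio`. The class name and the way it would plug into the existing stdio reader/writer machinery are assumptions for illustration, not the project's actual API:

```python
# Minimal sketch of a TCP transport for LSP messages using plain asyncio.
# The class name and how it would hook into the existing stdio Reader/Writer
# classes are assumptions, not the project's real API.
import asyncio


class TcpLspConnection:
    """One TCP connection to a language server, framing LSP messages."""

    def __init__(self, host="127.0.0.1", port=0):
        self.host = host
        self.port = port
        self.reader = None
        self.writer = None

    async def connect(self):
        # asyncio hands us a StreamReader/StreamWriter pair to read and write on
        self.reader, self.writer = await asyncio.open_connection(self.host, self.port)

    async def read_message(self) -> bytes:
        # LSP frames every message with a Content-Length header block
        headers = {}
        while True:
            line = (await self.reader.readline()).strip()
            if not line:
                break
            key, _, value = line.partition(b":")
            headers[key.strip().lower()] = value.strip()
        return await self.reader.readexactly(int(headers[b"content-length"]))

    async def write_message(self, body: bytes):
        self.writer.write(b"Content-Length: %d\r\n\r\n" % len(body) + body)
        await self.writer.drain()
```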
Hey there, I am currently looking into this (for the Wolfram LS actually). Might even be as easy as using socket.makefile() and reusing the existing stdio reader and writer. However, could you be a little bit more specific about this part: …
I already added the mode=stdio/tcp flag in the spec, but I'm not sure what would be the correct way of specifying the port (and potentially the host). For instance, the Wolfram LS requires the argument -tcp=, so one could pick any free port for this and hard-code it into the args in the specs file. I presume you mean something like a placeholder which is then set by the extension during process execution/initialization? So for instance we would reserve the tag …
Of course we also need to pass the port into the extension, such that it can open the TCP connection when creating the reader and writer (and potentially fill in the placeholder in the args), and this is where I'm stuck. Where would I put this and how would I access it? Also in the specs? Or should there be some magic which will search for a free port and use that one instead? What if the LS does not allow changing the port (I don't know if that would be the case for any LS)?
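For reference, the `socket.makefile()` idea might look roughly like this. `LspStdIoReader`/`LspStdIoWriter` are placeholders for whatever the existing stdio classes are actually called, and the port is arbitrary:

```python
# Rough sketch of the socket.makefile() idea: wrap a TCP connection in
# file-like objects so the existing stdio reader/writer could be reused.
# LspStdIoReader/LspStdIoWriter are placeholder names, not the real classes.
import socket


def connect_tcp_as_files(host="127.0.0.1", port=6536):
    # port 6536 is an arbitrary example value
    sock = socket.create_connection((host, port))
    # binary file objects over the same underlying socket
    read_file = sock.makefile("rb")
    write_file = sock.makefile("wb")
    return sock, read_file, write_file


# hypothetical reuse of the stdio machinery:
# sock, read_file, write_file = connect_tcp_as_files()
# reader = LspStdIoReader(stream=read_file)
# writer = LspStdIoWriter(stream=write_file)
```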
For my specific use case of interactively developing a language server locally, I'd want this in-loop with … For the case of a local TCP LS which has its entire lifecycle managed by the server extension, …
For a remote connection... well, that's a whole other kettle of wax. I don't know how you'd get away with anything other than a hard-coded host/port.
as in: changed the schema? we'd need to be very careful here, as that's the contract between the frontend and backend. Further, it should be possible to add this to the v2 schema, as it's only adding some properties, so they can't be required:

```yaml
mode:
  description: the connection technique. if absent, assumes stdio
  type: string
  enum:
    - stdio
    - tcp
port:
  description: the port for tcp mode connections. a null value will select a random, unused port
  default: null
  oneOf:
    - type: "null"
    - type: number
      format: integer
host:
  description: the host for tcp mode connections. a null value will assume 127.0.0.1
  oneOf:
    - type: "null"
    - type: string
```

In some other places in the stack, unix sockets are also supported, but I don't know if any language servers support that.
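To make the schema sketch concrete, a hypothetical spec entry for a tcp-mode server could look like the dict below. The `mode`/`port`/`host` keys follow the draft above, and the Wolfram invocation is purely illustrative, not the actual lsp-wl command line:

```python
# Hypothetical spec entry for a TCP-mode language server, following the
# schema sketched above. The command and arguments are illustrative only.
wolfram_spec = {
    "argv": ["wolframscript", "-file", "init.wl", "-tcp={{ port }}"],
    "languages": ["wolfram"],
    "mode": "tcp",   # defaults to "stdio" when absent
    "port": None,    # null => pick a random, unused port
    "host": None,    # null => 127.0.0.1
}
```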
Perhaps be wary going down this road, as the docs say …
Probably use either …
Thank you very much for your pointers, they are greatly appreciated!
Sorry, I don't really understand what schema v2 vs v1 is. Currently I added … inside …
I see. Too bad, would have been too easy this way ;-) I already had a look at the TCP connection methods within asyncio, before I found the …
Thanks for clarifying. I will have a look at jinja2 and the way free port detection was done in the acceptance test.
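For what it's worth, the `{{ port }}` substitution itself is nearly a one-liner with jinja2 (assuming jinja2 ends up being the templating choice; plain `str.format`-style placeholders would do just as well for this simple case):

```python
# Sketch of filling a "{{ port }}" placeholder in a spec's argv before
# launching the server. Assumes jinja2; not the extension's actual code.
from jinja2 import Template


def render_argv(argv, port, host="127.0.0.1"):
    return [Template(arg).render(port=port, host=host) for arg in argv]


# e.g. ["wolframscript", "-tcp={{ port }}"] -> ["wolframscript", "-tcp=8765"]
print(render_argv(["wolframscript", "-tcp={{ port }}"], port=8765))
```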
That …
The …
As long as all of our specs, and any open source ones we know of in the wild (countable on one hand after a trip to the ER on an exciting …
Ah, …
well, i guess jinja has the danger of folk getting too fancy... if what needs to be done can be done with …
sorry, was a bit strapped for time, earlier. It's implemented here and is pretty straightforward. It would just live in some …
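The free-port detection referenced above is the usual "bind to port 0 and ask the OS" trick. A minimal standalone version, not the project's actual helper:

```python
# The standard free-port trick: bind to port 0, read back the port the OS
# assigned, then release it for the language server to use. There is a small
# race between closing the socket and the server binding the port, which is
# usually acceptable for local tooling.
import socket


def get_unused_port(host="127.0.0.1"):
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.bind((host, 0))
        return sock.getsockname()[1]


print(get_unused_port())
```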
All right. I guess this shouldn't be too difficult by just making …
Ok, then I will see whether I can move the process creation stuff to anyio first. After a first glance at that method it seems that there is no …
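As a rough illustration of why anyio is attractive here (this is not the extension's actual code): both a spawned subprocess and a TCP socket end up exposing the same byte-stream interface, so the reader/writer code could stay shared across modes:

```python
# Hedged sketch: with anyio, a subprocess and a TCP connection both yield
# byte send/receive streams, so one code path could serve stdio and tcp.
import anyio


async def open_stdio_server(argv):
    process = await anyio.open_process(argv)
    # process.stdout is a byte receive stream, process.stdin a byte send stream
    return process.stdout, process.stdin


async def open_tcp_server(host, port):
    stream = await anyio.connect_tcp(host, port)
    # a SocketStream is both a receive and a send stream
    return stream, stream


async def main():
    receive, send = await open_tcp_server("127.0.0.1", 8765)
    await send.send(b"Content-Length: 2\r\n\r\n{}")


# anyio.run(main)  # commented out: needs a real server listening on 8765
```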
Wanted to point out that https://github.com/jtpio/jupyterlite is looking like the most compelling way forward for static hosting of a lab experience. Having LSP features in a static (set of) document(s) would be killer. There's an issue (https://github.com/jtpio/jupyterlite/issues/40) for offering bespoke routes on the "server", comms, and files (e.g. LSIF). I'm kind of inclined to favor targeting (ha!) the comms (rather than faking our home-rolled websockets), as then the browser kernels could offer both the jupyter messages as well as LSP, which would get around them having to coordinate their language versions, filesystems, etc.
## Elevator Pitch
Enable things other than chains of...
...to complete LSP requests.
## Motivation
We've done a lot of work to make the first roundtrip work. However, there are a number of sources of messages that would provide more flexibility and robustness, and in some cases end-user simplicity.
## Design Ideas
- the frontend language server
  - `FrontendLSPConnection extends ILSPConnection`
  - a `packages/jupyterlab-language-server-xyz`, with enough boilerplate to wrap/patch its guts (e.g. `fs` over the contents REST API)
- the tcp language server
  - `pygls`: i really wanted to just pop open a tcp port and add it to my language servers... but we only support `stdio`
  - `TcpReader` and `Writer`, add `mode` to the spec, allow a `{{ port }}` pass-through in the args
- the kernel language server
  - `lintotype`, we can define a comm and let kernels stuff LSP into it
  - … `%` syntax, and where real files are (except for the notebook you're working on 🤣)
  - `KernelLSPConnection implements ILSPConnection`
- the static language server
  - `LSIFConnection implements ILSPConnection`
  - add a button for "save workspace", and be able to open an LSIF as a new language server
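Purely as an illustration of the "kernel language server" idea, here is a kernel-side comm target that accepts LSP JSON over a comm and answers. The target name, wiring, and payload shape are all made up for the sketch and do not reflect any existing lintotype or jupyterlab-lsp API:

```python
# Illustrative only: a kernel-side comm target that speaks LSP over a comm.
# The target name "lsp" and the message payload shape are assumptions.
from ipykernel.comm import Comm


def handle_lsp_comm(comm: Comm, open_msg):
    def on_msg(msg):
        request = msg["content"]["data"]  # an LSP request as plain JSON
        if request.get("method") == "initialize":
            comm.send({
                "jsonrpc": "2.0",
                "id": request["id"],
                "result": {"capabilities": {}},
            })

    comm.on_msg(on_msg)


# hypothetical wiring, run inside a live IPython kernel:
# get_ipython().kernel.comm_manager.register_target("lsp", handle_lsp_comm)
```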