Fancy remote debugging (aka tab completion and editor controls) #130
Labels: debugger, discussion, enhancement (New feature or request), experiment (Exploratory design and testing), help wanted (Extra attention is needed)

Comments

goodboy added the discussion, enhancement, experiment, and help wanted labels on Jul 24, 2020
Relevant links from
This was referenced Jul 27, 2020
goodboy added a commit that referenced this issue on Jul 30, 2020:

This is the first step in addressing #113 and the initial support for #130. Basically this allows (sub)processes to engage the `pdbpp` debug machinery, which reads/writes the root actor's tty, but only in a FIFO-semaphored way such that no two processes use it simultaneously. That means you can have multiple actors enter a trace or crash and run the debugger in a sensible way without clobbering each other's access to stdio. It required adding some "teardown hooks" to a custom `pdbpp.Pdb` type so that we release a child's lock on the parent on debugger exit (in this case when either the "continue" or "quit" command is issued to the debugger console). There's some code left commented in anticipation of full support for issue #130, where we'll need to actually capture and feed stdin to the target (remote) actor, which won't necessarily be running on the same host.
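To make the "FIFO semaphored" tty access concrete, here is a minimal, hypothetical sketch (not tractor's actual implementation) of serializing debugger entry with `trio`'s strict-FIFO lock so only one (sub)actor at a time owns the shared tty:

```python
import pdb

import trio

# One strict-FIFO lock guarding the shared tty: debugger requests are
# served strictly in arrival order, so two crashed actors never
# interleave their reads/writes on stdio.
_debug_lock = trio.StrictFIFOLock()

async def maybe_enter_debugger() -> None:
    async with _debug_lock:   # wait our turn for the tty
        pdb.set_trace()       # the REPL owns stdio until 'continue'/'quit'
    # leaving the `async with` is the "teardown hook": the lock is
    # released and the next waiting actor may start its debugger.
```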
Follow-up from prompt-toolkit/python-prompt-toolkit#1204.
goodboy changed the title on Sep 28, 2020:
  Fancy stuff (aka tab completion and editor controls) in a "remote debugger" → Fancy debugging (aka tab completion and editor controls) in a "remote debugger"
goodboy changed the title on Oct 5, 2021:
  Fancy debugging (aka tab completion and editor controls) in a "remote debugger" → Fancy remote debugging (aka tab completion and editor controls)
goodboy added a commit that referenced this issue on Mar 20, 2022:

This code was originally written (with much thanks) by @mikenerone:matrix.org. It adds a `tractor.trionics.ipython_embed()` which is `trio`-compatible and allows straight-up `await async_func()` calls in the REPL with the expected default blocking semantics. More refinements are to come, including user config loading and eventually a foundation for what will be a console REPL + %magics for shipping work off to actor clusters, plus manual respawn controls, thus probably eventually obsoleting all the "parallel" stuff built into `ipython` B) Probably pertains to #130.
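A rough usage sketch of the embed described above; the exact signature and calling convention of `ipython_embed()` are assumed here, not taken from the commit:

```python
import trio
import tractor

async def fetch() -> str:
    await trio.sleep(0.1)
    return "hello from an async call"

async def main() -> None:
    # Drop into an IPython REPL where `await fetch()` just works and
    # blocks the surrounding task until the coroutine completes
    # (hypothetical call style -- check the actual tractor API).
    await tractor.trionics.ipython_embed()

if __name__ == "__main__":
    trio.run(main)
```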
As part of the journey in #129 I discovered that no one seems to have solved the problem of getting the fancy features of a modern Python debugger working in remote debugging applications. Though I haven't tested them all, the list in #113 seems to mostly contain systems which rely on telnet servers (or other network IPC magic), but none of them actually solve the issue of how to get those features in the local client that would normally only be possible when the remote process is connected to a tty.
The problem
Standard fancy (read: human-enhanced) debugger REPLs (including the stdlib's `pdb`, which uses `rlcompleter`, and `pdb++`) rely on libraries such as GNU readline to get things like completion and CLI "editing controls". There seems to be no way to get these features with readline-based systems in a remote debugging context, since Python's use of `readline` requires that the process is launched under a tty/pty. Ideally these features would be available in such use cases to make debugging of remote systems sane and efficient for the user.
Evidence
In #129 I was able to verify that launching subprocesses with stdin as a unix pipe indeed prevents any readline system from being loaded. I haven't been able to find a remote debugger that supports this feature either (but of course hopefully someone will prove me wrong!).
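A minimal sketch of the constraint being described (the details of when CPython actually engages readline are simplified here): line editing and completion only come into play when the process is attached to a real terminal.

```python
import sys

def can_offer_readline_completion() -> bool:
    """Rough check: readline-style editing needs a real terminal."""
    if not (sys.stdin.isatty() and sys.stdout.isatty()):
        return False          # e.g. stdin is a pipe from a parent process
    try:
        import readline       # noqa: F401 -- may be missing on some builds
    except ImportError:
        return False
    return True

# Run from a terminal -> True; run as `echo "" | python check.py` -> False.
print(can_offer_readline_completion())
```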
Possible solutions
- in the near term: we can avoid spawning with `stdin` as a pipe and instead let child processes stay connected to the parent tty (this is actually what `trip` does and it works); a small demonstration of the difference is sketched after this list.
- consider a debugger that doesn't use the stdlib's `readline`:
  - `python-prompt-toolkit`, which is specifically a replacement for `readline`;
  - `ptk` can work without being connected to a tty/pty;
  - `ipdb` (since `ipython` uses `ptk` underneath) if `ptk` were configured properly (it currently doesn't work any better than the `readline` options, based on my testing in https://github.com/goodboy/tractor/tree/stin_char_relay).
- work with debuggers that want to move to `ptk`, to get this functionality supported in their initial integration, such as with `pdbpp` in "Edit debugger line with editor" pdbpp/pdbpp#362.
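The sketch referenced in the first bullet, using plain `subprocess` calls as a stand-in for the actual spawning machinery: inheriting the parent's stdin keeps the child on the tty, while a pipe does not.

```python
import subprocess
import sys

CHECK = "import sys; print('child stdin is a tty:', sys.stdin.isatty())"

# stdin=None -> the child inherits the parent's stdin (the tty when run
# from a terminal), so readline/pdb++ inside it can still do completion.
subprocess.run([sys.executable, "-c", CHECK])

# stdin=subprocess.PIPE -> the child sees a pipe and loses readline features.
subprocess.run([sys.executable, "-c", CHECK], stdin=subprocess.PIPE)
```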
Other notes
- `ptyprocess` does get these features in subprocs as we'd expect, but ideally we aren't spending time wrapping this with `trio` since it still won't work for the network remote debugging cases.
- the `stdin_relay_chars` branch.
- `ptyprocess` (from above).
- `stty` for checking local tty settings (a rough Python equivalent is sketched after this list).
- `rlcompleter` notes the `readline` limitation for any doubters.
- `epdb`'s server mode (which seems to have the best integration with `pdbpp`) has no notion of supporting this afaict.
- `trio`, on how to hook up subprocs for receiving input.
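The tty-settings check mentioned above, as a rough Python analogue of `stty -a` (a sketch only):

```python
import sys
import termios

def show_tty_settings() -> None:
    fd = sys.stdin.fileno()
    try:
        iflag, oflag, cflag, lflag, ispeed, ospeed, cc = termios.tcgetattr(fd)
    except termios.error as exc:   # e.g. stdin is a pipe, not a tty
        print(f"no tty on stdin: {exc}")
        return
    print("echo enabled:   ", bool(lflag & termios.ECHO))
    print("canonical mode: ", bool(lflag & termios.ICANON))

show_tty_settings()
```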
Remote debugging possible hacks or solutions
- `python-remote-pdb` offers a `readline` hack using `nc` or `socat` which may be usable in the near term (a hedged usage sketch follows this list).
- replicate what `ptyprocess` has done from the `tractor` spawning machinery while keeping compat with the public `trio` api.
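A sketch of the `python-remote-pdb` approach mentioned in the first bullet; the host/port are arbitrary and the exact `nc`/`socat` invocations may vary by platform:

```python
# In the (remote) process you want to debug:
from remote_pdb import RemotePdb

RemotePdb("127.0.0.1", 4444).set_trace()   # serves a pdb REPL over TCP

# Then, from the local machine, attach with one of (shell commands shown
# as comments since this issue is about the Python side):
#   nc 127.0.0.1 4444
#   socat readline tcp:127.0.0.1:4444   # socat's readline gives local line editing
```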
Ideally
- `ptk` to support all its features without requiring a tty whatsoever, and then being able to simply talk to a `tractor` actor running it (a rough illustration of driving `ptk` without a tty follows).
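The illustration referenced above: with recent `prompt_toolkit` versions (where `create_pipe_input()` is a context manager), a `PromptSession` can be wired to explicit input/output objects instead of the controlling tty, which is exactly the property a remote-debugger REPL needs. This is a standalone demo; no `tractor` integration is implied.

```python
from prompt_toolkit import PromptSession
from prompt_toolkit.input import create_pipe_input
from prompt_toolkit.output import DummyOutput

with create_pipe_input() as pipe_input:
    pipe_input.send_text("hello\n")                  # simulate remote keystrokes
    session = PromptSession(input=pipe_input, output=DummyOutput())
    line = session.prompt("> ")                      # reads from the pipe, not a tty
    print(line)                                      # -> "hello"
```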