
Performance of refresh is still subpar #134

Closed
Tyriar opened this issue Jun 16, 2016 · 7 comments


Tyriar commented Jun 16, 2016

Follow up from #131 (comment)

It looks like refresh and layout are still taking far too long in a real application scenario. Consider the following portion of a Chromium timeline from running ls -lR /usr/share in VS Code: the process takes around 3.5s to run, but it's ~25s until the output finishes rendering in the UI.

[screenshot: Chromium timeline of the full run]

My initial thought was that parsing the incoming data in Terminal.prototype.write was taking too long, but this timeline seems to refute that: all of the time is spent in refresh and the functions/events it triggers.

Here is the timeline for a single refresh:

[screenshot: Chromium timeline of a single refresh]

And zoomed in even more to the portion of refresh that gets repeated:

[screenshot: zoomed-in view of the repeated portion of refresh]

The basic flow of the section that repeats is:

  1. Blur/focusout events
  2. Recalc style
  3. Lots of separate parse HTML sections
  4. Recalc style
  5. Layout
  6. Focus/focusin events
  7. Recalc style
  8. Layout

Looking at this, refresh's performance could certainly still be improved. In the meantime, another safeguard worth having is to skip refreshes while a refresh is already in progress.
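That safeguard could be sketched roughly as follows. This is a minimal sketch with illustrative names (`RefreshGate`, `request` are not xterm.js internals): a flag marks a refresh as in progress, and requests arriving while it runs are coalesced into a single follow-up refresh rather than starting over.

```javascript
// Sketch: skip refresh requests while one is already running, but
// remember that a request arrived so a final refresh still happens.
class RefreshGate {
  constructor(render) {
    this._render = render;      // the expensive refresh work
    this._inProgress = false;
    this._pending = false;
  }

  request() {
    if (this._inProgress) {
      // A refresh is running; coalesce this request into one follow-up.
      this._pending = true;
      return;
    }
    this._inProgress = true;
    do {
      this._pending = false;
      this._render();           // may call request() re-entrantly
    } while (this._pending);    // run at most one catch-up refresh per burst
    this._inProgress = false;
  }
}
```

However many requests pile up during a refresh, only one catch-up refresh runs afterwards, so the work can no longer compound.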


Tyriar commented Jun 16, 2016

@parisk I can have a go at fixing this since I have the most context.


parisk commented Jun 17, 2016

@Tyriar I think that we are already skipping such refreshes in a6e85ad#diff-d4f745d6f29faa0617f2e02bb95f0c44R1206. Do you mean something else?

Feel free to give it a try if you would like. What approach are you thinking of taking?


Tyriar commented Jun 17, 2016

@parisk using a DocumentFragment to prevent layout thrashing. I also believe that right now, if a refresh takes longer than 34ms, a new one may start before it finishes; i.e., this._refreshIsQueued = false; should be set at the end of refresh.
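The DocumentFragment idea could look roughly like this. It is only a sketch: since DocumentFragment exists in the browser, a tiny stub node stands in for the DOM here so the number of live-DOM mutations is observable; in a browser, the fragment would be document.createDocumentFragment() and the rows real elements, and appending the fragment once avoids triggering layout per row.

```javascript
// Minimal stand-in for DOM nodes so the sketch runs outside a browser.
// It only counts appendChild calls; it does not model child-moving
// semantics of real fragments.
function makeNode() {
  return {
    children: [],
    appendCalls: 0,               // how many times this node was mutated
    appendChild(child) {
      this.children.push(child);
      this.appendCalls++;
      return child;
    },
  };
}

// Naive approach: append every row directly to the live container,
// giving the browser a chance to recalc style/layout after each append.
function renderNaive(container, rows) {
  for (const row of rows) container.appendChild({ text: row });
}

// Fragment approach: build all rows in a detached fragment first,
// then append once; only one live-DOM mutation occurs.
function renderBatched(container, rows) {
  const fragment = makeNode();    // stands in for createDocumentFragment()
  for (const row of rows) fragment.appendChild({ text: row });
  container.appendChild(fragment);
}
```

The point is the mutation count on the live container: one append instead of one per row, so the recalc-style/layout cycle from the timeline above can only fire once per refresh.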


jerch commented Jun 17, 2016

I had the same problem with https://github.com/netzkolchose/jquery.browserterminal and played a lot with the performance tips floating around. What helped increase output performance and interactivity:

  • don't use divs; use a pre as the terminal container with spans — pre has a much smaller layout penalty
  • always reflow late (e.g. use fragments and avoid height/width calculations in between)
  • cache everything that hasn't changed (fragment cache)
  • split big input data into chunks (e.g. 32k bytes) and move the chunk processing onto the event loop to keep the terminal snappy; trigger an output refresh only at the end of each chunk
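The last point could be sketched as follows (illustrative names, assuming hypothetical parseChunk/refresh callbacks for whatever parsing and rendering the terminal does): split the data into fixed-size chunks and process one chunk per event-loop turn, so keystrokes and other events get a chance to run between chunks.

```javascript
const CHUNK_SIZE = 32768; // ~32k bytes per chunk, as suggested above

// Split a data string into CHUNK_SIZE-sized pieces.
function splitIntoChunks(data, size = CHUNK_SIZE) {
  const chunks = [];
  for (let i = 0; i < data.length; i += size) {
    chunks.push(data.slice(i, i + size));
  }
  return chunks;
}

// Process one chunk per event-loop turn; refresh only once at the end.
function writeChunked(data, parseChunk, refresh) {
  const chunks = splitIntoChunks(data);
  function next() {
    const chunk = chunks.shift();
    if (chunk === undefined) {
      refresh();           // single output refresh after all chunks
      return;
    }
    parseChunk(chunk);
    setTimeout(next, 0);   // yield to the event loop between chunks
  }
  next();
}
```

Yielding between chunks is what keeps the terminal responsive; deferring the refresh to the end avoids paying the expensive render once per chunk.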


jerch commented Jun 20, 2016

Slightly offtopic:
Chunk splitting can only partly solve the problem of CTRL+C being recognized during an output-heavy task, since it gives keystrokes a chance to be sent to the PTY between chunks.

The main problem with CTRL+C and heavy output is the buffering situation, and in particular the way pty.js is implemented. pty.js tries to be non-blocking, following the usual callback/event paradigm of node, while the PTY system is normally a blocking kernel device interface. The latter is the reason other terminal emulators can react so quickly: as soon as some buffer in the chain is full, the whole pipe simply halts until the emulator has consumed more bytes (the halt includes the running program as well, which is the reason for blocking STDIN/STDOUT in the first place). The internal buffer sizes are around 4096 bytes (up to 64k), so the emulator appears to react instantly. pty.js instead just grows its internal read buffer and propagates more and more bytes to the next callback invocation. (This can lead to out-of-sync problems between the terminal and the running program.)

Since the PTY system is based on a serial line abstraction, maybe XOFF/XON or the baud rate setting could help circumvent the problem. Another approach would be to switch pty.js into blocking mode, but that needs a proper abstraction on the node/JS side to avoid breaking the non-blocking callback paradigm.
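The XOFF/XON idea could be sketched like this (a hypothetical sketch, not the pty.js API; names and watermark values are illustrative): when the terminal's backlog of unparsed bytes crosses a high watermark, send XOFF (DC3, 0x13) toward the PTY to pause the slave program; once the backlog drains below a low watermark, send XON (DC1, 0x11) to resume it.

```javascript
const XON = '\x11';  // DC1: resume transmission
const XOFF = '\x13'; // DC3: pause transmission

class FlowControl {
  constructor(writeToPty, highWater = 65536, lowWater = 4096) {
    this._write = writeToPty;  // sends control bytes toward the PTY
    this._high = highWater;    // backlog size that triggers XOFF
    this._low = lowWater;      // backlog size that triggers XON
    this._backlog = 0;         // bytes received but not yet parsed
    this.paused = false;
  }

  onData(byteCount) {          // terminal received byteCount bytes
    this._backlog += byteCount;
    if (!this.paused && this._backlog >= this._high) {
      this.paused = true;
      this._write(XOFF);       // ask the slave side to stop sending
    }
  }

  onProcessed(byteCount) {     // terminal finished parsing byteCount bytes
    this._backlog = Math.max(0, this._backlog - byteCount);
    if (this.paused && this._backlog <= this._low) {
      this.paused = false;
      this._write(XON);        // backlog drained; let it flow again
    }
  }
}
```

Using separate high and low watermarks (rather than a single threshold) avoids rapidly toggling pause/resume when the backlog hovers around the limit.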


jerch commented Jun 21, 2016

XOFF/XON solves the CTRL+C-plus-big-input-data problem, as you can try with this patch: https://gist.github.com/jerch/4393f100562b602ee1cc030cc9fa1f63
The slave program is paused while the data is processed by the terminal. It therefore runs much slower, but at least almost in sync with the terminal (the terminal's speed is the bottleneck, though that can be improved).


Tyriar commented Dec 31, 2016

This no longer appears to be happening, see #290, #150 and #280 for further refresh perf investigations.

Tyriar closed this as completed Dec 31, 2016