Shell: Cat a large file (144MB, for example) and the shell app freezes while all the characters are being printed by JavaScript #585
Comments
➤ Jeremy Nicklas commented: Ticket is here: INC0313174 ( https://oarnet.service-now.com/nav_to.do?uri=incident.do?sys_id=b7ecae1813517600804831f18144b071 ). The slowdown is caused by the repeated printing of data to the screen (screenshot: https://cloud.githubusercontent.com/assets/4260509/23470289/05ef4182-fe74-11e6-8d90-9b4429366914.png ). One possible way to speed this up is to build an internal buffer before printing, so that we call the print function less often.
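A minimal sketch of that buffering idea, assuming the app receives output in chunks and renders through an hterm-style `io.print()`; the class name, the 50 ms flush interval, and the `term` interface are illustrative, not the Shell app's actual code:

```javascript
// Collect incoming chunks and flush them on a timer, so the print function is
// called once per batch instead of once per chunk.
class OutputBuffer {
  constructor(term, flushMs = 50) {
    this.term = term;      // assumed to expose term.io.print(), as hterm does
    this.chunks = [];
    this.timer = null;
    this.flushMs = flushMs;
  }

  push(data) {
    this.chunks.push(data);
    if (this.timer === null) {
      // Schedule a single flush rather than printing immediately.
      this.timer = setTimeout(() => this.flush(), this.flushMs);
    }
  }

  flush() {
    this.timer = null;
    if (this.chunks.length === 0) return;
    const text = this.chunks.join('');
    this.chunks = [];
    this.term.io.print(text);  // one print call for the whole batch
  }
}
```

A WebSocket `onmessage` handler would then call `buffer.push(event.data)` instead of printing each message directly.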
➤ Jeremy Nicklas commented: Maybe look into throttling or debouncing.
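A generic throttle sketch, not the Shell app's actual render path; for terminal output it would need to be combined with the buffering above, since a plain throttle drops intermediate calls rather than queuing their data:

```javascript
// Invoke fn at most once per waitMs, with one trailing call for anything that
// arrived during the quiet window.
function throttle(fn, waitMs) {
  let last = 0;
  let pending = null;
  return function (...args) {
    const now = Date.now();
    const remaining = waitMs - (now - last);
    if (remaining <= 0) {
      last = now;
      fn.apply(this, args);
    } else if (pending === null) {
      pending = setTimeout(() => {
        pending = null;
        last = Date.now();
        fn.apply(this, args);
      }, remaining);
    }
  };
}

// Hypothetical usage: flush the output buffer at most every 50 ms.
const throttledFlush = throttle(() => buffer.flush(), 50);
```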
➤ Eric Franz commented: The two main problems are:
Options for further exploration.
FWIW, we have only had one complaint about this since the inception of OnDemand, and it was with an app (Apache Spark, perhaps?) where commands output large amounts of data.
➤ Jeremy Nicklas commented: Possibly look into an ack workflow to slow down the flood of data, where we send an ack after rendering the current data in order to receive more.
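A rough sketch of what that ack loop could look like over the WebSocket, assuming the server can pause whatever stream feeds the terminal; the `'ack'` message type and the `stream` variable here are assumptions, not the existing protocol:

```javascript
// Client side: render the chunk we have, then tell the server to send more.
socket.onmessage = (event) => {
  term.io.print(event.data);                      // render current data
  socket.send(JSON.stringify({ type: 'ack' }));   // ask for the next chunk
};

// Server side (Node.js sketch): stop reading from the process output stream
// until the client confirms it has rendered the previous chunk.
stream.on('data', (data) => {
  socket.send(data);
  stream.pause();    // back-pressure: no more data until the ack arrives
});
socket.on('message', (msg) => {
  if (JSON.parse(msg).type === 'ack') stream.resume();
});
```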
xterm.js has a solution for this.
The easiest path forward might be to move from hterm to xterm.js.
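For reference, xterm.js's `Terminal.write()` accepts a completion callback, which makes client-driven flow control possible. A hedged sketch of a watermark approach built on that callback (the socket endpoint, container element, watermark values, and the `'pause'`/`'resume'` control messages are all assumptions):

```javascript
import { Terminal } from 'xterm';

const term = new Terminal();
term.open(document.getElementById('terminal'));            // hypothetical container

const socket = new WebSocket('wss://example.test/shell');  // hypothetical endpoint
const HIGH_WATERMARK = 100000;  // pause the backend above this many unprocessed bytes
const LOW_WATERMARK = 10000;    // resume once the renderer has caught up
let pendingBytes = 0;
let paused = false;

socket.onmessage = (event) => {
  pendingBytes += event.data.length;
  if (!paused && pendingBytes > HIGH_WATERMARK) {
    paused = true;
    socket.send(JSON.stringify({ type: 'pause' }));   // hypothetical control message
  }
  // The callback fires after xterm.js has processed this chunk.
  term.write(event.data, () => {
    pendingBytes -= event.data.length;
    if (paused && pendingBytes < LOW_WATERMARK) {
      paused = false;
      socket.send(JSON.stringify({ type: 'resume' }));
    }
  });
};
```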
In the short term we should consider documenting this limitation somewhere. It seems to make it difficult or impossible to work with the Apache Spark shell via hterm, because commands often print large amounts of data.