(Is #173 accurate?) Changing refresh-rate seems to have an effect on input lag with experimental-backends enabled #588
Comments
Hmm, after taking a quick look through the code (using ripgrep plus some manual file viewing in VS Code), it appears that refresh-rate only matters if sw-opti is turned on, which it clearly isn't here (unless Manjaro's default installation ships with other settings enabled somewhere). If that's correct, I'm not sure where the improved input lag came from, or whether it was just illusory. Unfortunately, I don't have a measuring device to test objectively.

Though, if the improvement was real and didn't come from changing refresh-rate, could it possibly have had anything to do with picom restarting every time the config is changed? I'm clearly not familiar enough with the codebase to tell whether any of this is true; I'm actually more confused now about what this has to do with the experimental backends (unless sw-opti being deprecated is related to the new backends; I'm not sure, as I haven't checked that).
OK, I'm almost certain that what I experienced earlier was either an illusion or caused by picom restarting upon changing the settings (which somehow affected input lag); my unfamiliarity with the code base is the only thing holding me back from saying this absolutely. This can be closed once confirmed (and I can open a separate issue if the below is worth exploring).

I went down a rabbit hole trying to figure out if there were other solutions to input lag; I didn't notice until looking more closely at the code and commits that glFinish() has already been in temporary use for NVIDIA since April. After some searching, I discovered related issues from KDE [1], [2] and tried implementing the same change from [1] into picom, since KDE removed the usleep for allegedly better performance, using a cap of 1 on __GL_MaxFramesAllowed in my patched version.

@yshui Just wondering if you've tried setting __GL_MaxFramesAllowed as an alternative fix? I no longer assume it's much, much better with regard to input lag, though I could do some longer testing (even though the results would admittedly be quite subjective). Personally, after going through that rabbit hole, I've switched over to using ForceFullCompositionPipeline via the NVIDIA driver for now.

EDIT: I went back to trying my patched version again and plan to use it for a bit (since setting ForceFullCompositionPipeline seems to have the added complication of doing it manually on every launch; allegedly, there are some issues when setting it in an Xorg conf file). Though, theoretically speaking (as I understand it now), a lower value for __GL_MaxFramesAllowed should give better input lag. Also, I should note that I've been using a very subjective test (and possibly a terrible idea to begin with) of typing fast to measure input lag, so it's very difficult for me to tell whether it's truly picom input lag, application lag, or even keyboard lag.
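For anyone wanting to try the two workarounds discussed above, roughly (the exact command forms are my own, not from this thread; `__GL_MaxFramesAllowed` is an NVIDIA driver environment variable and must be set before the GL context is created, and the `nvidia-auto-select` mode name is an assumption to adjust for your setup):

```sh
# Cap the NVIDIA driver's render-ahead queue to one frame for this
# picom instance only (fewer queued frames = potentially lower input lag):
__GL_MaxFramesAllowed=1 picom --experimental-backends

# Alternatively, enable ForceFullCompositionPipeline at runtime via the
# driver instead of relying on picom's vsync:
nvidia-settings --assign CurrentMetaMode="nvidia-auto-select +0+0 { ForceFullCompositionPipeline = On }"
```

The runtime `nvidia-settings` route is what I meant by having to redo it manually per launch; persisting it in an Xorg conf file is possible but allegedly has its own issues.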
Closed. I think this was mostly a fluke; perhaps I did actually notice an improvement in input lag, but one caused by restarting picom or something related to the USLEEP flag. I've started a new issue to discuss using __GL_MaxFramesAllowed and other possible improvements to frame timing and input lag.
Platform
Manjaro Linux x86_64 (Kernel 4.19.167-1-MANJARO)
GPU, drivers, and screen setup
NVIDIA TU104 (GeForce RTX 2080 Rev. A), driver version 460.32.03
Environment
i3-gaps
picom version
Configuration:
Steps of reproduction
1. Run picom --experimental-backends
2. Change refresh-rate from 60 to 120 and then to 180 (preferably on a 60Hz monitor)
3. Compare input lag between the refresh-rate settings.

Expected behavior
According to #173, the refresh-rate option should be ignored.

Current Behavior
Changing refresh-rate seems to affect the input latency, in that it is markedly improved (for NVIDIA) after switching from 60 to higher settings like 180. (I'm currently using 180 to type this out, and it is a much, much better experience compared to the frequent typos/lags caused by my typing speed apparently outpacing the latency introduced by picom.)

Some other notes
The TL;DR is that I'm wondering whether #173 is accurate, as there seems to be a noticeable change in input lag, at least to my subjective eyes. It's also possible I'm not running picom correctly; it should be started by i3 when the config is read:
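For reference, a typical i3 config line for launching picom this way (a hypothetical example; the reporter's actual config line was not included in the scrape):

```
exec_always --no-startup-id picom --experimental-backends
```

exec_always (rather than exec) restarts picom on every i3 reload, which is worth noting given the suspicion above that picom restarts themselves may have affected perceived input lag.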
According to #173, it seems that refresh-rate is slated for removal soon. I'm not sure whether refresh-rate affecting input latency is due to a change in the code in the time that has passed since #173, or because the option still affects the code in some ways (but not in other, already-removed ways).

I asked in #543 about improvements for NVIDIA drivers using the experimental backends, mainly due to this issue of input lag (it was pretty unbearable, but so was tearing without vsync turned on for picom), and it seems there are still no improvements slated to land soon for NVIDIA (other than #147, which now seems to have been pushed back?).
While I would be happy to see this option removed for the sake of improving picom, I would be very concerned if what remains of refresh-rate were removed before input lag changes are made, since this is the main 'improvement' that has made picom much more usable for me right now.