I was running everything on the same machine (gputop server & UI, and the test case), with an up-to-date Ubuntu 16.04 and the latest Mesa git version.
Test-case:
1. Start gputop-ui
2. "Multi contexts" -> "RCS usage"
3. Start GpuTest v0.7 FurMark in windowed mode (I think any heavy 3D test would do)
4. Wait until "RCS usage" has stabilized
5. Drag the "OS visible sampling" slider from 7s to 2s with the mouse
At step 3), GpuTest takes all the GPU idle time, and the other RCS percentages decrease as the GPU ramps to full speed. The change in the shown percentages takes many seconds, although in reality the change happens instantly. I assume this view uses some kind of running average over the whole 7s. While this "animates" the transitions, it will hide faster changes. IMHO averaging should by default be over a smaller period.
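Just to illustrate what I mean (a toy standalone calculation, nothing gputop-specific): if the displayed value is an average over the whole 7s window, a load that jumps instantly from 0% to 100% busy shows up as a linear ramp that only reaches 100% after 7s.

```c
#include <stdio.h>

/*
 * Illustrative sketch only (not gputop code): if the displayed "RCS usage"
 * is an average over a W-second visible window, then t seconds after an
 * instantaneous jump from 0% to 100% busy, the window still contains only
 * t seconds of busy time, so the displayed value ramps up linearly over W.
 */
int main(void)
{
    const double window = 7.0;  /* assumed 7s visible window */

    for (double t = 0.0; t <= 8.0; t += 1.0) {
        double busy = t < window ? t / window : 1.0;  /* averaged busy fraction */
        printf("t=%.0fs  displayed usage=%.0f%%\n", t, busy * 100.0);
    }
    return 0;
}
```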
To see transitions more easily, I changed the sampling to 2s in step 5). After the animation, "RCS usage" showed 70% idle, although the GPU is still 100% busy. This is obviously a bug.
The RCS usage is computed based on the visible part of the timeline.
If you zoom in on part of the timeline, the usage is computed just for that zoomed range; by default it covers the whole 7s timeline.
It's not really obvious, although the usage window displays the amount of time used to compute the percentages.
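Roughly speaking (a minimal sketch with hypothetical names, not gputop's actual code), the displayed percentage would be the busy time clamped to the visible window divided by the window length, with idle being the remainder:

```c
#include <stdint.h>
#include <stdio.h>

/* Hypothetical representation of one busy span on the timeline. */
struct busy_interval { uint64_t start_ns, end_ns; };

/* Sum the busy time that falls inside the visible window and return it
 * as a percentage of the window length. */
static double
visible_busy_percent(const struct busy_interval *iv, int n_iv,
                     uint64_t win_start_ns, uint64_t win_end_ns)
{
    uint64_t busy_ns = 0;

    for (int i = 0; i < n_iv; i++) {
        /* clamp each busy interval to the visible window */
        uint64_t s = iv[i].start_ns > win_start_ns ? iv[i].start_ns : win_start_ns;
        uint64_t e = iv[i].end_ns   < win_end_ns   ? iv[i].end_ns   : win_end_ns;
        if (e > s)
            busy_ns += e - s;
    }

    return 100.0 * busy_ns / (double)(win_end_ns - win_start_ns);
}

int main(void)
{
    /* one context busy for the last 2s of a 7s visible window */
    struct busy_interval iv[] = { { 5000000000ull, 7000000000ull } };

    printf("visible busy: %.1f%%\n",
           visible_busy_percent(iv, 1, 0, 7000000000ull));  /* ~28.6% */
    return 0;
}
```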
Although the "animation" / slowness looks a bit odd, I don't care about it that much. But it showing GPU as being mostly idle when GPU remains 100% busy (also according to OA counter), is a bug.