Gazebo (gzclient) causes high CPU with an empty world #1560
Comments
Original comment by Nate Koenig (Bitbucket: Nathan Koenig). Do you have an integrated graphics card, or are you using software rendering?
Original comment by Chris Spencer (Bitbucket: chrisspen). It's an Intel Macbook, so I believe it has an integrated graphics card with an Nvidia chipset.
Original comment by Steve Peters (Bitbucket: Steven Peters, GitHub: scpeters). So you're running Ubuntu with dual-boot and not a virtual machine?
Original comment by Chris Spencer (Bitbucket: chrisspen). Sort of. I'm running Ubuntu as the sole OS.
Original comment by Steve Peters (Bitbucket: Steven Peters, GitHub: scpeters). My MacBook Pro has both Intel and Nvidia GPUs, and it switches between them. I'm also able to see shadows without CPU spikes. If shadows cause your CPU to spike, I'm wondering if it's even using the GPUs at all?
Original comment by Chris Spencer (Bitbucket: chrisspen). I tried purging all Nvidia drivers from my system and using the open-source Nouveau driver instead, and that seems to have helped a little with the CPU usage of an idle gazebo.
Unfortunately, now gzclient doesn't render anything. About 90% of the time, all I see is a black square where the scene should be. If I kill and restart it, occasionally it will render correctly, but this is rare. This happened a couple of times while using the Nvidia driver, but it usually rendered fine, just with very high CPU usage. Is there any way to fix that without reverting to the Nvidia driver? Overall, OpenGL performance seems to be a lot better than with the Nvidia driver: glxgears under Nvidia ran at 60 FPS, but now it runs at 800 FPS.
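For anyone trying to reproduce this, a quick way to confirm which driver and renderer OpenGL is actually using is sketched below (`glxinfo` ships in the mesa-utils package on Ubuntu; package names may vary by distribution):

```
# Show the vendor/renderer/version strings OpenGL reports:
glxinfo | grep -E "OpenGL (vendor|renderer|version)"

# List the GPUs the kernel sees:
lspci | grep -iE "vga|3d"

# Check whether the nouveau or proprietary nvidia kernel module is loaded:
lsmod | grep -E "nouveau|nvidia"
```

If the renderer string reports a software rasterizer such as llvmpipe, gzclient is rendering on the CPU, which would explain the load.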
Original comment by Chris Spencer (Bitbucket: chrisspen). Since it looks like this is probably a Gnome-Shell/Nvidia bug related to OpenGL, and not the fault of Gazebo, this can be closed.
Original comment by Steve Peters (Bitbucket: Steven Peters, GitHub: scpeters). Thanks for reporting this, even though we weren't able to help.
Original comment by Chris Spencer (Bitbucket: chrisspen). Just upgraded to Ubuntu 14.04, which has a nouveau driver that implements OpenGL 3, and I'm back to seeing huge CPU usage from Gazebo, even when paused and with nothing loaded. So the common element seems to be OpenGL 3.*, and this may not be an Nvidia-specific problem. Should this be reopened?
Original comment by Steve Peters (Bitbucket: Steven Peters, GitHub: scpeters). I think it's good that you've posted the problem and are adding updates, but I won't be able to debug this any time soon. We can re-open if someone volunteers to look into it or submits a pull request.
Original comment by Jose Luis Rivero (Bitbucket: Jose Luis Rivero, GitHub: j-rivero). Umm, interesting. I've been able to reproduce the problem on my Trusty (14.04) system running gnome-shell, under both the Intel integrated driver and nvidia-331 using optirun. gzclient is the process behind this high CPU usage. I was able to see it in gazebo5 and gazebo4, but not in gazebo3. We need to check whether this also happens on non-gnome-shell systems.
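For Optimus (Intel + Nvidia) setups like the one described above, a quick comparison is to run the same renderer check with and without optirun (a sketch, assuming `optirun` from the bumblebee package as in the configuration mentioned here):

```
glxinfo | grep "OpenGL renderer"           # default: integrated Intel GPU
optirun glxinfo | grep "OpenGL renderer"   # discrete Nvidia GPU via bumblebee
optirun gazebo                             # launch gazebo on the Nvidia GPU
```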
Original comment by Chris Spencer (Bitbucket: chrisspen). I just had to revert to the Nvidia drivers because the Nouveau drivers don't support suspend/resume, and the effect on performance is severe. With Nouveau, CPU usage is high but gnome-shell is still responsive; with Nvidia, after a few seconds the entire UI is essentially frozen. And this is with Gazebo 5 on Ubuntu 14.04 using nvidia-340.
Original comment by Jose Luis Rivero (Bitbucket: Jose Luis Rivero, GitHub: j-rivero). Chris, are you experiencing the same effect using nvidia-331? Does the problem persist on a non-gnome-shell desktop? With respect to Nouveau, I've never given it a try, but I would expect it not to work just out of the box. Let's focus on the Nvidia problem for now. Feel free to open a new issue for Nouveau if you are interested in that support.
Original comment by Chris Spencer (Bitbucket: chrisspen). Yes, it seems to happen with all Nvidia drivers, and the problem seems equally bad in Unity. I agree about Nouveau; I tested it just to see if the problem affected drivers other than Nvidia.
Original comment by Jose Luis Rivero (Bitbucket: Jose Luis Rivero, GitHub: j-rivero). We have found several problems/issues that lead to 100% CPU usage on an empty world. After observing the callgrind results, it seems like most of the time used by gzclient goes to sky generation. Work to disable sky creation when no sky is defined in the SDF is tracked under issue #1579. The other issue is the rendering refresh rate: if we don't control the rate, I would expect the computer to render as fast as it can, which probably drives the CPU up to 100%.
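To illustrate what issue #1579 targets, here is a minimal empty world whose SDF defines no <sky> element; once that work lands, launching such a world should skip sky generation entirely (a sketch, assuming the standard Gazebo model database provides the included models):

```
cat > /tmp/no_sky.world <<'EOF'
<?xml version="1.0"?>
<sdf version="1.5">
  <world name="default">
    <!-- no <scene><sky> element: sky creation should be skipped -->
    <include><uri>model://ground_plane</uri></include>
    <include><uri>model://sun</uri></include>
  </world>
</sdf>
EOF
gazebo /tmp/no_sky.world
```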
Original comment by Dirk Thomas (Bitbucket: Dirk Thomas, GitHub: dirk-thomas). I have the same problem on Xenial with a Lenovo P50 (Nvidia + Intel graphics); it appears as soon as I start gazebo.
Original comment by Dirk Thomas (Bitbucket: Dirk Thomas, GitHub: dirk-thomas). The referenced PR doesn't fix the problem for me.
Original comment by Василий Самарин (Bitbucket: Aiven92). If you rotate the scene camera to view the ground plane from the bottom (along the -z axis), the CPU load from gzclient becomes smaller. This works in VirtualBox.
Original comment by Matias N. (Bitbucket: v01d). I have the same problem: 100% CPU from gzclient and about 50% from gzserver. This is on an Intel i7 laptop with an integrated Intel HD 4000 (using the intel driver). This also happens on my other computer (i5, Intel HD 3000). Are there any workarounds or settings I can test? I don't know how to proceed.
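One workaround worth testing, based on the original report's observation that disabling shadows lowers CPU usage: turn shadows off in the world file itself, which also sidesteps the GUI resetting the value (a sketch; `<scene><shadows>` is standard SDF, but whether it helps on a given driver is only an assumption):

```
cat > /tmp/no_shadows.world <<'EOF'
<?xml version="1.0"?>
<sdf version="1.5">
  <world name="default">
    <scene>
      <!-- disable shadow rendering -->
      <shadows>false</shadows>
    </scene>
    <include><uri>model://ground_plane</uri></include>
    <include><uri>model://sun</uri></include>
  </world>
</sdf>
EOF
gazebo /tmp/no_shadows.world
```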
Can confirm: Ubuntu 20 on KVM + virt-manager (with a Debian host OS), using Gazebo 11 on an Intel i7. An empty world drains the CPU, even after turning off the ODE physics engine.
Original report (archived issue) by Chris Spencer (Bitbucket: chrisspen).
Running `gazebo` immediately causes nearly 100% CPU usage, even though I have no models loaded and have the simulation paused. Running `top` shows, on average, how the CPU usage breaks down. Obviously, the big offender is gnome-shell, but as soon as I quit gazebo, gnome-shell CPU usage drops to 3%.
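A convenient way to reproduce this measurement is to point `top` at just the relevant processes (a sketch; assumes `pgrep` is available and a single Gazebo instance is running):

```
top -p "$(pgrep -d, -f 'gnome-shell|gzserver|gzclient')"
```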
My system is a dual-core Macbook Pro.
Another odd thing I've noticed: if I set "shadows" to "false" under World->Scene, then CPU usage immediately drops, although it is still a bit high. However, if I click away and then back to World->Scene, the "shadows" value is reset to "true".
This seems related to my graphics driver. If I install gazebo on another 12.04 machine running OpenGL 3.0 (Mesa 10.1.3), gazebo runs fine with almost no CPU usage.