This repository has been archived by the owner on Feb 3, 2025. It is now read-only.

Gazebo (gzclient) causes high CPU with an empty world #1560

Open
osrf-migration opened this issue Apr 9, 2015 · 29 comments
Labels: 3.0, bug, common, major

Comments

@osrf-migration

Original report (archived issue) by Chris Spencer (Bitbucket: chrisspen).


Running gazebo immediately causes nearly 100% CPU usage, even though I have no models loaded and have the simulation paused.

Running top shows, on average, that the CPU usage breaks down like:

80% /usr/bin/gnome-shell
60% gzclient
20% Xorg
10% gzserver

Obviously, the big offender is gnome-shell, but as soon as I quit gazebo, gnome-shell CPU usage drops to 3%.

My system is a dual-core Macbook Pro running:

Ubuntu 12.04
OpenGL 3.3.0 (NVIDIA 331.113 driver)
Gazebo 4.1.1 (installed from the "Step-by-step" instructions at http://gazebosim.org/tutorials?tut=install_ubuntu&cat=installation)

Another odd thing I've noticed is that if I set "shadows" to "false" under World->Scene, then CPU usage immediately drops, although it is still a bit high. The breakdown is roughly:

20% gnome-shell
20% gzclient
10% gzserver

However, if I click away and then back to World->Scene, the "shadows" value is reset to "true".
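
Since the GUI toggle doesn't stick, here is a minimal sketch of persisting that setting in a world file instead (the file name is made up, and this assumes the <shadows> element inside <scene> in the world SDF behaves like the GUI checkbox):

# Write a hypothetical world with shadows disabled, then launch it paused
cat > no_shadows.world <<'EOF'
<?xml version="1.0"?>
<sdf version="1.5">
  <world name="default">
    <scene>
      <!-- keep shadows off without relying on the GUI toggle -->
      <shadows>false</shadows>
    </scene>
    <include><uri>model://ground_plane</uri></include>
    <include><uri>model://sun</uri></include>
  </world>
</sdf>
EOF
gazebo --pause no_shadows.world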

This seems related to my graphics driver. If I install Gazebo on another 12.04 machine running OpenGL 3.0 (Mesa 10.1.3), it runs fine with almost no CPU usage.

@osrf-migration

Original comment by Chris Spencer (Bitbucket: chrisspen).


  • Edited issue description


@osrf-migration

Original comment by Nate Koenig (Bitbucket: Nathan Koenig).


Do you have an integrated graphics card, or are you using software rendering?

@osrf-migration

Original comment by Chris Spencer (Bitbucket: chrisspen).


It's an Intel Macbook, so I believe it has an integrated graphics card with an Nvidia chipset.

@osrf-migration

Original comment by Steve Peters (Bitbucket: Steven Peters, GitHub: scpeters).


So you're running Ubuntu with dual-boot and not a virtual machine?

@osrf-migration

Original comment by Chris Spencer (Bitbucket: chrisspen).


Sort of. I'm running Ubuntu as the sole OS.

@osrf-migration

Original comment by Steve Peters (Bitbucket: Steven Peters, GitHub: scpeters).


My MacBook Pro has both Intel and Nvidia GPUs, and it switches between them. I'm also able to see shadows without CPU spikes. If shadows cause your CPU to spike, I wonder whether the GPU is being used at all.
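
For what it's worth, a quick way to check which renderer the X session is actually using (assuming the mesa-utils and pciutils packages are installed):

# Show the OpenGL vendor/renderer/version actually in use
glxinfo | grep -i -E "opengl (vendor|renderer|version)"

# On a hybrid Intel/Nvidia laptop, list the GPUs and the kernel driver bound to each
lspci -k | grep -i -E -A 3 "vga|3d"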

@osrf-migration

Original comment by Chris Spencer (Bitbucket: chrisspen).


I tried purging all Nvidia drivers from my system and using the open-source Nouveau driver instead, and that seems to have helped a little.

glxinfo reports I now have OpenGL 2.1 Mesa 8.0.4, so it's older than Nvidia, but CPU usage is way down.

Now my CPU breakdown while running an idle gazebo looks like:

15% gzclient
10% gzserver
3% gnome-shell
1% Xorg

Unfortunately, now gzclient doesn't render anything.

About 90% of the time, all I see is a black square where the scene should be. If I kill and restart it, occasionally it will render correctly, but this is rare. This happened a couple times while using the Nvidia driver, but it usually rendered fine, just with very high CPU usage.

Is there any way to fix that without reverting to the Nvidia driver?

Overall, OpenGL performance seems to be a lot better than with the Nvidia driver. Glxgears under Nvidia ran at 60 FPS, but now it runs at 800 FPS.
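
One diagnostic that might help isolate the black viewport (not something tried in this thread; LIBGL_ALWAYS_SOFTWARE is a standard Mesa override): run only the client with software rendering. If the scene then draws correctly, the blank viewport points at the Nouveau 3D driver rather than at Gazebo itself.

# Start the server paused, then run the client on Mesa's software rasterizer (slow, diagnostic only)
gzserver --pause &
LIBGL_ALWAYS_SOFTWARE=1 gzclient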

@osrf-migration

Original comment by Chris Spencer (Bitbucket: chrisspen).


Since it looks like this is probably a Gnome-Shell/Nvidia bug related to OpenGL, and not the fault of Gazebo, this can be closed.

@osrf-migration

Original comment by Chris Spencer (Bitbucket: chrisspen).


  • changed state from "new" to "wontfix"

@osrf-migration

Original comment by Steve Peters (Bitbucket: Steven Peters, GitHub: scpeters).


Thanks for reporting this, even though we weren't able to help.

@osrf-migration

Original comment by Chris Spencer (Bitbucket: chrisspen).


Just upgraded to Ubuntu 14.04, which has a nouveau driver that implements OpenGL 3, and I'm back to seeing huge CPU usage from Gazebo, even when paused and with nothing loaded.

So the common element seems to be OpenGL 3.*, and may not be an Nvidia-specific problem. Should this be reopened?

@osrf-migration

Original comment by Steve Peters (Bitbucket: Steven Peters, GitHub: scpeters).


I think it's good that you've posted the problem and are adding updates, but I won't be able to debug this any time soon. We can re-open if someone volunteers to look into it or submits a pull request.

@osrf-migration

Original comment by Chris Spencer (Bitbucket: chrisspen).


  • changed state from "wontfix" to "open"

@osrf-migration

Original comment by Jose Luis Rivero (Bitbucket: Jose Luis Rivero, GitHub: j-rivero).


Umm interesting.

I've been able to reproduce the problem on my Trusty (14.04) system running gnome-shell, with both the Intel integrated driver and nvidia-331 using optirun. gzclient is the process behind this high CPU usage.

I was able to see it in gazebo5 and gazebo4, but not in gazebo3. We need to check whether this also happens on non-gnome-shell systems.

@osrf-migration

Original comment by Chris Spencer (Bitbucket: chrisspen).


I just had to revert to the Nvidia drivers because the Nouveau drivers don't support suspend/resume, and the effect on performance is severe. With Nouveau, CPU usage is high but gnome-shell is still responsive; with Nvidia, after a few seconds the entire UI is essentially frozen.

And this is with Gazebo 5 on Ubuntu 14.04 using Nvidia-340.

@osrf-migration

Original comment by Jose Luis Rivero (Bitbucket: Jose Luis Rivero, GitHub: j-rivero).


Chris, are you experiencing the same effect with nvidia-331? Does the problem persist on a non-gnome-shell desktop?

With respect to Nouveau, I've never given it a try, but I wouldn't expect it to work just out of the box. Let's focus on the Nvidia problem for now. Feel free to open a new issue for Nouveau if you are interested in that support.

@osrf-migration

Original comment by Chris Spencer (Bitbucket: chrisspen).


Yes, it seems to happen with all Nvidia drivers.

And the problem seems equally bad in Unity. Running gazebo --pause shows gzclient consuming 100% of the CPU.

I agree about Nouveau. I tested it just to see whether the problem affected drivers other than Nvidia.
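
For reference, roughly how that measurement can be reproduced (the sleep length is arbitrary, just long enough for the client to settle):

# Start a paused, empty world and sample per-process CPU once it settles
gazebo --pause &
sleep 15
top -b -n 1 | grep -E "%CPU|gzclient|gzserver"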

@osrf-migration

Original comment by Jose Luis Rivero (Bitbucket: Jose Luis Rivero, GitHub: j-rivero).


  • changed title from "Gazebo causes high CPU" to "Gazebo (gzclient) causes high CPU with an empty world"

@osrf-migration

Original comment by Jose Luis Rivero (Bitbucket: Jose Luis Rivero, GitHub: j-rivero).


We have found several problems that lead to 100% CPU usage with an empty world.

Looking at the callgrind results, most of the time used by gzclient seems to go into sky generation. Work to disable sky creation when no sky is defined in the SDF is tracked in issue #1579.

The other issue is the rendering refresh rate. If we don't control the rate, I would expect the client to render as fast as it can, which probably pushes the CPU up to 100%.
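
For anyone who wants to reproduce the profiling, a rough sketch of the callgrind run (assuming valgrind and kcachegrind are installed; the output file name is arbitrary):

# Terminal 1: server with an empty, paused world
gzserver --pause

# Terminal 2: profile the client for a minute or two, then close it normally
valgrind --tool=callgrind --callgrind-out-file=gzclient.callgrind gzclient

# Inspect where the client's time goes (sky generation, render loop, ...)
kcachegrind gzclient.callgrind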

@osrf-migration

Original comment by Dirk Thomas (Bitbucket: Dirk Thomas, GitHub: dirk-thomas).


I have the same problem on Xenial with a Lenovo P50 (Nvidia + Intel graphics). As soon as I start gzclient, the CPU is loaded and the UI only shows a few fps. This happens with the default empty world (which has no sky and no ground texture).

What we have tried so far:

  • updated to the latest Debian packages from packages.osrfoundation.org (7.3.1)
  • the Nvidia driver is the same as on a working system
  • glxgears runs flawlessly
  • when gzclient runs not on my machine but remotely on another machine, the UI renders normally and the load on my machine is normal (see the sketch after this list)
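
One way to set up that split, as I read the last bullet (SERVER_IP is a placeholder; 11345 is Gazebo's default master port, and both machines must be able to reach it):

# On the machine that runs the simulation
gzserver worlds/empty.world

# On the workstation that shows the GUI
export GAZEBO_MASTER_URI=http://SERVER_IP:11345
gzclient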

@osrf-migration

Original comment by Nate Koenig (Bitbucket: Nathan Koenig).


  • changed state from "open" to "new"

@osrf-migration

Original comment by Nate Koenig (Bitbucket: Nathan Koenig).


  • set assignee_account_id to "557058:095b1e12-74ed-4e20-b44f-2f0745b616e0"
  • set assignee to "nkoenig (Bitbucket: nkoenig, GitHub: nkoenig)"

@osrf-migration

Original comment by Nate Koenig (Bitbucket: Nathan Koenig).


See pull request #2476

@osrf-migration

Original comment by Dirk Thomas (Bitbucket: Dirk Thomas, GitHub: dirk-thomas).


The referenced PR doesn't fix the problem for me.

@osrf-migration

Original comment by Василий Самарин (Bitbucket: Aiven92).


If you rotate the scene camera to look at the ground plane from below (along the -z axis), the CPU load from gzclient becomes smaller. This works in VirtualBox.

@osrf-migration

Original comment by Matias N. (Bitbucket: v01d).


I have the same problem: 100% CPU for gzclient and about 50% for gzserver. This is on an Intel i7 laptop with an integrated Intel HD 4000 (using the intel driver).
I'm on Elementary OS Loki, which is equivalent to Ubuntu Xenial apart from the different desktop shell. This is with Gazebo 5.

This also happens to me on my other computer (i5, Intel HD 3000).

Are there any workarounds or settings I can test? I don't know how to proceed.

@osrf-migration added the bug and 3.0 labels on Apr 19, 2020
@freckletonj

Can confirm: Ubuntu 20 on KVM + virt-manager (inside a Debian host), using Gazebo 11 on an Intel i7. An empty world still drains the CPU, even with the ODE physics turned off.
