
How much GPU memory does the neural network need? #10

Open
xiboli opened this issue Feb 13, 2022 · 4 comments

Comments


xiboli commented Feb 13, 2022

Hello Luca,

I tried to run the program on my own computer, which has an RTX 2060 Super GPU. However, its 8 GB of memory seem not to be enough: I always get an OOM error when allocating a tensor with shape [64, 464, 3, 12].
How much GPU memory is needed?

Best regards
Xibo

@xiboli changed the title from "How many GPU memory?" to "How much GPU memory does the neural network need?" Feb 13, 2022
Luca96 (Owner) commented Feb 14, 2022

Hi,
I don't know exactly how much GPU memory is needed but, unfortunately, it's plenty.

What you can try is to edit main.py. In particular, you could set timesteps=256 and/or batch_size=32 in the learning.stage_sX(...) function calls. You can also reduce the input size, here, or even the number of parameters of the agent's neural net: see here and here too. A sketch of such an edit follows.
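For illustration, a minimal sketch of such an edit, assuming a hypothetical stage call in main.py (the stage name and any other arguments are placeholders, not taken from the repository; only timesteps and batch_size reflect the suggestion above):

# Hypothetical excerpt of main.py: "stage_s1" and its other
# arguments are placeholders for whatever stage you actually run.
learning.stage_s1(
    timesteps=256,  # shorter rollouts -> smaller activation tensors
    batch_size=32,  # smaller batches -> less GPU memory per training step
)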

Hope it helps

xiboli (Author) commented Feb 15, 2022

Thank you so much, I will try that. With nvidia-smi I saw that CARLA uses about 4 GB, so the network must need more than 4 GB.

nguyenvantruong1905 commented

@xiboli hi,

my CARLA uses 4 GB. I tried:

import os

# select the first GPU, ordered by PCI bus ID
os.environ["CUDA_DEVICE_ORDER"] = "PCI_BUS_ID"
os.environ["CUDA_VISIBLE_DEVICES"] = "0"

but the network is not using the GPU. Please help me.

Luca96 (Owner) commented Jun 10, 2023

@nguyenvantruong1905, check the following:

  • Do you set the environment variables before loading TensorFlow, i.e. before import tensorflow as tf?
  • Do you have a CUDA-capable GPU? If so, have you correctly installed CUDA, cuDNN, etc.? Check the console for error messages.
  • Does the issue happen only with CARLA, or also with TensorFlow alone? To check the latter, run a script with only TensorFlow and make sure a tensor gets allocated on the GPU (see the sketch after this list).
  • Have you tried forcing GPU execution? You can do that by wrapping the model code in a with tf.device('gpu'): scope.
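A minimal standalone check, assuming TensorFlow 2.x, that combines the points above (the environment variables are set before importing tensorflow, and GPU placement is forced explicitly):

import os

# must happen before importing tensorflow
os.environ["CUDA_DEVICE_ORDER"] = "PCI_BUS_ID"
os.environ["CUDA_VISIBLE_DEVICES"] = "0"

import tensorflow as tf

# an empty list here means TensorFlow does not see the GPU
# (usually a driver / CUDA / cuDNN installation problem)
print("Visible GPUs:", tf.config.list_physical_devices("GPU"))

# force GPU placement; this raises an error if no GPU device is available
with tf.device("/GPU:0"):
    x = tf.random.uniform((1024, 1024))
    y = tf.matmul(x, tf.transpose(x))

print("Tensor device:", y.device)  # should end with device:GPU:0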
