I ran into a problem: setting batch size 16 on a single 2080 Ti (11 GB) causes OOM, so I have to use batch size 8 on each of two GPUs instead. But the paper says the experiment was trained on a single 1080 Ti, so I'm confused about what I'm missing. Also, while training, my GPU memory is full but GPU utilization stays at 0%. Could you share some advice? Thank you very much.
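One common workaround when a desired batch size does not fit in memory is gradient accumulation: process several smaller micro-batches and apply a single optimizer step on their averaged gradient, which reproduces the large-batch update on one GPU. Below is a minimal, framework-free sketch of the idea on a toy scalar model (the model, data, and learning rate here are illustrative, not from the repository; in PyTorch you would instead call `loss.backward()` per micro-batch and `optimizer.step()` once).

```python
# Sketch of gradient accumulation: simulate an effective batch of 16
# by accumulating gradients over two micro-batches of 8, then taking
# one update. Toy model: y = w * x with mean-squared-error loss.

def grad(w, batch):
    # gradient of MSE for y = w * x, averaged over (x, y) pairs
    return sum(2 * (w * x - y) * x for x, y in batch) / len(batch)

def train_step(w, micro_batches, lr=0.01):
    # average the per-micro-batch gradients, then apply a single step
    g = sum(grad(w, mb) for mb in micro_batches) / len(micro_batches)
    return w - lr * g

full_batch = [(x, 3.0 * x) for x in range(1, 17)]  # 16 samples
micro = [full_batch[:8], full_batch[8:]]           # two micro-batches of 8

w_accum = train_step(1.0, micro)         # accumulated over micro-batches
w_full = train_step(1.0, [full_batch])   # single full batch of 16
print(w_accum == w_full)                 # → True: identical update
```

As for full memory with 0% utilization, that pattern often points to the GPU idling while waiting on data loading or a host-side bottleneck rather than to the model itself, which is worth profiling separately.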