```
Traceback (most recent call last):
torch.OutOfMemoryError: CUDA out of memory. Tried to allocate 20.00 MiB. GPU 0 has a total capacity of 6.00 GiB of which 0 bytes is free. Of the allocated memory 10.57 GiB is allocated by PyTorch, and 640.80 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True to avoid fragmentation. See documentation for Memory Management (https://pytorch.org/docs/stable/notes/cuda.html#environment-variables)
```

This is my config:
Could anyone point me to which parameters need to be adjusted?
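For context, as I understand it, the allocator hint from the traceback would be applied like this before the training script touches CUDA (a minimal sketch, not specific to my setup):

```python
import os

# Must be set before the first CUDA allocation, so set it before importing torch.
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "expandable_segments:True"

import torch  # noqa: E402 -- imported after the environment variable is set

# ... build the model and start training as usual ...
```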
Replies: 1 comment
Batch size, and if you have some long audios, `max_audio_len`.
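A minimal sketch of those two knobs, assuming a config that exposes `batch_size` and `max_audio_len` in samples; the exact field names and the sample rate are assumptions based on typical TTS trainers, since your actual config isn't shown:

```python
# Sketch only: field names and values are placeholders, not taken from the original config.
SAMPLE_RATE = 22050  # assumed sample rate of the dataset

memory_friendly_overrides = {
    "batch_size": 8,                    # e.g. halve 16 -> 8 and keep lowering until the OOM stops
    "eval_batch_size": 8,
    "max_audio_len": 10 * SAMPLE_RATE,  # skip utterances longer than ~10 s (value is in samples)
}

print(memory_friendly_overrides)
```

On a 6 GiB GPU I'd lower the batch size first, since memory use grows roughly linearly with it; the audio-length cap mainly protects against a few very long clips blowing up a single batch.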