-
I ran into an error: torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 32.00 MiB (GPU 0; 2.00 GiB total capacity; 1.71 GiB already allocated; 0 bytes free; 1.72 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
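A minimal sketch of one way to try the max_split_size_mb hint from that error message, assuming a plain PyTorch script; the 128 MiB value is only an illustrative choice, and on a 2 GiB card this can only reduce fragmentation, not add memory:

```python
import os

# Must be set before the first CUDA allocation (ideally before importing torch).
# max_split_size_mb:128 is an example value, not a recommendation from this thread.
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:128"

import torch

# Optionally release cached-but-unused blocks held by PyTorch's caching allocator;
# this does not free tensors that are still referenced.
if torch.cuda.is_available():
    torch.cuda.empty_cache()
```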
Replies: 8 comments 1 reply
-
You need at least 6 GB of VRAM for it to work well.
-
Mine is an Nvidia GeForce GTX 1660 SUPER (6 GB / Colorful).
-
An M40 is only 450 and has 24 GB of VRAM, worth picking up; my P40 was 1088.
-
A 2060S runs the 6B model perfectly; the RWKV model basically won't run with 16 GB of RAM.
-
P40 24 GB, great; I use a 1060 6 GB for gaming.
-
48 GB of RAM with a 3060 6 GB can just barely run it.
-
A6000 * 2