
How much GPU memory is needed? #56

Open
focus1024-wind opened this issue Jan 15, 2023 · 3 comments

Comments

@focus1024-wind

When I train on my host, I get an out-of-memory error. My host has 4 × 24 GB NVIDIA 3090 GPUs. Your file says you trained on 8 A100 GPUs, but the A100 comes in two memory configurations, so I would like to know the exact configuration of your GPUs, or how much memory I need. Looking forward to your help, thanks.

@jqtangust

I also used a 3090, but it still runs out of memory.

@Shengqi77

I used 8 × 3090, but it also runs out of memory.

@yyhtbs-yye

It needs about 32 GB per GPU, so 24 GB 3090 cards will run out of memory at the default settings.
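Since the thread never pins down where the memory goes, a back-of-envelope estimate can help decide whether a given card has any chance. The sketch below is a generic rule of thumb for fp32 training with Adam, not tied to this repository's actual model; the parameter count in the example is hypothetical.

```python
def estimate_training_memory_gb(num_params: int, bytes_per_param: int = 4) -> float:
    """Rough lower bound on per-GPU memory for fp32 training with Adam.

    Counts four copies of the parameters: the weights (1x), the gradients
    (1x), and Adam's two moment buffers (2x). Activations depend on batch
    size and architecture and are excluded, so real usage is higher.
    """
    copies = 4  # weights + gradients + two Adam moment buffers
    return num_params * bytes_per_param * copies / 1024**3

# Hypothetical 1-billion-parameter model:
print(round(estimate_training_memory_gb(1_000_000_000), 1))  # 14.9 (GB, before activations)
```

With activations on top of this baseline, it is plausible for total usage to exceed 24 GB well before it exceeds 32 GB, which matches the reports above; common workarounds are a smaller batch size, gradient checkpointing, or mixed-precision training.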
