When running stage 2 inference, a CUDA out of memory error is raised! I am using a 4090, which has about 24 GB of CUDA memory.
The page says that inference requires only 10 GB of VRAM, but in actual testing, the second_stage requires 28 GB of VRAM 😫
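In case it helps with debugging, here is a minimal sketch (assuming the stage-2 model is an ordinary PyTorch `nn.Module`; the project's actual loading API may differ) that runs inference under `torch.no_grad()` with fp16 autocast and prints the peak VRAM used. Measuring the peak this way shows how far the run is from fitting in 24 GB, and fp16 plus no-grad alone often cuts the footprint substantially.

```python
import torch

def run_stage2_inference(model: torch.nn.Module, batch: torch.Tensor) -> torch.Tensor:
    """Hypothetical helper: run one inference pass and report peak CUDA memory."""
    model.eval().cuda()
    torch.cuda.reset_peak_memory_stats()
    # no_grad avoids storing activations for backward; fp16 autocast halves most tensors
    with torch.no_grad(), torch.autocast(device_type="cuda", dtype=torch.float16):
        out = model(batch.cuda())
    peak_gb = torch.cuda.max_memory_allocated() / 1024**3
    print(f"peak CUDA memory: {peak_gb:.1f} GB")
    return out
```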
How much better is the 2nd stage compared to the 1st?
When training the first stage, a single RTX 4090 also raised a CUDA out of memory error.