🐛 Bug

Hello. I'm trying to train the Polyglot 12.8B model using h2o llm-studio, but I'm getting an OOM error.
The GPUs I am using are four Nvidia A5000 24 GB cards, and the parameters I used are in the attached zip file (logs_polyglot-ko-12.8b_12.8b_koalpaca-v1.1a-row21155_v3_bs1_gas8.zip).
Is there any solution?
There are multiple ways to bring down memory consumption. Mainly:
There is also an ongoing effort to split the model weights across multiple GPUs during training in this PR: #288
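To see why this setup runs out of memory, a rough back-of-the-envelope estimate helps. The sketch below is illustrative only (it ignores activation memory, which also grows with batch size and sequence length, and assumes standard mixed-precision AdamW bookkeeping): a full fine-tune of a 12.8B-parameter model needs far more than the 4 × 24 GB available, while a LoRA-style approach with a frozen int8 backbone fits comfortably.

```python
# Back-of-the-envelope GPU-memory estimate for fine-tuning a 12.8B model.
# Illustrative only: ignores activations, CUDA context, and fragmentation.

def estimate_gib(n_params, weight_bytes, grad_bytes, optim_bytes):
    """Total GiB for per-parameter weight, gradient, and optimizer state."""
    total_bytes = n_params * (weight_bytes + grad_bytes + optim_bytes)
    return total_bytes / 1024**3

N = 12_800_000_000  # Polyglot 12.8B

# Full fine-tuning with AdamW in mixed precision:
# fp16 weights (2 B) + fp16 grads (2 B) + fp32 master copy, m, and v (12 B).
full_ft = estimate_gib(N, 2, 2, 12)

# LoRA with a frozen int8-quantized backbone: only the 1-byte backbone
# weights dominate; the adapter's weights/grads/optimizer states are tiny.
lora_int8 = estimate_gib(N, 1, 0, 0)

print(f"full fine-tuning          : ~{full_ft:.0f} GiB")
print(f"LoRA + int8 backbone      : ~{lora_int8:.0f} GiB")
print(f"available (4x A5000 24GB) :  {4 * 24} GiB")
```

With these assumptions, full fine-tuning needs roughly 190 GiB just for weights, gradients, and optimizer state, versus 96 GiB available, so an OOM is expected; parameter-efficient settings bring the static footprint down to about 12 GiB before activations.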