🐛 Bug

The batch size finder sets an unrealistically high batch size if all samples of the training dataset fit into one batch:

```
...
Batch size 8388608 succeeded, trying batch size 16777216
Batch size 16777216 succeeded, trying batch size 33554432
Batch size 33554432 succeeded, trying batch size 67108864
Finished batch size finder, will continue with full run using batch size 67108864
```

To Reproduce

Steps to reproduce the behavior: run the `Trainer` with `auto_scale_batch_size=True` (one needs to remove the hardcoded batch size and set `self.batch_size`).

Expected behavior

The batch size search space should not be larger than the number of available training samples.
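The reproduction steps above might look roughly like the following, assuming the PyTorch Lightning 1.x-era API where `auto_scale_batch_size` and `trainer.tune()` were available (the model and dataset here are illustrative, not from the original report):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl

class TinyModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(1, 1)
        self.batch_size = 2  # the tuner reads and overwrites this attribute

    def train_dataloader(self):
        # Deliberately tiny dataset: the whole thing fits in one batch,
        # so doubling the batch size never runs out of memory.
        x = torch.randn(64, 1)
        return DataLoader(TensorDataset(x, x), batch_size=self.batch_size)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.mse_loss(self.layer(x), y)

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.1)

trainer = pl.Trainer(auto_scale_batch_size=True, max_epochs=1)
trainer.tune(TinyModel())  # the finder keeps doubling far past len(dataset)
```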
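One way to enforce the expected behavior is to cap the doubling search at the dataset size. A minimal sketch (the function and its callback are hypothetical, not Lightning's actual implementation):

```python
def find_batch_size(try_fit, dataset_size, init_bs=2, max_trials=25):
    """Double the batch size until failure, but never beyond the dataset size.

    `try_fit(bs)` is a hypothetical callback that returns True if a few
    training steps succeed at batch size `bs` (False on out-of-memory).
    """
    bs = init_bs
    for _ in range(max_trials):
        # Cap the search space at the number of available training samples:
        # any larger batch size is meaningless and only inflates the result.
        if bs >= dataset_size:
            return dataset_size
        if not try_fit(bs):
            return bs // 2  # last batch size that succeeded
        bs *= 2
    return bs
```

With this cap, a dataset of 1,000 samples can never yield a batch size of 67,108,864; the search stops at 1,000.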
Hi! Thanks for your contribution, great first issue!