
Investigate batch_size usage #373

Open
rg936672 opened this issue Aug 19, 2024 · 0 comments

rg936672 commented Aug 19, 2024

Investigate the strange behaviour of batch_size when its value is close to the number of training points. With batch_size=num_training_points, training is faster than "unbatched" training, even though the full dataset is used. With batch_size just below num_training_points, the model seems to make wildly inaccurate predictions.

Strangely, setting batch_size=500 makes the test take only 1.85s, and setting batch_size=499 makes the test fail as 55% of outputs are outside the confidence interval!
- @rg936672 in #354 (comment)
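
One thing worth checking as part of this investigation is how the data end up partitioned when batch_size sits just below num_training_points. The sketch below is purely illustrative: batch_indices is a hypothetical helper, not part of the codebase, and it assumes sequential batching where a final partial batch is retained rather than dropped. If the real batching logic behaves like this, then with the 500-point figure quoted above, batch_size=499 would produce a trailing batch containing a single point, which could plausibly destabilise training, whereas batch_size=500 is effectively full-batch.

```python
import numpy as np

def batch_indices(num_points: int, batch_size: int) -> list[np.ndarray]:
    """Hypothetical helper: split point indices into sequential mini-batches,
    keeping any final partial batch rather than dropping it."""
    indices = np.arange(num_points)
    return [indices[start:start + batch_size] for start in range(0, num_points, batch_size)]

# Using the figures from the quoted comment (500 training points):
print([len(b) for b in batch_indices(500, 500)])  # [500]    -- one "batch" covering the whole dataset
print([len(b) for b in batch_indices(500, 499)])  # [499, 1] -- a trailing batch with a single point
```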

rg936672 mentioned this issue Aug 19, 2024