Investigate the strange behaviour of batch_size with values close to the number of training points. With batch_size=num_training_points, training is faster than "unbatched" training, despite the fact that the full dataset is used. With batch_size just barely less than num_training_points, the model seems to make wildly inaccurate predictions.
Strangely, setting batch_size=500 makes the test take only 1.85s, and setting batch_size=499 makes the test fail as 55% of outputs are outside the confidence interval! - @rg936672 in #354 (comment)
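For reference, below is a minimal sketch of the kind of coverage check the failing test appears to perform: count how many true targets fall outside the central 95% Gaussian predictive interval. The function name, the 1.96 z-value, and the synthetic data are assumptions for illustration, not taken from the actual test suite.

```python
import numpy as np

def fraction_outside_interval(y_true, pred_mean, pred_std, z=1.96):
    """Return the fraction of true targets outside the central 95%
    Gaussian predictive interval (mean +/- z * std)."""
    lower = pred_mean - z * pred_std
    upper = pred_mean + z * pred_std
    outside = (y_true < lower) | (y_true > upper)
    return outside.mean()

# With well-calibrated predictions roughly 5% of targets should fall outside;
# the report above sees ~55% outside when batch_size=499.
rng = np.random.default_rng(0)
y_true = rng.normal(size=500)
pred_mean = np.zeros(500)
pred_std = np.ones(500)
print(fraction_outside_interval(y_true, pred_mean, pred_std))  # ~0.05 expected
```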