fixed bug where tuner would not tune lr if also tuning batch_size (#4688)

* fixed bug where tuner would not tune lr if also tuning batch_size
* added a '+1' when computing the smoothed loss, preserving the smoothed-loss behavior from before the bug fix
* pep8 fix
* added changelog entry

Co-authored-by: chaton <thomas@grid.ai>
Co-authored-by: Carlos Mocholi <carlossmocholi@gmail.com>
Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
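The '+1' mentioned above fits the usual pattern of bias-corrected exponential smoothing, where the correction term is indexed from 1 so the first smoothed value equals the first raw loss. A minimal sketch of that pattern (illustrative only; the function name and signature are not the actual Lightning tuner internals):

```python
def smoothed_losses(losses, beta=0.98):
    """Return bias-corrected exponentially smoothed losses.

    Illustrative sketch of the smoothing pattern, not Lightning's code.
    """
    avg_loss = 0.0
    out = []
    for step, loss in enumerate(losses):
        # Running exponential average of the loss.
        avg_loss = beta * avg_loss + (1 - beta) * loss
        # The '+1' makes the bias-correction term 1-indexed, so the
        # first smoothed value equals the first raw loss.
        out.append(avg_loss / (1 - beta ** (step + 1)))
    return out
```

With this indexing, a constant loss stream smooths to the same constant from the very first step, which is the behavior the fix preserves.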