
Commit

akorgor committed Jun 29, 2024
1 parent 87fe3ad commit cd30f1d
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion models/weight_optimizer.h
@@ -71,7 +71,7 @@ assumes :math:`\epsilon = \hat{\epsilon} \sqrt{ 1 - \beta_2^t }` to be constant.
 When `optimize_each_step` is set to `True`, the weights are optimized at every
 time step. If set to `False`, optimization occurs once per spike, resulting in a
 significant speed-up. For gradient descent, both settings yield the same
-results under exact arithmetic; however, small numerical differences may be
+results under exact arithmetic; however, small numerical differences may be
 observed due to floating point precision. For the Adam optimizer, only setting
 `optimize_each_step` to `True` precisely implements the algorithm as described
 in [2]_. The impact of this setting on learning performance may vary depending
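The distinction the docstring describes can be sketched in a few lines (a hypothetical Python illustration, not NEST code; all function names are made up): plain gradient descent is linear in the gradients, so applying each gradient at every time step or applying their sum once per spike moves the weight to the same place, whereas Adam's running moment estimates make the result depend on the update path.

```python
def gd_every_step(w, grads, eta=0.1):
    # Plain gradient descent, one update per time step.
    for g in grads:
        w -= eta * g
    return w

def gd_accumulated(w, grads, eta=0.1):
    # Plain gradient descent, one update with the accumulated gradient.
    # Identical to gd_every_step up to floating point rounding.
    return w - eta * sum(grads)

def adam_every_step(w, grads, eta=0.1, b1=0.9, b2=0.999, eps=1e-8):
    # Adam as described in Kingma & Ba, one update per time step.
    m = v = 0.0
    for t, g in enumerate(grads, start=1):
        m = b1 * m + (1 - b1) * g
        v = b2 * v + (1 - b2) * g * g
        m_hat = m / (1 - b1 ** t)
        v_hat = v / (1 - b2 ** t)
        w -= eta * m_hat / (v_hat ** 0.5 + eps)
    return w

def adam_accumulated(w, grads, eta=0.1, b1=0.9, b2=0.999, eps=1e-8):
    # A single Adam step (t = 1) with the summed gradient; after bias
    # correction the moment estimates are just g and g**2, so this does
    # NOT reproduce the per-step trajectory above.
    g = sum(grads)
    return w - eta * g / ((g * g) ** 0.5 + eps)

grads = [0.3, -0.1, 0.2]
print(gd_every_step(1.0, grads), gd_accumulated(1.0, grads))      # equal
print(adam_every_step(1.0, grads), adam_accumulated(1.0, grads))  # differ
```

This is why the docstring states that only `optimize_each_step = True` precisely implements Adam, while for gradient descent the choice is purely a speed/precision trade-off.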
