Fix eval accumulation when accelerate > 0.20.3 (huggingface#26060)
As mentioned in: huggingface#25641

Eval accumulation never happens with `accelerate > 0.20.3`, so this change ensures that `sync_gradients` is ignored when the installed accelerate version is newer than 0.20.3.
sam-scale authored and blbadger committed Nov 8, 2023
1 parent ce547c9 commit 279791a
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion src/transformers/trainer.py
@@ -3254,7 +3254,7 @@ def evaluation_loop(
             if (
                 args.eval_accumulation_steps is not None
                 and (step + 1) % args.eval_accumulation_steps == 0
-                and self.accelerator.sync_gradients
+                and (self.accelerator.sync_gradients or version.parse(accelerate_version) > version.parse("0.20.3"))
             ):
                 if losses_host is not None:
                     losses = nested_numpify(losses_host)
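The condition above can be sketched in isolation. This is a minimal, standalone illustration of the version gate the commit adds, not the actual Trainer code: `should_accumulate` is a hypothetical helper, and `sync_gradients` / `accelerate_version` are stand-ins for the values Trainer reads from the Accelerator and the installed accelerate package (the real code uses `packaging.version`, which is assumed here as well).

```python
# Hypothetical helper mirroring the patched condition in evaluation_loop:
# offload accumulated tensors every `eval_accumulation_steps` steps, and on
# accelerate releases newer than 0.20.3 do so regardless of sync_gradients.
from packaging import version


def should_accumulate(step, eval_accumulation_steps, sync_gradients, accelerate_version):
    return (
        eval_accumulation_steps is not None
        and (step + 1) % eval_accumulation_steps == 0
        # The commit's change: sync_gradients is bypassed on newer accelerate.
        and (sync_gradients or version.parse(accelerate_version) > version.parse("0.20.3"))
    )


# With accelerate 0.21.0, accumulation fires even though sync_gradients is False:
print(should_accumulate(7, 4, False, "0.21.0"))  # True
# With accelerate 0.20.3, the old behavior is kept and sync_gradients still gates it:
print(should_accumulate(7, 4, False, "0.20.3"))  # False
```

Before this fix, newer accelerate versions left `sync_gradients` False during evaluation, so the offload branch never ran and host memory grew unboundedly; the extra version check restores the periodic offload.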
