
Fix eval accumulation when accelerate > 0.20.3 #26060

Merged · 1 commit · Sep 14, 2023

Conversation

sam-scale (Contributor)
As mentioned in #25641, eval accumulation never happens with `accelerate > 0.20.3`. This change ensures that `sync_gradients` is ignored when the installed accelerate version is newer than 0.20.3.
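The gating logic described above can be sketched roughly as follows. The function name, parameter list, and the hand-rolled version parsing are illustrative assumptions for this sketch, not the exact Trainer internals in transformers:

```python
# Minimal sketch of the version-gated eval-accumulation check (illustrative
# names, not the actual transformers Trainer code).

def should_offload_to_cpu(step, eval_accumulation_steps, sync_gradients, accelerate_version):
    """Return True when accumulated eval tensors should be moved to CPU.

    With accelerate > 0.20.3, sync_gradients is no longer updated the same
    way during evaluation, so gating the CPU offload on it means the offload
    never fires; newer versions therefore bypass that check entirely.
    """
    def parse(v):
        # Naive "major.minor.patch" parser, sufficient for this sketch.
        return tuple(int(part) for part in v.split("."))

    if eval_accumulation_steps is None:
        return False
    if (step + 1) % eval_accumulation_steps != 0:
        return False
    # Only consult sync_gradients on accelerate <= 0.20.3.
    return sync_gradients or parse(accelerate_version) > parse("0.20.3")
```

With an older accelerate, the offload still depends on `sync_gradients`; with a newer one, the accumulation-step boundary alone triggers it.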

What does this PR do?

Fixes # (issue)

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guidelines, in particular the Pull Request section?
  • Was this discussed/approved via a GitHub issue or the forum? Please add a link to it if that's the case.
  • Did you make sure to update the documentation with your changes? See the documentation guidelines and the tips on formatting docstrings.
  • Did you write any new necessary tests?

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint.

@amyeroberts (Collaborator) left a comment

Thanks for this fix!

Looks OK to me cc @muellerzr for a second opinion before merging

@sam-scale (Contributor, Author)

@muellerzr any thoughts :)

@muellerzr (Contributor) left a comment

Solution looks good to me as well. Thanks for the fix!

@amyeroberts amyeroberts merged commit 6d49b9d into huggingface:main Sep 14, 2023
@sam-scale (Contributor, Author)

Thanks! @amyeroberts @muellerzr is there any way to track when this fix will get added to a release?

parambharat pushed a commit to parambharat/transformers that referenced this pull request Sep 26, 2023
blbadger pushed a commit to blbadger/transformers that referenced this pull request Nov 8, 2023
EduardoPach pushed a commit to EduardoPach/transformers that referenced this pull request Nov 18, 2023