
Fix shared parameter gradient scaling #283

Merged · 4 commits into master from fix_shared_parameters · Sep 26, 2021
Conversation

@gasteigerjo (Contributor) commented on Sep 22, 2021:

Gradient scaling for shared parameters didn't work previously because it used trainer.model instead of trainer.model.module. Also, GemNet-T passed modules instead of parameters. This PR fixes both issues.
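For context, a minimal sketch of what the corrected scaling step could look like. The function name scale_shared_grads and the (parameter, share_count) pair layout of shared_parameters are assumptions based on the description above, not the repository's exact code.

```python
import torch

def scale_shared_grads(model: torch.nn.Module) -> None:
    """Divide each shared parameter's gradient by its share count."""
    # Under DistributedDataParallel the user-defined model lives at
    # model.module; using trainer.model directly was the first bug.
    module = model.module if hasattr(model, "module") else model
    # Assumed layout: shared_parameters is a list of
    # (torch.nn.Parameter, share_count) pairs. GemNet-T previously
    # listed whole modules here instead of parameters (the second bug).
    for param, factor in getattr(module, "shared_parameters", []):
        if param.grad is not None:
            param.grad.div_(factor)
```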

Additionally, we now show a warning if a shared parameter has no gradient, which usually means it doesn't point to an actual PyTorch parameter.
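A sketch of the warn-once check, under the same assumptions; the flag name and message wording are illustrative, not the PR's exact code.

```python
import logging

_warned_shared_param_no_grad = False  # module-level flag: warn only once

def check_shared_param_grad(param, name="shared parameter"):
    """Warn once if a supposedly shared parameter received no gradient."""
    global _warned_shared_param_no_grad
    if param.grad is None and not _warned_shared_param_no_grad:
        logging.warning(
            "%s has no gradient; it probably does not point to an "
            "actual torch.nn.Parameter.", name,
        )
        _warned_shared_param_no_grad = True
```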

@gasteigerjo changed the title from Fix shared parameter grad scaling to Fix shared parameter gradient scaling on Sep 22, 2021
@abhshkdz (Collaborator) left a comment:

LGTM

@abhshkdz merged commit 3e6a819 into master on Sep 26, 2021
@abhshkdz deleted the fix_shared_parameters branch on Sep 26, 2021 at 00:05
levineds pushed a commit that referenced this pull request on Jul 11, 2024:
* Fix shared parameter grad scaling

* Fix shared_parameters in GemNet

* Warn once if a shared parameter has no gradient

Co-authored-by: Abhishek Das <das.abhshk@gmail.com>
beomseok-kang pushed a commit to beomseok-kang/fairchem that referenced this pull request on Jan 27, 2025:
* Fix shared parameter grad scaling

* Fix shared_parameters in GemNet

* Warn once if a shared parameter has no gradient

Co-authored-by: Abhishek Das <das.abhshk@gmail.com>
Former-commit-id: 60bcc6178ea31c183f8533acc34919866a7511c7