Support optional flag to clamp gradient in 'backward' to prevent crash #48

Open

wants to merge 6 commits into master
Conversation

daniel347x (Contributor)

This PR is identical to #45, but it uses a dedicated branch from my forked repository so that I can continue making other changes in the fork.

This commit addresses an intermittent but deadly crash bug that destroys my training runs: a very occasional infinite gradient in the 'backward' function.

In this commit, functionality remains unchanged by default.

However, an optional flag has been added that allows clamping the gradient in the 'backward' function. The flag is either an int giving the maximum value, or a sequence giving the min/max values.

An optional third value in the passed sequence is interpreted as a Boolean that indicates whether to print a warning to the console whenever an infinite gradient is clamped. The default is False.

Only the PyTorch backend is supported.
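For illustration, here is a minimal sketch of the clamping behaviour described above, assuming a helper applied to a gradient tensor inside the backward pass. The helper name `_clamp_gradient`, its exact handling of non-finite values, and its placement are assumptions, not the PR's actual code; only the `backward_clamp_gradient_mag` argument name comes from this PR.

```python
import torch

def _clamp_gradient(grad, clamp_spec):
    # Sketch of the clamping behaviour described above (not the PR's code).
    # clamp_spec may be:
    #   * None             -> no clamping (default behaviour unchanged)
    #   * an int or float  -> maximum value; the minimum is taken as its negative
    #   * a sequence       -> (min, max) or (min, max, warn)
    if clamp_spec is None:
        return grad

    if isinstance(clamp_spec, (int, float)):
        lo, hi, warn = -float(clamp_spec), float(clamp_spec), False
    else:
        lo, hi = float(clamp_spec[0]), float(clamp_spec[1])
        warn = bool(clamp_spec[2]) if len(clamp_spec) > 2 else False

    if warn and not torch.isfinite(grad).all():
        print("Warning: non-finite gradient clamped in 'backward'")

    # Replace NaN/inf entries first, then clamp the remaining values.
    grad = torch.nan_to_num(grad, nan=0.0, posinf=hi, neginf=lo)
    return grad.clamp(lo, hi)
```

Inside an autograd backward this might be applied as, for example, `grad_points = _clamp_gradient(grad_points, backward_clamp_gradient_mag)`; where exactly it would be called in pydiffvg's backward is an assumption here.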
…tions

Also, add a dummy argument in 'backward' to match the new backward_clamp_gradient_mag argument.
daniel347x changed the title from "Pydiffvg" to "Support optional flag to clamp gradient in 'backward' to prevent crash" on Nov 2, 2022
BachiLi (Owner) commented on Jan 9, 2023

Let me ponder a bit what is the best way to do this. If I can't figure out a better/easier way I'll merge this. : >
