
Update scnn for newer pytorch versions #11

Open · wants to merge 2 commits into base: master
Conversation

stephandooper
Contributor

Later versions of PyTorch changed the torch.nn.grad backend; as a result, the .cpp files are no longer needed.
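For context, the pure-Python replacement for hand-written convolution-gradient kernels is the torch.nn.grad module (conv2d_input / conv2d_weight). A minimal sketch, assuming the .cpp files implemented convolution gradients; the tensor names below are illustrative, not from this repo, and the results are cross-checked against autograd:

```python
import torch
from torch.nn.grad import conv2d_input, conv2d_weight

# Illustrative tensors (shapes are assumptions, not taken from the PR).
x = torch.randn(1, 3, 8, 8, requires_grad=True)
w = torch.randn(4, 3, 3, 3, requires_grad=True)
y = torch.nn.functional.conv2d(x, w)
g = torch.ones_like(y)

# Explicit input/weight gradients, as the torch.nn.grad backend computes them.
gx = conv2d_input(x.shape, w, g)
gw = conv2d_weight(x, w.shape, g)

# Cross-check against autograd.
y.backward(g)
assert torch.allclose(gx, x.grad, atol=1e-4)
assert torch.allclose(gw, w.grad, atol=1e-4)
```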

Question 1: can this code be replaced with the original decorators, @custom_fwd and @custom_bwd?

# torch.cuda.amp (with custom_fwd/custom_bwd) is available from PyTorch 1.6 onward.
# A substring test like `'1.6' in torch.__version__` breaks on every later release,
# so compare the parsed (major, minor) version instead.
_TORCH_VERSION = tuple(int(v) for v in torch.__version__.split('+')[0].split('.')[:2])

if _TORCH_VERSION >= (1, 6):
    def forward_amp_decorator(func):
        return torch.cuda.amp.custom_fwd(func)  # type:ignore
    def backward_amp_decorator(func):
        return torch.cuda.amp.custom_bwd(func)  # type:ignore
    from torch.cuda.amp import autocast
else:
    # AMP unavailable: no-op decorators keep the call sites unchanged.
    def forward_amp_decorator(func):
        return func
    def backward_amp_decorator(func):
        return func
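
If dropping support for PyTorch < 1.6 is acceptable, the indirection can be collapsed: a wrapper that only forwards to another decorator is interchangeable with applying that decorator directly. A minimal pure-Python sketch of that equivalence (the `custom_fwd` below is a stand-in for torch.cuda.amp.custom_fwd, so the sketch runs without torch):

```python
def custom_fwd(func):  # stand-in for torch.cuda.amp.custom_fwd
    def wrapped(*args, **kwargs):
        return func(*args, **kwargs)
    wrapped.amp_decorated = True  # marker so we can observe the decoration
    return wrapped

def forward_amp_decorator(func):
    # The indirection used in the PR: it only forwards to custom_fwd.
    return custom_fwd(func)

@forward_amp_decorator
def f(x):
    return x + 1

@custom_fwd  # applying the decorator directly is equivalent
def g(x):
    return x + 1

assert f(1) == g(1) == 2
assert f.amp_decorated and g.amp_decorated
```

The indirection only earns its keep while versions without torch.cuda.amp must still be supported; once the minimum PyTorch version is 1.6, the direct decorators are simpler.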

Question 2: gather_gradient now seems to be dead code. It ends at self.gradients, which stays an empty dictionary and is never referenced anywhere else. Can it be removed?

self.gradients = {}
