Fix require_grad typo (#1771)
Summary:
Fix require_grad typos (should be requires_grad).
Before the fix, the code raises no error, but it also doesn't do what it's supposed to do: assigning to the misspelled attribute has no effect on the tensor's gradient tracking.
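
For illustration only (not part of the original commit message), a minimal PyTorch sketch of why the misspelling is silent: assigning to require_grad just attaches an unused Python attribute to the tensor object, whereas requires_grad is the flag autograd actually reads.

    import torch

    t = torch.ones(3, requires_grad=True)

    # Typo: this merely attaches a new Python attribute named "require_grad"
    # to the tensor object. Autograd never looks at it, and no error is raised.
    t.require_grad = False
    print(t.requires_grad)  # True -- gradient tracking is still enabled

    # Correct spelling: this is the flag autograd honors (allowed on leaf tensors).
    t.requires_grad = False
    print(t.requires_grad)  # False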

Fixed with TorchFix https://github.com/pytorch/test-infra/tree/main/tools/torchfix

Upstream PR: codertimo/BERT-pytorch#104

Pull Request resolved: #1771

Reviewed By: xuzhao9

Differential Revision: D47531187

Pulled By: kit1980

fbshipit-source-id: 738b1866cc5cd3fedfa878cc40827236717f6f27
kit1980 authored and facebook-github-bot committed Jul 19, 2023
1 parent e8c1cf0 commit ae8f06f
Showing 1 changed file with 2 additions and 1 deletion.
@@ -10,7 +10,8 @@ def __init__(self, d_model, max_len=512):
 
         # Compute the positional encodings once in log space.
         pe = torch.zeros(max_len, d_model).float()
-        pe.require_grad = False
+        # Changed from upstream, see https://github.com/codertimo/BERT-pytorch/pull/104
+        pe.requires_grad = False
 
         position = torch.arange(0, max_len).float().unsqueeze(1)
         div_term = (torch.arange(0, d_model, 2).float() * -(math.log(10000.0) / d_model)).exp()
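
For context, a minimal sketch of how the corrected line sits in a BERT-style positional embedding module. Only the lines shown in the diff above come from the changed file; the class name, the sin/cos fill, the register_buffer call, and the forward method follow the usual sinusoidal positional-encoding pattern and are assumptions here, not quotes of the actual source.

    import math

    import torch
    import torch.nn as nn

    class PositionalEmbedding(nn.Module):  # class name assumed for illustration
        def __init__(self, d_model, max_len=512):
            super().__init__()

            # Compute the positional encodings once in log space.
            pe = torch.zeros(max_len, d_model).float()
            # Changed from upstream, see https://github.com/codertimo/BERT-pytorch/pull/104
            pe.requires_grad = False

            position = torch.arange(0, max_len).float().unsqueeze(1)
            div_term = (torch.arange(0, d_model, 2).float() * -(math.log(10000.0) / d_model)).exp()

            # Standard sinusoidal fill (assumed; outside the diff context, d_model even).
            pe[:, 0::2] = torch.sin(position * div_term)
            pe[:, 1::2] = torch.cos(position * div_term)

            # Register as a buffer so it moves with the module but is never trained.
            self.register_buffer("pe", pe.unsqueeze(0))

        def forward(self, x):
            # Return encodings for the first x.size(1) positions (assumed interface).
            return self.pe[:, :x.size(1)]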
