fix mask shape
stsouko committed Aug 23, 2024
1 parent 2869d98 commit b0454c3
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion torchtune/modules/transformer.py
@@ -442,7 +442,7 @@ def forward(
 )
 # shape: [1, input_pos_len, m_s]
 # in most cases input_pos_len should be 1
-mask = self.causal_mask[None, input_pos]
+mask = self.causal_mask[None, input_pos, :input_pos.max().item() + 1]

 hidden = []
 for i, layer in enumerate(self.layers):
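For context, a minimal sketch (not part of the commit) of what the new indexing does, assuming causal_mask is a lower-triangular boolean buffer of shape [max_seq_len, max_seq_len] and input_pos holds the token positions of the current decode step. The added slice trims the mask's last dimension from max_seq_len down to the current sequence length, so the mask width follows the positions seen so far; the values below are illustrative only.

    import torch

    # Illustrative stand-in for the module's causal mask buffer (assumption):
    # a lower-triangular boolean matrix of shape [max_seq_len, max_seq_len].
    max_seq_len = 8
    causal_mask = torch.tril(torch.ones(max_seq_len, max_seq_len, dtype=torch.bool))

    # During incremental decoding, input_pos usually holds a single position.
    input_pos = torch.tensor([5])

    # Old indexing: rows are selected by input_pos, all max_seq_len columns kept.
    old_mask = causal_mask[None, input_pos]
    print(old_mask.shape)  # torch.Size([1, 1, 8])

    # Patched indexing: columns are sliced up to the largest position,
    # giving shape [1, input_pos_len, input_pos.max() + 1].
    new_mask = causal_mask[None, input_pos, :input_pos.max().item() + 1]
    print(new_mask.shape)  # torch.Size([1, 1, 6])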
