
Commit

Update seq2seq.md
According to the definition of class EncoderDecoder in the new version of d2l [1], its forward pass returns only a single value, so the `_` should be removed; otherwise the call raises: ValueError: too many values to unpack (expected 2).
[1] https://github.com/d2l-ai/d2l-en/blob/master/d2l/torch.py#L951
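
For reference, a paraphrased sketch of the cited definition (the class in d2l-en actually subclasses d2l.Classifier; nn.Module is used here only to keep the sketch self-contained, so minor details may differ from the exact source). The point is that forward returns a single tensor rather than an (output, state) pair:

import torch
from torch import nn

class EncoderDecoder(nn.Module):
    """Paraphrase of the new d2l EncoderDecoder; see [1] for the exact code."""
    def __init__(self, encoder, decoder):
        super().__init__()
        self.encoder = encoder
        self.decoder = decoder

    def forward(self, enc_X, dec_X, *args):
        enc_all_outputs = self.encoder(enc_X, *args)
        dec_state = self.decoder.init_state(enc_all_outputs, *args)
        # Only the decoder output is returned, not (output, state),
        # so `Y_hat, _ = net(...)` raises the ValueError mentioned above.
        return self.decoder(dec_X, dec_state)[0]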
XihWang authored Dec 29, 2024
1 parent e6b18cc commit 98d12c1
Showing 1 changed file with 4 additions and 4 deletions.
8 changes: 4 additions & 4 deletions chapter_recurrent-modern/seq2seq.md
@@ -724,7 +724,7 @@ def train_seq2seq(net, data_iter, lr, num_epochs, tgt_vocab, device):
                            ctx=device).reshape(-1, 1)
             dec_input = np.concatenate([bos, Y[:, :-1]], 1)  # Teacher forcing
             with autograd.record():
-                Y_hat, _ = net(X, dec_input, X_valid_len)
+                Y_hat = net(X, dec_input, X_valid_len)
                 l = loss(Y_hat, Y, Y_valid_len)
             l.backward()
             d2l.grad_clipping(net, 1)
@@ -766,7 +766,7 @@ def train_seq2seq(net, data_iter, lr, num_epochs, tgt_vocab, device):
             bos = torch.tensor([tgt_vocab['<bos>']] * Y.shape[0],
                                device=device).reshape(-1, 1)
             dec_input = torch.cat([bos, Y[:, :-1]], 1)  # Teacher forcing
-            Y_hat, _ = net(X, dec_input, X_valid_len)
+            Y_hat = net(X, dec_input, X_valid_len)
             l = loss(Y_hat, Y, Y_valid_len)
             l.sum().backward()  # Backpropagation on the scalar of the loss
             d2l.grad_clipping(net, 1)
@@ -797,7 +797,7 @@ def train_seq2seq(net, data_iter, lr, num_epochs, tgt_vocab, device):
                              shape=(-1, 1))
             dec_input = tf.concat([bos, Y[:, :-1]], 1)  # Teacher forcing
             with tf.GradientTape() as tape:
-                Y_hat, _ = net(X, dec_input, X_valid_len, training=True)
+                Y_hat = net(X, dec_input, X_valid_len, training=True)
                 l = MaskedSoftmaxCELoss(Y_valid_len)(Y, Y_hat)
             gradients = tape.gradient(l, net.trainable_variables)
             gradients = d2l.grad_clipping(gradients, 1)
@@ -1120,4 +1120,4 @@ for eng, fra in zip(engs, fras):
 
 :begin_tab:`paddle`
 [Discussions](https://discuss.d2l.ai/t/11838)
-:end_tab:
+:end_tab:

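A minimal, self-contained illustration (not part of the commit; the toy Net below is hypothetical) of why the old unpacking fails once forward returns a single tensor:

import torch
from torch import nn

class Net(nn.Module):
    """Toy stand-in: forward returns one tensor, like the new EncoderDecoder."""
    def forward(self, X, dec_input, X_valid_len=None):
        # Shape (batch_size, num_steps, vocab_size); values are dummies.
        return torch.zeros(X.shape[0], dec_input.shape[1], 10)

net = Net()
X = torch.zeros(4, 7, dtype=torch.long)
dec_input = torch.zeros(4, 7, dtype=torch.long)

Y_hat = net(X, dec_input)       # works: a single tensor is returned
# Y_hat, _ = net(X, dec_input)  # ValueError: too many values to unpack
#                               # (expected 2) -- iterating the returned
#                               # tensor yields batch_size (= 4) items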