In DeepRNN's `forward()`, two nested for loops are used:

```python
for layer in range(self.num_layers):
    for step in range(seq_len):
        ...
```

That is, each layer processes the entire sequence before passing its output down to the next layer.

However, the PyTorch documentation shows exactly the opposite ordering: https://pytorch.org/docs/stable/generated/torch.nn.RNN.html

```python
for step in range(seq_len):
    for layer in range(self.num_layers):
        ...
```

That is, each sequence item passes through all the layers before the next item is processed.

The results may turn out the same, but the meanings are very different. Is there actually a problem with this code, which follows d2l?
Hi,
For a unidirectional multi-layer RNN, the two orderings should be equivalent:
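A quick numerical check supports this. The sketch below uses `torch.nn.RNNCell` as a stand-in for the book's per-layer cells (it is not the actual d2l code), and runs the same stack of cells in both loop orders:

```python
import torch

torch.manual_seed(0)
seq_len, batch, input_size, hidden_size, num_layers = 5, 2, 3, 4, 2

# One RNN cell per layer (hypothetical stand-in for DeepRNN's layers).
cells = [torch.nn.RNNCell(input_size if l == 0 else hidden_size, hidden_size)
         for l in range(num_layers)]

x = torch.randn(seq_len, batch, input_size)

# Ordering 1: layer-outer (the d2l-style implementation).
# Each layer consumes the whole sequence before the next layer runs.
inputs = list(x)
for layer in range(num_layers):
    h = torch.zeros(batch, hidden_size)
    outputs = []
    for step in range(seq_len):
        h = cells[layer](inputs[step], h)
        outputs.append(h)
    inputs = outputs
out_layer_outer = torch.stack(inputs)

# Ordering 2: step-outer (the ordering shown in the PyTorch docs).
# Each time step passes through all layers before the next step.
hs = [torch.zeros(batch, hidden_size) for _ in range(num_layers)]
outputs = []
for step in range(seq_len):
    inp = x[step]
    for layer in range(num_layers):
        hs[layer] = cells[layer](inp, hs[layer])
        inp = hs[layer]
    outputs.append(inp)
out_step_outer = torch.stack(outputs)

print(torch.allclose(out_layer_outer, out_step_outer))  # True
```

Both orderings respect the same data dependencies (hidden state `h_t^l` depends only on `h_{t-1}^l` and `h_t^{l-1}`), so they compute identical outputs.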
Also, note that for a bidirectional multi-layer RNN, only our loop ordering works.
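The reason is that the backward direction of each layer must see the complete output sequence of the layer below before it can start, which rules out the step-outer ordering. A minimal sketch of a layer-outer bidirectional stack (again using `torch.nn.RNNCell` as a hypothetical stand-in, not the actual d2l code):

```python
import torch

torch.manual_seed(0)
seq_len, batch, input_size, hidden_size, num_layers = 5, 2, 3, 4, 2

# Forward- and backward-direction cells; layers above the first consume
# the concatenation of both directions, hence 2 * hidden_size inputs.
fwd = [torch.nn.RNNCell(input_size if l == 0 else 2 * hidden_size, hidden_size)
       for l in range(num_layers)]
bwd = [torch.nn.RNNCell(input_size if l == 0 else 2 * hidden_size, hidden_size)
       for l in range(num_layers)]

x = torch.randn(seq_len, batch, input_size)

inputs = list(x)
for layer in range(num_layers):
    h_f = torch.zeros(batch, hidden_size)
    h_b = torch.zeros(batch, hidden_size)
    outs_f, outs_b = [], []
    for step in range(seq_len):            # forward pass over time
        h_f = fwd[layer](inputs[step], h_f)
        outs_f.append(h_f)
    for step in reversed(range(seq_len)):  # backward pass needs the FULL sequence
        h_b = bwd[layer](inputs[step], h_b)
        outs_b.append(h_b)
    outs_b.reverse()                       # realign backward outputs with time
    inputs = [torch.cat((f, b), dim=-1) for f, b in zip(outs_f, outs_b)]

out = torch.stack(inputs)
print(out.shape)  # torch.Size([5, 2, 8])
```

In the step-outer ordering, at time step `t` the layer below has not yet produced its outputs for steps `t+1, ..., seq_len-1`, so the backward direction simply has nothing to run on.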