Bug description
In sections 3.7 and 3.9, the cross-entropy loss is computed with torch.nn.CrossEntropyLoss(), which already averages over the batch. But when computing the loss for each epoch, the accumulated value is divided again by n, the total number of samples in the training set. Is that correct?

Version info
pytorch: 1.13
torchvision: 0.4.2
torchtext: none
...
You're right — it should be divided by the number of batches, because torch.nn.CrossEntropyLoss() has already averaged along the batch dimension.
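A minimal sketch of the point above, using a hypothetical tiny model and dataset (all names here are illustrative, not the book's actual training code): since `CrossEntropyLoss` with its default `reduction='mean'` returns a per-batch average, the epoch loss should be the accumulated batch means divided by the batch count, not by the sample count n.

```python
import torch
from torch import nn

# Hypothetical toy data and model, just to illustrate loss accumulation.
torch.manual_seed(0)
X = torch.randn(100, 4)
y = torch.randint(0, 3, (100,))
net = nn.Linear(4, 3)
loss = nn.CrossEntropyLoss()  # default reduction='mean': averages over the batch

data_iter = torch.utils.data.DataLoader(
    torch.utils.data.TensorDataset(X, y), batch_size=10)

loss_sum, n_batches = 0.0, 0
for Xb, yb in data_iter:
    l = loss(net(Xb), yb)  # already a per-sample average within this batch
    loss_sum += l.item()
    n_batches += 1

# Correct: divide by the number of batches. Dividing by n (here 100)
# would average twice and make the reported loss too small.
epoch_loss = loss_sum / n_batches
```

An equivalent alternative that keeps the divide-by-n convention is to undo the batch average first, i.e. accumulate `l.item() * Xb.shape[0]` and then divide the sum by n; the two agree exactly when all batches have the same size.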