I swapped the training corpus for one I prepared myself, roughly 100k-400k dialogue pairs. But after running 10,000 steps, the loss is still stuck between 4 and 5... Is there any way to fix this? Thanks.
I added 20k dialogue pairs and my loss is also stuck between 4 and 5, and the learning_rate has already decayed to 0.
Same problem here: the loss won't come down. Has anyone found a solution? @warmheartli
Increase the size of the LSTM cells and raise the dimensionality of the word embeddings. If memory allows, also increase batch_size appropriately.
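
As a rough sketch of what those changes could look like (this is not the repo's actual training script; the model structure, variable names, and concrete sizes below are all illustrative assumptions using a generic Keras-style language model):

```python
# Hypothetical sketch only: a generic Keras LSTM language model showing where
# the suggested hyperparameters would be raised. Sizes are example values.
import tensorflow as tf

VOCAB_SIZE = 20000      # assumed vocabulary size for illustration
EMBEDDING_DIM = 512     # raised word-embedding dimension (e.g. up from 128)
LSTM_UNITS = 1024       # larger LSTM cell size (e.g. up from 256)
BATCH_SIZE = 128        # larger batch size, if GPU/host memory allows

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(VOCAB_SIZE, EMBEDDING_DIM),
    tf.keras.layers.LSTM(LSTM_UNITS, return_sequences=True),
    tf.keras.layers.Dense(VOCAB_SIZE),  # logits over the vocabulary
])
model.compile(
    optimizer=tf.keras.optimizers.Adam(1e-3),
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
)
# Training would then batch the corpus with the larger BATCH_SIZE, e.g.:
# model.fit(dataset.batch(BATCH_SIZE), epochs=...)
```

Whether this actually moves the loss below 4-5 depends on the data; with a large corpus, an undersized model tends to plateau, which is why widening the LSTM and embeddings is the first thing to try.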