
Loss won't decrease when training on my own corpus #16

Open
shizhediao opened this issue Apr 20, 2018 · 3 comments

Comments

@shizhediao

I replaced the training corpus with one I prepared myself, roughly 100k–400k dialogue pairs.
But after running 10,000 steps, the loss is still stuck between 4 and 5...
Is there any way to fix this?
Thanks

@renhui2018

I added 20k dialogue pairs and the loss also sits between 4 and 5; the learning rate has already decayed to 0.
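A learning rate that has decayed all the way to 0 means the optimizer can no longer update the weights at all, so the loss plateaus regardless of the data. One common fix is to put a floor under the decay schedule. A minimal sketch, assuming a TensorFlow 1.x setup with exponential decay; the initial rate, decay constants, and floor value here are illustrative, not this repo's actual settings:

```python
import tensorflow as tf  # TF 1.x API

global_step = tf.train.get_or_create_global_step()

# Exponential decay, clipped from below so the rate never reaches exactly 0.
# 0.5, 1000, and 0.96 are placeholder values chosen for illustration.
learning_rate = tf.train.exponential_decay(
    learning_rate=0.5, global_step=global_step,
    decay_steps=1000, decay_rate=0.96, staircase=True)
learning_rate = tf.maximum(learning_rate, 1e-4)  # floor at 1e-4

optimizer = tf.train.GradientDescentOptimizer(learning_rate)
```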

@Yuzeyang

I have the same problem, the loss won't go down. Has anyone solved this? @warmheartli

@alige32

alige32 commented Sep 19, 2018

Increase the size of the LSTM cells and raise the word-embedding dimensionality. Also, if memory allows, increase batch_size somewhat.
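For reference, a minimal sketch of what those three changes might look like, assuming a TensorFlow 1.x seq2seq model; the sizes below and `vocab_size` are hypothetical placeholders, not this repo's actual variables or defaults:

```python
import tensorflow as tf  # TF 1.x API

vocab_size = 50000   # hypothetical vocabulary size
batch_size = 128     # raised from e.g. 64, if memory allows

# Larger LSTM hidden state (e.g. 512 -> 1024 units)
cell = tf.nn.rnn_cell.BasicLSTMCell(num_units=1024)

# Higher-dimensional word embeddings (e.g. 128 -> 300)
embedding = tf.get_variable("embedding", shape=[vocab_size, 300])
```

Note that a bigger model only helps if the bottleneck is capacity; with only tens of thousands of dialogue pairs, a larger network can also overfit, so it is worth watching validation loss as well as training loss.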
