
The training loss won't decrease for LeviHassner #35

Closed
haoyu09 opened this issue Jul 18, 2017 · 2 comments

Comments


haoyu09 commented Jul 18, 2017

Hi, I trained exactly as the README shows:
$ python train.py --train_dir /home/dpressel/dev/work/AgeGenderDeepLearning/Folds/tf/gen_test_fold_is_0 --max_steps 10000
But the loss doesn't decrease at all. Is there something wrong with the method?

Owner

dpressel commented Jul 18, 2017

The running instructions in the docs showed 10k steps for gender training. That is not necessarily the recommended number of steps, and since the learning rate was not specified in the example, it fell back to the program defaults.

Since I wrote the original docs, I have increased the default learning rate, added a learning rate decay schedule, and raised the default number of steps (these parameters worked well for age). I have updated the docs to explicitly lower the learning rate back with --eta 0.001 for gender training (as it was before); with that you should see the loss decreasing.
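As a concrete example, the README command from the report above with the learning rate explicitly lowered would look like this (the fold path is the one from the original report; your own path will differ):

```shell
# Gender training with the learning rate pinned back to 0.001,
# as the updated docs recommend.
python train.py \
  --train_dir /home/dpressel/dev/work/AgeGenderDeepLearning/Folds/tf/gen_test_fold_is_0 \
  --max_steps 10000 \
  --eta 0.001
```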

You can also try running with the defaults by removing --max_steps or increasing it. The learning rate will then follow a staircase decay schedule.
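To illustrate what staircase decay means: the learning rate is held constant within each interval and cut by a fixed factor at every boundary. A minimal sketch (the function name and the numeric values here are illustrative, not the repo's actual defaults):

```python
def staircase_lr(base_eta, decay_rate, decay_steps, global_step):
    """Staircase exponential decay: the rate is multiplied by
    decay_rate once every decay_steps steps, and is flat in between."""
    return base_eta * decay_rate ** (global_step // decay_steps)

# Illustrative values: start at 0.01, halve every 1000 steps.
for step in (0, 999, 1000, 2500):
    print(step, staircase_lr(0.01, 0.5, 1000, step))
```

The integer division is what produces the "stairs": steps 0-999 all use the base rate, step 1000 drops it to half, and so on.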

The program allows you to control all of the hyperparameters.

@dpressel
Owner

I have updated the docs to hopefully minimize confusion in the future.

Labels: None yet
Projects: None yet
Development: No branches or pull requests

2 participants