
Epoch size #92

Open
halameri opened this issue Aug 13, 2020 · 3 comments
Comments

@halameri

How can I modify the iteration or epoch size to reduce the training time?

@spagliarini

Hi!
Do you mean how to save more frequently?

If so, you can change the parameter
train_save_secs

If you change this parameter and you still want your loss to be saved and plotted at ~the same time (when you visualize it on TensorBoard), you also need to change the parameter
train_summary_secs
so that train_save_secs = train_summary_secs.
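For example, if training is launched from the command line, the two parameters can be kept in sync like this (a sketch only: the script name, subcommand, and directory paths here are assumptions about this repository's training entry point, not taken from the thread):

```shell
# Save a checkpoint and write a TensorBoard summary every 300 s.
# Keeping the two values equal means the loss curve is updated at
# the same rate the model is checkpointed.
python train.py train ./train_dir \
  --data_dir ./data \
  --train_save_secs 300 \
  --train_summary_secs 300
```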

@halameri (Author)

Thank you for your response @spagliarini.
I don't want to save more frequently; I want to reduce the number of epochs or iterations, because the training process takes a long time (200k iterations).

@spagliarini

I see. So far, the only way I have found to stop the training is manual: just stop it before it reaches 200k iterations, and make sure the generator is performing well enough by checking the preview. In #63 it was mentioned that good results were already obtained after 100k iterations, or even earlier.

Actually, this is the first time I have worked with TensorFlow. From what I found in the TensorFlow documentation, it should be possible to stop the training session automatically by fixing a threshold on the loss, but I couldn't find a good one. Are you more familiar with TensorFlow? Is there a way to stop the training based on the number of iterations?
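For what it's worth, the two stopping criteria discussed above (a maximum iteration count and a loss threshold) can be combined in an ordinary training loop. The sketch below is framework-agnostic and not this repository's code; the names `step_fn`, `max_iters`, and `loss_threshold` are made up for illustration. In TensorFlow 1.x, stopping at a fixed step count can also be done by passing a `tf.train.StopAtStepHook` to a `tf.train.MonitoredTrainingSession`.

```python
def train(step_fn, max_iters=200_000, loss_threshold=None):
    """Run step_fn once per iteration until max_iters is reached,
    or until the loss drops below loss_threshold (if one is given).
    Returns the number of iterations actually run."""
    for it in range(1, max_iters + 1):
        loss = step_fn(it)
        if loss_threshold is not None and loss < loss_threshold:
            # Early stop: the loss criterion was met first.
            return it
    return max_iters

# Toy stand-in for one training step: the "loss" decays as 1/iteration.
iters_run = train(lambda it: 1.0 / it, max_iters=100_000, loss_threshold=1e-3)
print(iters_run)  # → 1001 (first iteration with 1/it < 1e-3)
```

Stopping manually, as described above, is equivalent to picking `max_iters` by eye from the preview; wiring it into the loop just makes the run reproducible.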
