
logs are lost #2

Open
goldentimecoolk opened this issue Aug 5, 2019 · 7 comments

Comments

@goldentimecoolk

Hi, thanks for sharing your work. However, the logs recording the training runs behind the results reported in the paper appear to be missing. Could you please share them again?

@prlz77 (Owner) commented Aug 5, 2019

Hi, thanks for pointing out the issue. I am looking for the logs so I can upload them again; if I have to rerun the code, it may take more time.

@goldentimecoolk (Author)

Yes, I agree. It would be better to find the logs. I'm looking forward to it. :)

@prlz77 (Owner) commented Aug 5, 2019

I have uploaded the logs on CIFAR; now I am uploading the logs on ImageNet.

@prlz77 (Owner) commented Aug 5, 2019

Done.

@goldentimecoolk (Author) commented Aug 6, 2019

Hi, thanks for your quick action. I trained with the original code and the default settings in the .sh file, but only got 79% top-1. In the paper this appears to be more than 81%. I then trained the attention version with the default hyperparameters and got a top-1 accuracy of no more than 80%. Have you encountered this situation?

@prlz77 (Owner) commented Aug 6, 2019

Hi, the results I report are the median of five runs, so if you repeat the run several times you may reach the reported accuracy. It is also possible that the baselines were run with a different number of GPUs/batch size; the .sh script was written to run with minimal resources.
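To illustrate the "median of five runs" protocol (the accuracy values below are purely hypothetical, just to show how a single run can land well below the reported median):

```python
import statistics

# Hypothetical top-1 accuracies from five independent runs (illustrative only).
runs = [79.0, 80.6, 81.2, 81.4, 81.9]

# The reported figure would be the middle run, not the worst or the mean.
print(statistics.median(runs))  # -> 81.2
```

Under this protocol a single run at 79% is compatible with a reported median above 81%.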

@prlz77 (Owner) commented Aug 6, 2019

Ah, you may also need to add 0.3 dropout. I think the baseline in the .sh script is the no-dropout baseline.
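For reference, a minimal sketch of what "0.3 dropout" means here, written as plain-Python inverted dropout (the repo presumably uses its framework's own dropout layer; this function and its names are illustrative, not the repo's API):

```python
import random

def dropout(x, p=0.3, train=True, rng=None):
    """Inverted dropout: during training, zero each unit with probability p
    and rescale survivors by 1/(1-p) so the expected activation is unchanged;
    at evaluation time it is the identity."""
    if not train or p == 0.0:
        return list(x)
    rng = rng or random.Random()
    keep = 1.0 - p
    return [v / keep if rng.random() < keep else 0.0 for v in x]

x = [1.0, 2.0, 3.0, 4.0]
assert dropout(x, p=0.3, train=False) == x  # no-op at evaluation time
```

In a typical setup the 0.3 would be passed as a hyperparameter to the model's dropout layers rather than implemented by hand like this.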
