BatchNorm not training #196
I understand the customization of fixed modules defined in the configuration files and its purpose. But I do not understand why the nn.BatchNorm1d modules are always in evaluation mode, especially in the train() function in softgroup.py. They always use the default values 1 and 0 for the scale and shift. Is there a reason why they are not being trained by default? If I remove the eval restriction on these modules, what kind of changes could it cause in the segmentation?
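For reference, if I understand the mechanism correctly, the pattern in question looks roughly like the sketch below: a custom train() override that puts BatchNorm layers back into eval mode so their running statistics stop updating, and optionally freezes their affine scale/shift. The class name and layer sizes here are hypothetical illustrations, not the actual SoftGroup code.

```python
import torch.nn as nn

class FrozenBNModel(nn.Module):
    """Hypothetical module illustrating the frozen-BatchNorm pattern."""

    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Linear(32, 64),
            nn.BatchNorm1d(64),
            nn.ReLU(),
        )

    def train(self, mode=True):
        # Switch the whole module to training mode first.
        super().train(mode)
        # Then force every BatchNorm layer back to eval mode: its running
        # mean/var are no longer updated, so normalization uses the stored
        # statistics even while the rest of the network trains.
        for m in self.modules():
            if isinstance(m, nn.BatchNorm1d):
                m.eval()
                # Optionally also freeze the learnable scale (weight) and
                # shift (bias), keeping them at their defaults of 1 and 0.
                if m.affine:
                    m.weight.requires_grad_(False)
                    m.bias.requires_grad_(False)
        return self
```

Removing the eval restriction would mean the running statistics (and the affine parameters, if they are unfrozen) start updating again during training; I am asking what effect that would have on the segmentation results.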