
BatchNorm not training #196

Open
LeonardoHoltz opened this issue Nov 24, 2023 · 0 comments

LeonardoHoltz commented Nov 24, 2023

I understand how fixed modules are defined in the configuration files and what they are for. What I do not understand is why the nn.BatchNorm1d modules are always kept in evaluation mode, especially in the train() function in softgroup.py. Since they are never updated, they always use the default values of 1 and 0 for the scale and shift. Is there a reason they are not trained by default? If I remove the eval() restriction on these modules, what kind of changes could that cause to the segmentation?
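For reference, here is a minimal sketch of the pattern I am asking about, assuming the model overrides train() to force the modules listed in the config back into eval mode. The class, module names, and fixed_modules attribute here are illustrative, not the repository's exact code:

```python
import torch
import torch.nn as nn

class Model(nn.Module):
    """Illustrative stand-in for the real SoftGroup model."""

    def __init__(self, fixed_modules=('backbone',)):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Linear(8, 16), nn.BatchNorm1d(16), nn.ReLU())
        self.head = nn.Linear(16, 4)
        # Module names taken from the configuration file (assumed attribute).
        self.fixed_modules = fixed_modules

    def train(self, mode=True):
        # The standard train() flips every submodule to training mode...
        super().train(mode)
        # ...then the fixed modules are forced back to eval mode, so their
        # BatchNorm1d layers normalize with running statistics and stop
        # updating running_mean / running_var.
        for name in self.fixed_modules:
            getattr(self, name).eval()
        return self

model = Model()
model.train()
print(model.backbone[1].training)  # False: BatchNorm stays in eval mode
print(model.head.training)         # True: the rest of the model trains
```

Note that eval() by itself only changes which statistics BatchNorm normalizes with; the affine scale and shift would still receive gradient updates unless requires_grad is also set to False on their parameters, which I assume is what keeps them at their initial values of 1 and 0.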
