Issues: vacancy/Synchronized-BatchNorm-PyTorch
- #30: test gap between training and test (question), opened Aug 19, 2019 by ZhiweiYan-96
- #10: After sync batch norm is applied, more GPU memory is consumed (enhancement), opened Sep 28, 2018 by shachoi
- #3: Weird things happen when applied to FPN (good first issue), opened Apr 27, 2018 by ShuLiu1993
- #1: Does this support torch.nn.parallel.DistributedDataParallel? (enhancement), opened Apr 10, 2018 by acgtyrant