Problem of exporting FP16 SyncBN model. #13976
Comments
Hey, this is the MXNet Label Bot.
Hi @Fiend1213, can you provide the rest of the info from the issue template to help us debug?

Environment info (Required)
Package used (Python/R/Scala/Julia):

Build info (Required if built from source)
Compiler (gcc/clang/mingw/visual studio):
Thanks @Fiend1213, I tried commenting out this line:
and then changing this line:
to this:
and the script completed. Obviously this means it's not running in float16, but at least it succeeds. I'm building from source with your build flags now, and will rerun and debug to see what the issue is.
@mxnet-label-bot add [gluon]
Hi @Fiend1213, the current implementation of Synchronized Batch Normalization (SyncBN) does not support FP16 training. Since your use case is inference only, and SyncBN behaves exactly the same as BN during inference, you can simply replace SyncBN with regular nn.BatchNorm to resolve your problem. Please reply if this resolves your issue.
Description
This is a problem when exporting an FP16 model that contains SyncBN.
Error Message: