Hello
Why can't we merge (with gen_merged_model.py) all BatchNorm layers during training? All BN layers are constant:

batch_norm_param { use_global_stats: true }

and so are some Scale layers in the first two blocks (up to the conv3 block).

Additional questions:
-- Is training better with lr_mult: 0.1 on the feature-extraction layers?
-- Why are the BN layers after the FC layers constant too (use_global_stats: true)? Maybe it would be better to let BN adapt to the batch (256)?
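For context on the merge question: folding a frozen BatchNorm (use_global_stats: true) plus Scale pair into the preceding convolution is a standard transform, and gen_merged_model.py presumably does something along these lines. A minimal numpy sketch (the function name, argument layout, and eps default are my own assumptions, not taken from the script):

```python
import numpy as np

def fold_bn_scale(W, b, mean, var, gamma, beta, eps=1e-5):
    """Fold a frozen BatchNorm + Scale pair into the preceding conv.

    With use_global_stats: true, BN applies y = (x - mean) / sqrt(var + eps)
    with fixed statistics, and Scale applies y = gamma * y + beta, so both
    are affine per output channel and can be absorbed into the conv weights.

    W:           conv weights, shape (out_ch, in_ch, kH, kW)
    b:           conv bias, shape (out_ch,)
    mean, var:   BN running statistics, shape (out_ch,)
    gamma, beta: Scale layer parameters, shape (out_ch,)
    """
    s = gamma / np.sqrt(var + eps)        # per-channel multiplier
    W_folded = W * s[:, None, None, None] # rescale each output filter
    b_folded = (b - mean) * s + beta      # shift and rescale the bias
    return W_folded, b_folded
```

This only works because the BN statistics are frozen; if BN recomputed batch statistics during training, the transform would change on every iteration and the layers could not be merged into static weights.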