normalization modification #963

Merged
zsdonghao merged 4 commits into tensorlayer:master from yd-yin:bug_fix_instance_norm on May 11, 2019

Conversation

@yd-yin (Contributor) commented on May 11, 2019

Checklist

  • I've tested that my changes are compatible with the latest version of TensorFlow.
  • I've read the Contribution Guidelines
  • I've updated the documentation if necessary.

Motivation and Context

BatchNorm:

(line 262) mean, var = tf.nn.moments(inputs, self.axes)

When calculating the mean and variance of inputs, keepdims defaults to False, so tf.nn.moments returns one-dimensional tensors. In tf.nn.batch_normalization these are then broadcast against the last dimension of inputs, which is not correct for the channels-first layout.
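
For illustration only, a minimal sketch of the shape issue, assuming TF 2.x-style tf.nn.moments with a keepdims argument (the shapes are made up; this is not TensorLayer's actual code):

import tensorflow as tf

x = tf.random.normal([8, 16, 32, 32])  # channels-first: [batch, channel, height, width]
axes = [0, 2, 3]                       # reduce over the batch and spatial axes

# With the default keepdims=False the statistics have shape [16]; in
# tf.nn.batch_normalization they would be broadcast against the LAST axis
# of x, which is wrong (here even a shape error) for channels-first inputs.
mean, var = tf.nn.moments(x, axes)

# With keepdims=True they have shape [1, 16, 1, 1] and broadcast along the
# channel axis as intended.
mean, var = tf.nn.moments(x, axes, keepdims=True)
y = tf.nn.batch_normalization(x, mean, var, offset=None, scale=None, variance_epsilon=1e-5)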

InstanceNorm:

The previous InstanceNorm class only supported 2-D inputs (InstanceNorm2d) and static mode.
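
As an illustration only (not the new TensorLayer API), a minimal sketch of what instance normalization computes: the mean and variance are taken per sample and per channel over the spatial axes, so one function covers 1-D, 2-D and 3-D inputs in either data format:

import tensorflow as tf

def instance_norm(x, data_format='channels_last', eps=1e-5):
    # x: [N, D1, ..., Dk, C] for channels_last, or [N, C, D1, ..., Dk] for channels_first
    ndim = len(x.shape)
    if data_format == 'channels_last':
        axes = list(range(1, ndim - 1))  # spatial axes, e.g. [1, 2] for 2-D images
    else:
        axes = list(range(2, ndim))      # spatial axes for channels_first
    mean, var = tf.nn.moments(x, axes, keepdims=True)
    return (x - mean) * tf.math.rsqrt(var + eps)

# The same function handles 1-D, 2-D and 3-D inputs:
y1 = instance_norm(tf.random.normal([4, 100, 16]))                                      # 1-D, channels_last
y2 = instance_norm(tf.random.normal([4, 32, 32, 16]))                                   # 2-D, channels_last
y3 = instance_norm(tf.random.normal([4, 16, 8, 32, 32]), data_format='channels_first')  # 3-D, channels_first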

Description

  • Bug fix for BatchNorm.
  • A new version of InstanceNorm.
  • Added the corresponding 1d/2d/3d variants.

yd-yin added 3 commits May 10, 2019 22:46
fix bugs of BatchNorm
rewrite InstanceNorm
Update docs of normalization
update docstrings of InstanceNorm
@zsdonghao (Member) commented:
remember to change the changelog.md

@zsdonghao (Member) commented:
also, remember to correct the Python formatting:

yapf -i xxx.py
# or for a folder
yapf -i --recursive folder

@zsdonghao zsdonghao merged commit 4d6cb5a into tensorlayer:master May 11, 2019
@yd-yin yd-yin deleted the bug_fix_instance_norm branch May 11, 2019 09:03