Conversation
Could you fix lint? The errors can be obtained by running …
    Output shape:
        Same shape as input.

    This implementation is based on paper:

fix format. see dropout
python/mxnet/gluon/nn/conv_layers.py (outdated):
    Parameters
    ----------
    pad_width int: the size of the padding. If is int, uses the same

fix format

if not int what else can it be?
python/mxnet/gluon/nn/conv_layers.py (outdated):
    >>> input = mx.nd.random_normal(shape=(16, 3, 224, 224))
    >>> output = m(input)
    """
    def __init__(self, pad_width=0, **kwargs):

pad_width -> padding
python/mxnet/gluon/nn/conv_layers.py (outdated):
    pad_width int: the size of the padding. If is int, uses the same
        padding in all boundaries.

    Shape:

fix format
python/mxnet/gluon/nn/conv_layers.py (outdated):
    Examples
    --------
    >>> m = nn.ReflectionPad(3)
    >>> input = mx.nd.random_normal(shape=(16, 3, 224, 224))

mx.nd.random.normal
python/mxnet/gluon/nn/conv_layers.py (outdated):
    Examples
    --------
    >>> m = nn.ReflectionPad(3)

m -> layer
ReflectionPad -> ReflectionPad2D
python/mxnet/gluon/nn/conv_layers.py (outdated):
    self._pad_width = pad_width

    def forward(self, x):
        return F.pad(x, mode='reflect', pad_width=self._pad_width)

Shouldn't pad_width be a tuple? Please add tests.
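For context on the tuple question: `mx.nd.pad` takes `pad_width` as a tuple of `2*N` `(before, after)` values, one pair per input axis, and the pairs for the batch and channel axes must be zero. A quick sketch of what that means for a 4D NCHW tensor:

```python
import mxnet as mx

x = mx.nd.random.normal(shape=(1, 3, 8, 8))
# (before, after) pairs for the N, C, H, W axes; N and C must stay 0.
y = mx.nd.pad(x, mode='reflect', pad_width=(0, 0, 0, 0, 3, 3, 3, 3))
print(y.shape)  # (1, 3, 14, 14)
```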
@szha The original author seems to have disappeared. Would you like to take over?

Actually @zhanghang1989 will join us soon 😄

@piiswrong @szha Hi guys, sorry for the delay. I just made some changes.

See the other ops for doc format. Also please add test cases.

Thanks @piiswrong! I made some changes to the doc.
Is there no unit test?

I will add one unit test soon at unittest/test_gluon.py. Thanks!
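A minimal shape check along those lines could look like the following sketch (written against the final `ReflectionPad2D` name; the test actually added to the file mentioned above may differ):

```python
import mxnet as mx
from mxnet.gluon import nn

def test_reflectionpad2d():
    layer = nn.ReflectionPad2D(3)          # pad H and W by 3 on each side
    layer.initialize()
    x = mx.nd.random.normal(shape=(16, 3, 224, 224))
    out = layer(x)
    assert out.shape == (16, 3, 230, 230)  # 224 + 3 + 3 on each spatial axis
```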
You need to add the InstanceNorm class in __all__, here:
https://github.com/apache/incubator-mxnet/pull/7938/files#diff-06285baf97b6456dd23bdb99f6e239b6R23
Please fix lint.
python/mxnet/gluon/nn/conv_layers.py (outdated):
    self._padding = padding

    def hybrid_forward(self, F, x):
        return F.pad(x, mode='reflect', padding=self._padding)

This isn't tested yet. Also, it looks like a very straightforward one-liner. Should we just ask users to use a hybrid lambda for this?
This simple API makes it easier for users. Reflection padding is popular in low-level vision, such as GANs, super resolution and style transfer. I think it makes sense to provide an API for it. What's your thought?
http://pytorch.org/docs/0.3.0/nn.html?highlight=padding#reflectionpad2d
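For reference, the hybrid-lambda alternative being discussed would look roughly like this (using `nn.HybridLambda`, which exists in later Gluon releases; the pad widths and shapes here are illustrative):

```python
import mxnet as mx
from mxnet.gluon import nn

# One-liner reflection padding without a dedicated layer class.
pad = nn.HybridLambda(
    lambda F, x: F.pad(x, mode='reflect',
                       pad_width=(0, 0, 0, 0, 3, 3, 3, 3)))
y = pad(mx.nd.random.normal(shape=(1, 3, 8, 8)))
print(y.shape)  # (1, 3, 14, 14)
```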
I think we should have this.
OK
python/mxnet/gluon/nn/conv_layers.py (outdated):
    Parameters
    ----------
    `padding` is a tuple of integer padding widths for each axis of the format

Please fix the documentation and add tests.
python/mxnet/gluon/nn/conv_layers.py (outdated):
""" | ||
def __init__(self, padding=0, **kwargs): | ||
super(ReflectionPad2D, self).__init__(**kwargs) | ||
self._padding = padding |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Are you sure this works correctly when padding is a number instead of a tuple?
Just updated the integer type check and fixed the docs.
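The integer check presumably expands a scalar `padding` into the `(before, after)` pairs that `F.pad` expects. A hypothetical helper (the name and exact form are mine; the merged code may differ):

```python
import numbers

def _int_to_pad_width(padding):
    """Expand an int into 2*N (before, after) values for NCHW input,
    padding only the two spatial axes."""
    if isinstance(padding, numbers.Integral):
        return (0, 0, 0, 0, padding, padding, padding, padding)
    return tuple(padding)
```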
Add the classes to docs/api/python/gluon/nn.md so that the doc is available.
    super(InstanceNorm, self).__init__(**kwargs)
    self._kwargs = {'eps': epsilon}
    if in_channels != 0:
        self.in_channels = in_channels

This doesn't seem needed in forward. The reference in __repr__ can be changed to use the shape from weight instead.
    def __repr__(self):
        s = '{name}({content}'
        if hasattr(self, 'in_channels'):
            s += ', in_channels={0}'.format(self.in_channels)

load size from weight instead.
python/mxnet/gluon/nn/conv_layers.py (outdated):
    padding: int or a tuple of int
        An integer padding width for height and weight or a tuple of
        integers padding widths for each axis of the format ``(before_1, after_1, ... ,
        before_N, after_N)``. The `padding` should be of length ``2*N`` where ``N``

The document describes something general but the class is specific to 2D. Maybe describe the argument that's specific to the 2D case here.

If passed a tuple, it works for different dimensions. Should we remove this?
Since the class name says 2D, it should support 2D. You can do something similar to the conv blocks, with a common abstract class for the forward logic, and the actual 1D/2D/3D classes for specific implementation.

Agree. Just made the changes. Will refactor when adding 1D/3D.
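For illustration, the factoring suggested above (mirroring how the conv blocks share a common base class) could look like the following sketch; the `_ReflectionPad` name is hypothetical:

```python
from mxnet.gluon import HybridBlock

class _ReflectionPad(HybridBlock):
    """Shared forward logic; subclasses fix the dimensionality."""
    def __init__(self, pad_width, **kwargs):
        super(_ReflectionPad, self).__init__(**kwargs)
        self._pad_width = pad_width

    def hybrid_forward(self, F, x):
        return F.pad(x, mode='reflect', pad_width=self._pad_width)

class ReflectionPad2D(_ReflectionPad):
    """Reflection padding over the two trailing (spatial) axes."""
    def __init__(self, padding=0, **kwargs):
        if isinstance(padding, int):
            padding = (0, 0, 0, 0, padding, padding, padding, padding)
        super(ReflectionPad2D, self).__init__(padding, **kwargs)
```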
@@ -555,8 +553,9 @@ def hybrid_forward(self, F, x, gamma, beta):

    def __repr__(self):
        s = '{name}({content}'
        in_channels = self.gamma.shape[0]
        if hasattr(self, 'in_channels'):
remove condition
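Putting the two review points together, the resulting method would presumably read the channel count from the weight unconditionally, roughly:

```python
def __repr__(self):
    # InstanceNorm's gamma parameter has shape (in_channels,), so the
    # channel count can always be recovered from it.
    s = '{name}({content}, in_channels={in_channels})'
    return s.format(name=self.__class__.__name__,
                    content=', '.join('='.join([k, v.__repr__()])
                                      for k, v in self._kwargs.items()),
                    in_channels=self.gamma.shape[0])
```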
@zhanghang1989 @szha ReflectionPad2D's input/output shape doc has the wrong format.
* instance norm and reflection padding * r prefix * indent and space * fix docs * change docs * spacing * typo * hybrid forward * spcaing * add test for instance norm * fix typo * add to __all__ * rm white space * integer value * add test * make line short * rm white space * add docs ref * fix docs * RFpad2D docs * read shape from weight * rm condition
Re-create the PR for instance norm and reflection padding.