Add switch order layer for FCN model #2788
Conversation
1. Add switch function for switching image dimensions order.
2. Add CpuMatrix::backwardSoftmax function.
3. Add pixel softmax layer, python wrapper and grad_test.
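For orientation, here is a minimal NumPy sketch (not the PR's code) of what these pieces compute together: switch the dimension order from NCHW to NHWC, then apply softmax per pixel over the channel axis.

```python
import numpy as np

def pixel_softmax_reference(x_nchw):
    # Switch dimension order: NCHW -> NHWC, as the new switch function does.
    x_nhwc = np.transpose(x_nchw, (0, 2, 3, 1))
    # Numerically stable softmax over the channel (last) axis, per pixel.
    e = np.exp(x_nhwc - x_nhwc.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

x = np.random.randn(2, 3, 8, 8)          # batch=2, channels=3, 8x8 image
y = pixel_softmax_reference(x)
assert np.allclose(y.sum(axis=-1), 1.0)  # channel probabilities per pixel
```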
paddle/function/SwitchOp.cpp
Outdated
* how many zeros to add before and after the input in channel
* dimension. And the heightStart and heightEnd indicate padding
* in height dimension. The widthStart and widthEnd indicate the
* padding in width dimension.
The above comments are not correct for NCHW2NHWCFunc.
Thx.
paddle/function/SwitchOp.cpp
Outdated
size_t inW = inputs[0].shape()[3];
typename Tensor<real, Device>::Vector vec(outputs[0].shape().getElements(),
                                          outputs[0].data<real>());
vec.zero();
There is no need to set outputs to zero, since the assignment (=) is used in NCHW2NHWC.
Thx.
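To illustrate the reviewer's point with a NumPy analogy (not the C++ code itself): pre-zeroing a buffer only matters for accumulation, not for plain assignment.

```python
import numpy as np

out = np.empty(4)    # buffer with arbitrary initial contents
src = np.arange(4.0)

out[:] = src         # assignment overwrites every element, so a prior
                     # zero-fill (like vec.zero()) would be wasted work
# out[:] += src      # accumulation, by contrast, reads the old contents
                     # and would require a zeroed buffer first
```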
paddle/function/SwitchOp.cpp
Outdated
* Argument in this Function:
* \param pad_ The same meaning as it in PadFunc.
* \param inputs The gradient with respect to the output value of PadFunc.
* \param outputs The gradient with respect to the input value of PadFunc.
The above comments are not correct.
Thx.
Layer::forward(passType);
MatrixPtr input = inputLayers_[0]->getOutputValue();
size_t batchSize = input->getHeight();
// cout<<"useGpu:"<<useGpu(deviceId_)<<endl;
Remove this line.
Thx.
inW_ = img_conf.img_size();
inC_ = img_conf.channels();
createFunction(forward_, "NCHW2NHWC", FuncConfig());
createFunction(backward_, "NHWC2NCHW", FuncConfig());
Maybe the names forward_ and backward_ are not quite good. How about nchw2nhwc_ and nhwc2nchw_?
I can't rename forward_ and backward_ directly, because they are defined in Layer.h. So I added std::vector<std::shared_ptr<FunctionBase>> nchw2nhwc_; and std::vector<std::shared_ptr<FunctionBase>> nhwc2nchw_; to PixelSoftmaxLayer.h to fix this problem. Thx for your suggestion.
@wrap_name_default('pixel_softmax')
def pixel_softmax_layer(input, name=None, layer_attr=None):
    """
    This layer calculate softmax in image channel dimension
Need more detailed comments.
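For illustration only, a more detailed docstring in the style of the surrounding helpers might look like this (a sketch; the wording and the usage example are assumptions, not the merged text):

```python
def pixel_softmax_layer(input, name=None, layer_attr=None):
    """
    Apply softmax over the channel dimension independently for each pixel.
    The NCHW input is switched to NHWC, softmax is computed along the last
    (channel) axis, and the backward pass switches the order back to NCHW.

    The example usage is:

    .. code-block:: python

        pixel_sm = pixel_softmax_layer(input=layer)

    :param input: The input layer.
    :type input: LayerOutput
    :param name: Name of this layer.
    :type name: basestring
    :param layer_attr: Extra layer attributes.
    :type layer_attr: ExtraLayerAttribute
    :return: LayerOutput object.
    :rtype: LayerOutput
    """
```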
:param name: Name of this layer.
:type name: basestring
:param input: The input layer.
:type input: LayerOutput
It's better to put input in front of name, to keep the order the same as the input arguments.
Fixed.
if isinstance(input, LayerOutput):
    input = [input]
elif isinstance(input, Projection):
    input = [input]
This layer cannot support Projection. Only mixed_layer can use a Projection as input.
Fixed.
elif isinstance(input, Projection):
    input = [input]
else:
    assert isinstance(input, collections.Sequence)
This only supports one input layer. Lines 5902 to 5907 should be:
assert isinstance(input, LayerOutput)
Fixed.
assert isinstance(input, collections.Sequence)
l = Layer(
    name=name,
    inputs=[x.name for x in input],
inputs=input,
Fixed.
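Putting the three review points together, the input handling could be reduced to something like the following sketch (assumed to live inside layers.py, where LayerOutput and Layer are in scope; the merged code may differ):

```python
# Only a single LayerOutput input is supported; Projection inputs are
# reserved for mixed_layer, so reject everything else up front.
assert isinstance(input, LayerOutput)
input = [input]

l = Layer(
    name=name,
    inputs=input,          # pass the LayerOutput objects directly,
                           # per the reviewer's `inputs=input,` suggestion
    type='pixel_softmax',  # hypothetical type string, for illustration
)
```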
2. Make SwitchOrderLayer support for softmax activation 3. Fix bugs
Force-pushed the pixel_softmax_layer branch from e35167a to ec236f4.
What does has_depth() mean in the SwitchOrderLayer(LayerBase) function in config_parser.py? I encounter an error: LayerConfig object has no attribute 'has_depth()'. Thx
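A hedged note on this question: if LayerConfig is a generated protobuf message, the C++-style accessor has_depth() does not exist on the Python object; generated Python messages expose HasField() instead. A sketch, assuming an optional 'depth' field exists on the message:

```python
# Python protobuf messages use HasField(), not C++-style has_depth().
# 'depth' here is an assumed optional field name on LayerConfig.
if layer_conf.HasField('depth'):
    depth = layer_conf.depth
```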
fix #2787