This repository has been archived by the owner on Jan 3, 2023. It is now read-only.

Reshape #77

Merged
merged 3 commits into from
Nov 21, 2017

Conversation

Krovatkin
Contributor

Description

Adding a Reshape operation

Checklist

Essentials

  • Passed code style checking (make lint)
  • Changes are complete (i.e. I finished coding on this PR)
  • All changes have test coverage (We decided on no tests)

Changes

  • MxNet's Reshape mapped to NGraph's Reshape
  • Handling added for the corner case of an empty shape

Contributor

@mbrookhart left a comment

LGTM

@@ -176,6 +176,26 @@ void Emitter::CreateUnaryOps() {
auto oneeighty = makeConstant(node, "180");
return op_map_[node->inputs_[0]] * (pi / oneeighty);
};
ngraph_op_funcs_["reshape"] = [this](const NodePtr& node) {
auto child = op_map_[node->inputs_[0]];
Contributor

Small question. Does Ngraph support the negative shape values that MXNet does? Will that pose a problem? I ask because ngraph's return types are typically size_t, which is unsigned; that would screw up the expected values for MXNet.


Is a negative shape a reversal?

Contributor Author

@Krovatkin Nov 17, 2017

@diyessi MxNet runs an analysis called InferShape, and the goal of this analysis (at least in the case of Reshape) is to produce (normalize/canonicalize) shapes with positive dimensions. The inputs that a user supplies can be negative, and they define a micro reshape language of sorts:

- ``-1`` infers the dimension of the output shape using the remainder of the input dimensions
- ``-2`` copies all/the remainder of the input dimensions to the output shape
- ``-3`` uses the product of two consecutive dimensions of the input shape as the output dimension
- ``-4`` splits one dimension of the input into the two dimensions passed subsequent to -4 in the shape (which can contain -1)

If the argument `reverse` is set to 1, the special values are inferred from right to left.

This micro reshape language is "parsed" by InferShape which sets the output shape for a particular reshape operation and this output shape is what we are using.
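To make the ``-1`` rule above concrete, here is a minimal sketch (hypothetical code, not from this PR; `infer_reshape` and the plain `std::vector` shape types are stand-ins) of how a single -1 in a target shape gets normalized into a positive dimension:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Normalize a target shape containing at most one -1 into a fully
// positive shape, by dividing out the known dimensions from the
// total element count of the input shape.
std::vector<size_t> infer_reshape(const std::vector<size_t>& in,
                                  const std::vector<long>& target) {
  size_t in_elems = 1;
  for (auto d : in) in_elems *= d;

  size_t known = 1;     // product of the explicitly given dimensions
  long infer_pos = -1;  // position of the -1, if any
  for (size_t i = 0; i < target.size(); ++i) {
    if (target[i] == -1) {
      infer_pos = static_cast<long>(i);
    } else {
      known *= static_cast<size_t>(target[i]);
    }
  }

  std::vector<size_t> out(target.begin(), target.end());
  if (infer_pos >= 0) {
    assert(known != 0 && in_elems % known == 0);
    out[infer_pos] = in_elems / known;  // fill in the inferred dimension
  }
  return out;
}
```

For example, reshaping a (2, 3, 4) input with target (6, -1) yields (6, 4). The -2/-3/-4 rules need more bookkeeping but follow the same normalize-then-emit pattern.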

Contributor Author

@ransford2011

Does Ngraph support the negative shape values that MXNet does?

Ngraph doesn't support negative shapes.

Will that pose a problem?

That's a very good question. I suspect that after InferShape runs, it normalizes all shapes into shapes with positive dimensions, but @mbrookhart is better suited to answer this question.


NumPy would probably make a negative shape reverse the traversal or something like that, so I think these are pretty MXNet-specific.

For a reshape, we need to know how indices in the pre-shape map to indices in the post-shape and vice versa. The XLA definition was simpler than what we did in Python, particularly in C++.
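That pre-/post-shape index mapping can be sketched with two row-major helpers (hypothetical code, not part of this PR): reshape preserves the flat row-major offset, so a multi-index in the old shape maps to the multi-index with the same offset in the new shape:

```cpp
#include <cstddef>
#include <vector>

// Row-major flat offset of a multi-index within a shape.
size_t flat_index(const std::vector<size_t>& idx,
                  const std::vector<size_t>& shape) {
  size_t off = 0;
  for (size_t i = 0; i < shape.size(); ++i) off = off * shape[i] + idx[i];
  return off;
}

// Multi-index within `shape` for a given row-major flat offset.
std::vector<size_t> unflatten(size_t off, const std::vector<size_t>& shape) {
  std::vector<size_t> idx(shape.size());
  for (size_t i = shape.size(); i-- > 0;) {
    idx[i] = off % shape[i];
    off /= shape[i];
  }
  return idx;
}
```

Mapping index (1, 2) of a (2, 3) tensor into its (3, 2) reshape is then `unflatten(flat_index({1, 2}, {2, 3}), {3, 2})`, which gives (2, 1).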

Contributor

I think shapes are strictly positive after MXNet's InferShape pass, but I can't seem to find enforcement of that idea.

Contributor Author

@mbrookhart that's why I was suggesting adding an assertion to TShape_to_NShape. We put it in and move on with our lives. If this assertion turns out to be false, we go and fix the individual cases that need fixing.

Contributor

Sure, that's okay with me.

@@ -48,6 +48,18 @@ inline To convert_shapes(const Ti& inshape) {
return shape;
}

template <>
Contributor

I see that you have made this a template, but it's not really a template. Did you make this a template to avoid a compiler error sir?? 😆

Contributor Author

convert_shapes is already using templates; specializing it for this particular pair <nnvm::TShape, ngraph::Shape> allows us to make sure that the other types won't get this extra error checking.
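The specialization pattern under discussion can be sketched as follows (a hedged illustration: `TShape` and `NShape` here are plain vector aliases standing in for nnvm::TShape and ngraph::Shape, and the thrown type is illustrative): the generic template stays permissive, and only the one conversion pair gains the negative-dimension check:

```cpp
#include <cstddef>
#include <stdexcept>
#include <vector>

// Stand-ins for nnvm::TShape / ngraph::Shape (hypothetical aliases).
using TShape = std::vector<int>;
using NShape = std::vector<size_t>;

// Generic conversion: no extra validation for other type pairs.
template <typename To, typename Ti>
To convert_shapes(const Ti& inshape) {
  To shape;
  for (const auto& s : inshape) shape.push_back(s);
  return shape;
}

// Full specialization for this one pair adds the negative-dimension
// check; all other instantiations keep the permissive generic body.
template <>
NShape convert_shapes<NShape, TShape>(const TShape& inshape) {
  NShape shape;
  for (const auto& s : inshape) {
    if (s < 0) throw std::invalid_argument("negative dimension in TShape");
    shape.push_back(static_cast<size_t>(s));
  }
  return shape;
}
```

Because only the `<NShape, TShape>` instantiation is specialized, converting between any other pair of shape-like containers keeps the unchecked generic behavior.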

EXPECT_EQ(TShape_to_NShape(nnvm::TShape{1}), ngraph::Shape{1});
EXPECT_EQ(TShape_to_NShape(nnvm::TShape{2, 3, 4, 5, 6}),
(ngraph::Shape{2, 3, 4, 5, 6}));
EXPECT_THROW(TShape_to_NShape(nnvm::TShape{2, 3, 4, -1}), const char*);
Contributor

Nice!! Thanks for adding the test!

Contributor Author

@ransford2011 the more the merrier :-) C++ tests are pretty cheap.

inline ngraph::Shape convert_shapes(const nnvm::TShape& inshape) {
  ngraph::Shape shape;
  for (const auto& s : inshape) {
    if (s < 0) throw "negative dimension in TShape";
    shape.push_back(s);
  }
  return shape;
}
Contributor

Way to go illustrating Defensive Programming 💯.

Contributor

@ransford2011 left a comment

LGTM. I think we should either make that function templated or make the code work without having to templatize that function.

@mbrookhart merged commit 428b8b9 into ngraph-integration-dev on Nov 21, 2017
@mbrookhart deleted the reshape branch on November 21, 2017 at 23:12