[Relay] Conv2d grad #3636

Merged: vinx13 merged 3 commits from feature/conv_grad into apache:master on Aug 29, 2019

Conversation

@vinx13 (Member) commented Jul 27, 2019

No description provided.

@vinx13 force-pushed the feature/conv_grad branch from fc66ed8 to d365466 on August 14, 2019 at 00:14
@vinx13 marked this pull request as ready for review on August 14, 2019 at 00:15
@vinx13 force-pushed the feature/conv_grad branch 3 times, most recently from 5a0ced6 to 1b77b72, on August 14, 2019 at 00:22
@MarisaKirisame (Contributor) commented

How is the numerical gradient going?

@vinx13 (Member, Author) commented Aug 14, 2019

@MarisaKirisame I disabled the numerical tests for now.

@MarisaKirisame (Contributor) commented

Don't stress about it. It might be an error in the numerical test.

@vinx13 force-pushed the feature/conv_grad branch 5 times, most recently from 788a2ad to e1f0bdc, on August 14, 2019 at 17:47
@SWu (Contributor) commented Aug 14, 2019

I tried playing around with this PR but I'm getting:

  File "/usr/tvm/python/tvm/relay/op/_tensor_grad.py", line 257, in conv2d_grad
    data_shape = get_const_tuple(data.checked_type.shape)
  File "/usr/tvm/python/tvm/relay/expr.py", line 47, in checked_type
    raise ValueError("The type checker has not populated"
ValueError: The type checker has not populated the checked_type for this node

Is there something I need to call to populate checked_type?

@vinx13 (Member, Author) commented Aug 14, 2019

@SWu You need to run the InferType pass before calling the gradient pass.
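For illustration, a minimal sketch of that ordering, assuming the Relay Python API of roughly this era (relay.Module, transform.InferType, transform.gradient); the toy conv2d function is made up for the example:

```python
import tvm
from tvm import relay

# Made-up toy function: a single conv2d whose gradient we want.
data = relay.var("data", shape=(1, 3, 32, 32))
weight = relay.var("weight", shape=(8, 3, 3, 3))
conv = relay.nn.conv2d(data, weight, padding=(1, 1), channels=8, kernel_size=(3, 3))
func = relay.Function([data, weight], conv)

mod = relay.Module.from_expr(func)
# Run type inference first, so checked_type is populated on every node...
mod = relay.transform.InferType()(mod)
# ...and only then run the gradient pass over the now fully typed function.
grad_func = relay.transform.gradient(mod["main"], mod=mod)
```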

@vinx13 requested review from @junrushao and @masahi on August 14, 2019 at 19:50
@vinx13 (Member, Author) commented on an inline diff in the gradient pass, where

    return f(e);

becomes

    auto ret = f(e);
    ret->checked_type_ = t;
    return ret;

@MarisaKirisame please review this change.
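In effect, the change reuses the type already inferred for the original expression instead of re-running type inference on the rewritten one. Whether writing checked_type_ directly like this is acceptable becomes the main point of debate in the review comments below.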

@SWu (Contributor) commented Aug 14, 2019

@vinx13 I am calling the InferType pass on the function before the gradient pass, but it doesn't seem to make a difference. (For what it's worth, I'm using mode="first_order" for the gradient pass, since there still seem to be missing pieces in the Relay VM executor for running functions generated by the higher-order gradient pass.) If something non-obvious is happening here, we can discuss elsewhere; I was just curious, since no other gradient so far seems to require statically knowing the data shapes before calling the gradient pass.

@vinx13 (Member, Author) commented Aug 14, 2019

@SWu I have updated the gradient pass here so that it copies the checked type before calling the registered gradient function. Have you tried the test case here? If your code differs from the test case, can you share your script so I can take a look?

@SWu (Contributor) commented Aug 14, 2019

@vinx13 I think I narrowed it down to mode="first_order" when calling relay.transform.gradient(). It seems to work otherwise (using the debug executor). I'm not sure whether that's a bug in the first_order (as opposed to higher-order) gradient transform or somewhere else, though.
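Concretely, the two call forms being compared look like this (a sketch reusing mod from the earlier snippet; mode is an actual parameter of relay.transform.gradient):

```python
from tvm import relay

# Default higher-order mode: worked here with the debug executor.
grad_ho = relay.transform.gradient(mod["main"], mod=mod, mode="higher_order")

# First-order mode: the path that hit the checked_type error, until the
# "Fix first order gradient" commit later in this PR.
grad_fo = relay.transform.gradient(mod["main"], mod=mod, mode="first_order")
```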

@vinx13 (Member, Author) commented Aug 14, 2019

@MarisaKirisame (Contributor) commented
@vinx13 It is indeed the best way. I am pretty OK with it. @jroesch, is this OK with you?

@vinx13 (Member, Author) commented Aug 15, 2019

@MarisaKirisame The problem is that we don't have access to the original call in this function (this is VisitExpr_(const OpNode*)).

@MarisaKirisame (Contributor) commented
@vinx13 That is easily solved by requiring the lambda to take an extra argument: the type.
Another thing: the first_order mode shouldn't be used; it supports fewer features than the higher-order mode (even in the first-order case!). For example, it has no tuple support for operator calls/returns or for the whole function's return. I am thinking of deprecating it.
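To illustrate the tuple limitation, a contrived example: a function whose whole return value is a tuple, which (per the comment above) first_order mode could not differentiate while the higher-order mode could:

```python
from tvm import relay

# Contrived function returning a tuple; per MarisaKirisame's comment,
# first_order mode lacked support for tuples in this position.
x = relay.var("x", shape=(3,))
tup_fn = relay.Function([x], relay.Tuple([x * x, x + x]))
```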

@junrushao (Member) commented
If the supporting infra for the non-first-order mode is not quite ready yet, let's keep first order for now.

@vinx13 force-pushed the feature/conv_grad branch 2 times, most recently from 9229983 to 6833eca, on August 15, 2019 at 06:40
@junrushao (Member) commented
Also CC @zhiics if you have time

@zhiics (Member) commented Aug 15, 2019

@junrushao1994 Sure. Will do it today.

@zhiics (Member) left a review:

I am not sure if assigning the type directly is the best way, because my impression is that the type should be retained somehow.

Otherwise, LGTM.

@junrushao (Member) left a review:
I don't object to assigning checked_type directly if there is no other way to do it.

@MarisaKirisame (Contributor) commented
@jroesch I want your input on changing the type directly.

@MarisaKirisame (Contributor) commented
@junrushao1994 @vinx13 Can you wait a bit before getting this merged? I am hesitant about writing the type directly and want to think it over.

@junrushao (Member) commented
@MarisaKirisame No hurry; we are hesitant as well.

@junrushao (Member) commented
Is there any update?

@MarisaKirisame (Contributor) commented
@junrushao1994 I will talk with Jared today.

@MarisaKirisame (Contributor) commented
@junrushao1994 It is OK.

@zhiics (Member) commented Aug 28, 2019

Okay, I think we can merge now. @vinx13 Should we rebase to retrigger the CI, since this PR has been here for a while?

@vinx13 force-pushed the feature/conv_grad branch from 6833eca to 86c1805 on August 29, 2019 at 00:17
@vinx13 force-pushed the feature/conv_grad branch from 86c1805 to 1c3bd84 on August 29, 2019 at 02:46
@vinx13 merged commit d201978 into apache:master on Aug 29, 2019
wweic pushed commits referencing this pull request to wweic/tvm and neo-ai/tvm on Sep 16, 2019, with the message:

* [Relay] Conv2d grad
* Fix test
* Fix first order gradient