
[RELAY]prelu op support #2016

Merged 2 commits into apache:master from relay_prelu on Oct 29, 2018

Conversation

siju-samuel (Member)

#1799

Thanks for contributing to TVM! Please refer to the contribution guideline at https://docs.tvm.ai/contribute/ for useful information and tips. After the pull request is submitted, please request code reviews from Reviewers.

siju-samuel (Member, Author)

@srkreddy1238 @nishi-t @slyubomirsky @yuruofeifei @MarisaKirisame Could you please help review this PR? TIA

slyubomirsky (Contributor)

LGTM, good test coverage

CHECK(ndim == 4)
<< "Input data should be 4D, but got " << ndim;

CHECK(reporter->Assert(param->axis < make_const(Int(64), ndim)))
tqchen (Member)

We should avoid the abusive use of symbolic Assert here: since param->axis is an int, we can check the integer relation directly. Assert is only needed if one of the input expressions is symbolic.
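
For illustration, a minimal sketch of the direct integer check being suggested (names follow the snippet above; this is not necessarily the merged code):

// param->axis and ndim are plain integers here, so an ordinary CHECK suffices;
// reporter->Assert is only warranted when one of the operands is symbolic.
CHECK(param->axis < static_cast<int>(ndim))
    << "Wrong axis (" << param->axis << "), input has only " << ndim << " dimensions";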

tqchen (Member)

This seems to be a common mistake pattern in recent PRs. Can you add more comments to Assert to document the possible misuse cases and the intended use of the API?
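
One possible shape for such documentation (wording assumed, not the actual TVM source) is a doxygen note on the Assert declaration:

/*!
 * \brief Assert a constraint between type/shape expressions.
 * \note Use Assert only when at least one operand is a symbolic expression
 *       (e.g. a shape variable). For plain integer attributes, compare the
 *       values directly with CHECK instead of building a symbolic assert.
 */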

CHECK(param != nullptr);

size_t ndim = data->shape.size();
CHECK(ndim == 4)
tqchen (Member)

This operator doesn't need this restriction.
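
A minimal sketch of dropping the rank restriction (names follow the snippet above; this is not necessarily the merged code), keeping only an axis-bound check:

CHECK(param != nullptr);
size_t ndim = data->shape.size();
// No fixed-rank requirement; only verify that the axis is valid for this input.
CHECK(param->axis >= 0 && static_cast<size_t>(param->axis) < ndim)
    << "Wrong axis (" << param->axis << ") for input of rank " << ndim;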

@tqchen tqchen merged commit e286e63 into apache:master Oct 29, 2018
tqchen (Member) commented Oct 29, 2018

Thanks @slyubomirsky @siju-samuel, this is now merged.

eqy pushed a commit to eqy/tvm that referenced this pull request Oct 29, 2018
@siju-samuel siju-samuel deleted the relay_prelu branch November 27, 2018 06:42
FrozenGene pushed a commit to FrozenGene/tvm that referenced this pull request Dec 27, 2018
wweic pushed a commit to neo-ai/tvm that referenced this pull request Feb 20, 2019