
Fix prelu bug in pytorch frontend #8192

Merged 6 commits on Jun 4, 2021

Conversation

YuhengHuang42
Contributor

This PR fixes the PReLU bug mentioned in #8184.
Reference: https://pytorch.org/docs/stable/generated/torch.nn.PReLU.html#torch.nn.PReLU

In short, there are two situations we need to consider:

  1. When the input has fewer than 2 dims. PReLU in PyTorch can handle this, but in TVM the default axis of tvm.relay.nn.prelu is 1, so calling the function directly raises an error.
  2. When the input has 2 or more dims, the channel dimension is greater than 1, and num_parameters = 1. In this case the alpha parameter needs to be broadcast to the channel dimension.
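For illustration, here is a minimal NumPy sketch of the two situations above (this is not the actual TVM frontend code, just a hypothetical model of the intended behavior; the function name and signature are assumptions):

```python
import numpy as np

def prelu(x, alpha, axis=1):
    """PReLU: f(x) = x if x >= 0 else alpha * x, with alpha learned per channel."""
    x = np.asarray(x)
    alpha = np.asarray(alpha)
    if x.ndim < 2:
        # Case 1: there is no channel axis, so a channel-wise formulation
        # with axis=1 cannot apply; fall back to an elementwise alpha.
        return np.where(x >= 0, x, alpha.reshape(-1)[0] * x)
    # Case 2: a single learned alpha (num_parameters = 1) must be
    # broadcast across a channel dimension larger than 1.
    num_channels = x.shape[axis]
    if alpha.size == 1 and num_channels > 1:
        alpha = np.full((num_channels,), alpha.reshape(-1)[0], dtype=x.dtype)
    # Reshape alpha so it broadcasts along the channel axis only.
    shape = [1] * x.ndim
    shape[axis] = num_channels
    return np.where(x >= 0, x, alpha.reshape(shape) * x)
```

In the 1-d case the alpha is applied elementwise, and in the batched case a scalar alpha is expanded to one value per channel before the channel-wise broadcast, mirroring the two fixes described above.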

@masahi Please take a look at this PR, thanks!

@masahi
Member

masahi commented Jun 4, 2021

Thanks @YuhengHuang42! Please fix the lint error by running black on test_forward.py.

@masahi masahi linked an issue Jun 4, 2021 that may be closed by this pull request
@masahi masahi self-assigned this Jun 4, 2021
@masahi
Member

masahi commented Jun 4, 2021

Can you remove the unrelated diffs in test_forward.py? My black version is 20.8b1, and running it on test_forward.py makes no modifications.

@YuhengHuang42
Contributor Author

YuhengHuang42 commented Jun 4, 2021

> Can you remove the unrelated diffs in test_forward.py? My black version is 20.8b1, and running it on test_forward.py makes no modifications.

The latest version should be OK.

I encountered some weird bugs when using black, so I tried different versions of it to compare the results. In the process I accidentally changed the code style, which produced too many modifications, so I rolled those back. I think the problem is fixed now.

I will wait to see if this version can be successfully built or not.

@masahi masahi merged commit 82cf197 into apache:main Jun 4, 2021
@masahi
Member

masahi commented Jun 4, 2021

Thanks @YuhengHuang42!

trevor-m pushed a commit to trevor-m/tvm that referenced this pull request Jun 17, 2021
* Fix prelu bug in pytorch frontend

* Fix lint error

* fix lint error

* Fix lint error

* Try to fix lint error

* Fix lint error

Co-authored-by: huangyuheng <32429436+hyhzxhy@users.noreply.github.com>
trevor-m pushed a commit to neo-ai/tvm that referenced this pull request Jun 17, 2021 (same commit message as above)
Successfully merging this pull request may close these issues.

[Relay][Frontend][Pytorch] Prelu definition mismatch in pytorch