
Conversation

@Deivanayaki-S (Contributor)

This PR adds support for the relu6 operation to the exported-program and FX-graph translators in the PyTorch frontend. With this support, the following torchvision models now run successfully:

  1. ofa_supernet_mbv3_w10
  2. ofa_supernet_mbv3_w12
  3. GhostNet
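
A minimal sketch of exercising the new path (assuming a TVM build that includes this PR and a PyTorch version providing torch.export; the model and tensor shape are illustrative):

```python
import torch
from torch.export import export
from tvm.relax.frontend.torch import from_exported_program

class Relu6Model(torch.nn.Module):
    def forward(self, x):
        return torch.nn.functional.relu6(x)

# Export the model and translate it to a Relax IRModule;
# relu6 is now handled by the exported-program translator.
exported = export(Relu6Model(), (torch.randn(1, 3, 8, 8),))
mod = from_exported_program(exported)
mod.show()
```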

@Deivanayaki-S marked this pull request as ready for review on May 7, 2025, 05:26.
@Deivanayaki-S (Contributor, Author)

@Hzfengsy Could you please review this PR?

```diff
-nn.ReLU6: lambda node: self.block_builder.emit(
-    relax.op.clip(self.env[node.args[0]], 0, 6)
-),
+nn.ReLU6: self._unary_op(relax.op.nn.relu6),
```
@Hzfengsy (Member)

I wonder why we need a relu6 op defined in Relax? In other words, why do we need to change the current implementation?

@Deivanayaki-S (Contributor, Author)

@Hzfengsy Since the current clip mapping only handled the FX translator’s nn.ReLU6, it left other mappings (relu6.default, relu6_.default, and functional.relu6 in FX) unaddressed. Therefore, I added ReLU6 as a Relax op to unify the implementation.
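
An illustrative sketch only (the converter and table names below are hypothetical, not TVM's actual frontend code) of how all four torch-side spellings can share a single lowering:

```python
from tvm import relax

def convert_relu6(bb: relax.BlockBuilder, data: relax.Expr) -> relax.Var:
    # Every spelling lowers to the same clip(x, 0, 6).
    return bb.emit(relax.op.clip(data, 0, 6))

# Hypothetical dispatch table: each torch-side variant of relu6
# routes to the one converter above.
RELU6_CONVERTERS = {
    "relu6.default": convert_relu6,     # exported-program (ATen) op
    "relu6_.default": convert_relu6,    # in-place ATen variant
    "functional.relu6": convert_relu6,  # FX functional call
    "nn.ReLU6": convert_relu6,          # FX module call
}
```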

@Hzfengsy (Member)

Thanks for explaining it. My concern is that relu6 should not be a Relax operator, since it can be represented by existing ops (i.e., clip).

We can keep the relax.op.nn.relu6 interface as sugar, but we should not register a new Relax op. To be clear, it's better to remove the new definitions in:

  • src/relax/op/nn/nn.cc
  • src/relax/op/nn/nn.h
  • src/contrib/msc/framework/tvm/relax_opcode.cc
  • python/tvm/topi/nn/elemwise.py
  • python/tvm/relax/transform/legalize_ops/

and change the code in python/tvm/relax/op/nn/nn.py to:

```python
return relax.op.clip(data, 0, 6)
```
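
Put together, a minimal sketch of that sugar in python/tvm/relax/op/nn/nn.py (the imports and docstring wording are assumptions; inside the real module, clip would already be in scope):

```python
from tvm.relax import Expr
from tvm.relax.op import clip

def relu6(data: Expr) -> Expr:
    """Rectified linear unit 6: relu6(x) = min(max(x, 0), 6)."""
    # Pure sugar over clip; no new Relax operator is registered.
    return clip(data, 0, 6)
```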

@tqchen (Member) commented on May 10, 2025

Agree with @Hzfengsy; we should simplify the Relax core op set and not define relu6 as a Relax op.

@Deivanayaki-S (Contributor, Author)

@Hzfengsy @tqchen Thanks for pointing it out :), I've updated the code accordingly.

Hzfengsy merged commit 8a6c9bf into apache:main on May 10, 2025
10 checks passed
ShiboXing pushed a commit to ShiboXing/tvm that referenced this pull request Aug 10, 2025
…ph (apache#17918)

* add relu6 op support into relax frontend

* fix lint issues

* fix unity issue in test script

* fix issues in msc test script

* fix relu6 layout value in msc test script

* define relu6 op in relax_opcode file

* define relu6 op in torch codegen file

* update relu6 op implementation using clip op

* update test script

---------

Co-authored-by: deivanayakisankaralingam <deiva@Deivanayaki>
Co-authored-by: deivanayakisankaralingam <deiva@Deivanayaki.>
