

@Deivanayaki-S (Contributor) commented:

This PR refactors the SELU activation function so that it is implemented at the core level using decomposed operations. SELU can now be accessed directly via R.nn.selu, making it more efficient and easier to use in TVM models.
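For context, SELU(x) = scale * (max(0, x) + min(0, alpha * (exp(x) - 1))) with the standard constants alpha ≈ 1.6733 and scale ≈ 1.0507, so it decomposes naturally into primitive elementwise ops. Below is a minimal usage sketch of the new op from TVMScript, assuming a TVM build that includes this change; the module name and tensor shape are illustrative and not taken from the PR:

```python
# Minimal sketch (illustrative, not the PR's test code): calling the new
# core-level SELU op from TVMScript. Assumes a TVM build with this change.
from tvm.script import ir as I
from tvm.script import relax as R


@I.ir_module
class SeluExample:
    @R.function
    def main(x: R.Tensor((2, 3), "float32")) -> R.Tensor((2, 3), "float32"):
        # SELU(x) = scale * (max(0, x) + min(0, alpha * (exp(x) - 1)))
        # with alpha ~= 1.6733 and scale ~= 1.0507.
        gv = R.nn.selu(x)
        return gv
```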

@Deivanayaki-S Deivanayaki-S marked this pull request as ready for review April 2, 2025 11:08
@Hzfengsy Hzfengsy merged commit cad1c68 into apache:main Apr 3, 2025
11 checks passed
ShiboXing pushed a commit to ShiboXing/tvm that referenced this pull request Aug 10, 2025
…el Ops (apache#17797)

* Integrate SELU into core ops for native R.nn.selu support

* Fix trailing whitespace issue

* Fix SELU mapping issue in fx_graph and a lint issue

* Update the test script for SELU in fx_graph

* Modify the test script to fix the SELU module check

* Format documentation

---------

Co-authored-by: deivanayakisankaralingam <deiva@Deivanayaki>