
Add GELU activation function #843

Merged 5 commits into open-mmlab:master on Feb 22, 2021
Conversation

@ftbabi (Contributor) commented Feb 20, 2021

The VisionTransformer needs the GELU activation function.

codecov bot commented Feb 20, 2021

Codecov Report

Merging #843 (0e3c081) into master (ca47ae1) will decrease coverage by 0.00%.
The diff coverage is 62.50%.


@@            Coverage Diff             @@
##           master     #843      +/-   ##
==========================================
- Coverage   62.97%   62.97%   -0.01%     
==========================================
  Files         145      145              
  Lines        8686     8693       +7     
  Branches     1571     1572       +1     
==========================================
+ Hits         5470     5474       +4     
- Misses       2951     2953       +2     
- Partials      265      266       +1     
Flag Coverage Δ
unittests 62.97% <62.50%> (-0.01%) ⬇️

Flags with carried forward coverage won't be shown.

Impacted Files Coverage Δ
mmcv/cnn/bricks/activation.py 87.50% <62.50%> (-12.50%) ⬇️

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update ca47ae1...0e3c081.

from .registry import ACTIVATION_LAYERS

for module in [
        nn.ReLU, nn.LeakyReLU, nn.PReLU, nn.RReLU, nn.ReLU6, nn.ELU,
-       nn.Sigmoid, nn.Tanh
+       nn.Sigmoid, nn.Tanh, F.gelu
+       if TORCH_VERSION == 'parrots' or TORCH_VERSION <= '1.3.1' else nn.GELU
]:
    ACTIVATION_LAYERS.register_module(module=module)
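For context, activations registered in ACTIVATION_LAYERS are normally instantiated from a config dict; a minimal usage sketch (assuming mmcv's build_activation_layer helper) would be:

    import torch
    from mmcv.cnn import build_activation_layer

    # Build the newly registered GELU the same way other registered
    # activations are built, then apply it to a tensor.
    act = build_activation_layer(dict(type='GELU'))
    out = act(torch.randn(2, 8))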
Member commented:

I suggest wrapping F.gelu in a GELU module for PyTorch 1.3.x, since F.gelu cannot be called like nn.GELU.
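For illustration, a minimal sketch of what such a wrapper could look like (assuming TORCH_VERSION from mmcv.utils is in scope; this is not necessarily the exact code that was merged):

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    from .registry import ACTIVATION_LAYERS


    class GELU(nn.Module):
        """Module wrapper around F.gelu so that GELU can be built from the
        registry on PyTorch versions that lack nn.GELU."""

        def forward(self, input: torch.Tensor) -> torch.Tensor:
            return F.gelu(input)


    # Register the wrapper only where nn.GELU is unavailable.
    if TORCH_VERSION == 'parrots' or TORCH_VERSION <= '1.3.1':
        ACTIVATION_LAYERS.register_module(module=GELU)
    else:
        ACTIVATION_LAYERS.register_module(module=nn.GELU)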

@ftbabi (Author) replied:

Got it! Thanks a lot!

@hellock merged commit 8735815 into open-mmlab:master on Feb 22, 2021
@ftbabi deleted the add_gelu_activation branch on February 22, 2021 at 06:00
@hiyyg (Contributor) commented Feb 22, 2021

Also add SiLU?
