Add GELU activation function #843
Conversation
Codecov Report
```diff
@@            Coverage Diff             @@
##           master     #843      +/-   ##
==========================================
- Coverage   62.97%   62.97%   -0.01%
==========================================
  Files         145      145
  Lines        8686     8693       +7
  Branches     1571     1572       +1
==========================================
+ Hits         5470     5474       +4
- Misses       2951     2953       +2
- Partials      265      266       +1
```
mmcv/cnn/bricks/activation.py (outdated)
```diff
 from .registry import ACTIVATION_LAYERS

 for module in [
         nn.ReLU, nn.LeakyReLU, nn.PReLU, nn.RReLU, nn.ReLU6, nn.ELU,
-        nn.Sigmoid, nn.Tanh
+        nn.Sigmoid, nn.Tanh, F.gelu
+        if TORCH_VERSION == 'parrots' or TORCH_VERSION <= '1.3.1' else nn.GELU
 ]:
     ACTIVATION_LAYERS.register_module(module=module)
```
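For context, registering the module in ACTIVATION_LAYERS makes it buildable from a config dict. A minimal usage sketch, assuming mmcv's standard build_activation_layer helper:

```python
import torch
from mmcv.cnn import build_activation_layer

# Build the newly registered GELU from a config dict, like any
# other activation in ACTIVATION_LAYERS.
act = build_activation_layer(dict(type='GELU'))
out = act(torch.randn(2, 4))  # elementwise GELU, same shape as input
```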
I suggest wrapping a GELU module for PyTorch 1.3.x, since F.gelu cannot be called like nn.GELU.
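A minimal sketch of such a wrapper (a hypothetical implementation, not necessarily the code that finally landed; the class name GELU is chosen to mirror nn.GELU):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class GELU(nn.Module):
    """Wrap the F.gelu function in an nn.Module so that it can be
    registered in ACTIVATION_LAYERS and instantiated like nn.GELU
    on PyTorch versions that predate the nn.GELU class."""

    def forward(self, input: torch.Tensor) -> torch.Tensor:
        return F.gelu(input)
```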
Got it! Thanks a lot!
Also added.
The VisionTransformer needs the GELU activation function.
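For reference, GELU (Hendrycks & Gimpel, 2016) is defined as GELU(x) = x · Φ(x), where Φ is the cumulative distribution function of the standard normal distribution; both F.gelu and nn.GELU compute this function.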