deep_learning/boosting_models_pytorch/weight_initialization_activation_functions/ #25
Replies: 1 comment
-
I would like to clarify two points in this section. It says that tanh saturates at 0 and 1 and therefore has no signal to update the weights. Is that really the case? tanh is zero-centered, which is its advantage over sigmoid. Also, doesn't it saturate at -1 and 1 rather than 0 and 1?
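A quick numerical check supports the question's point (a minimal sketch using only Python's standard library, not code from the tutorial itself): tanh is zero-centered and saturates at -1 and +1, and its derivative 1 - tanh²(x) vanishes in the saturated regions, which is where the "no signal to update weights" concern actually applies.

```python
import math

# tanh(x) = (e^x - e^-x) / (e^x + e^-x): zero-centered,
# saturating at -1 and +1 (unlike sigmoid, which saturates at 0 and 1).
for x in [-10, -2, 0, 2, 10]:
    print(f"tanh({x:+d}) = {math.tanh(x):+.4f}")

# In the saturated regions the derivative 1 - tanh(x)^2 goes to ~0,
# so the gradient signal for weight updates vanishes there.
for x in [-10, 0, 10]:
    grad = 1 - math.tanh(x) ** 2
    print(f"d/dx tanh at x={x:+d}: {grad:.6f}")
```

So the saturation values in the tutorial text look like a mix-up with sigmoid; the vanishing-gradient claim itself holds for both functions near saturation.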
-
https://www.deeplearningwizard.com/deep_learning/boosting_models_pytorch/weight_initialization_activation_functions/