Added Scaled Exponential Linear Unit Activation Function #9027
Conversation
@tianyizheng02 Thanks for your reply in the discussion section! All checks passed now.
def scaled_exponential_linear_unit(
    vector: np.ndarray, alpha: float = 1.6732, _lambda: float = 1.0507
Is there a reason why _lambda has an underscore? Is the user not meant to change this coefficient?
Yes, firstly, lambda is a reserved keyword in Python. Secondly, both alpha and lambda are fixed constants.
The user can change these values, which may yield slightly different behaviour from the function, but the default values given to them are alpha: float = 1.6732 and _lambda: float = 1.0507.
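For reference, the standard SELU definition behind these defaults is, roughly:

$$
\mathrm{SELU}(x) = \lambda \cdot \begin{cases} x & \text{if } x > 0 \\ \alpha \, (e^{x} - 1) & \text{if } x \le 0 \end{cases}
$$

where alpha ≈ 1.6732 and lambda ≈ 1.0507 are the constants chosen so that activations stay approximately zero-mean and unit-variance (the self-normalizing property).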
Yes, firstly, lambda is a reserved keyword in Python.
Oh yeah, duh 🤦
Could you rename the variable to something like lambda_ instead? Having an underscore at the start of a variable name generally signifies that the user isn't supposed to use it.
Ah. Yes for sure, I'll get it done right away!
Done
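For illustration, a minimal sketch of what the renamed function might look like, assuming the standard SELU formula and NumPy broadcasting (the docstring and doctests in the actual file may differ):

import numpy as np


def scaled_exponential_linear_unit(
    vector: np.ndarray, alpha: float = 1.6732, lambda_: float = 1.0507
) -> np.ndarray:
    # Positive inputs are scaled linearly by lambda_; non-positive inputs
    # follow the scaled exponential branch lambda_ * alpha * (exp(x) - 1).
    return lambda_ * np.where(vector > 0, vector, alpha * (np.exp(vector) - 1))

A call such as scaled_exponential_linear_unit(np.array([1.3, 3.7, 2.4])) then simply scales the all-positive input by lambda_.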
…ms#9027)

* Added Scaled Exponential Linear Unit Activation Function
* [pre-commit.ci] auto fixes from pre-commit.com hooks; for more information, see https://pre-commit.ci
* Update scaled_exponential_linear_unit.py
* Update scaled_exponential_linear_unit.py
* [pre-commit.ci] auto fixes from pre-commit.com hooks; for more information, see https://pre-commit.ci
* Update scaled_exponential_linear_unit.py
* Update scaled_exponential_linear_unit.py
* Update scaled_exponential_linear_unit.py
* [pre-commit.ci] auto fixes from pre-commit.com hooks; for more information, see https://pre-commit.ci
* Update scaled_exponential_linear_unit.py
* Update scaled_exponential_linear_unit.py

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Added the Scaled Exponential Linear Unit (SELU) activation function under TheAlgorithms/Python/neural_network/activation_functions. The description of SELU is taken from the reference link provided in the top comment section of scaled_exponential_linear_unit.py.
Fixes #9010
Checklist: