
Commit 80a2087

Added Softplus activation function (#9944)
1 parent c6ec99d commit 80a2087

1 file changed, +37 -0 lines changed
@@ -0,0 +1,37 @@
"""
Softplus Activation Function

Use Case: The Softplus function is a smooth approximation of the ReLU function.
For more detailed information, you can refer to the following link:
https://en.wikipedia.org/wiki/Rectifier_(neural_networks)#Softplus
"""

import numpy as np


def softplus(vector: np.ndarray) -> np.ndarray:
    """
    Implements the Softplus activation function.

    Parameters:
        vector (np.ndarray): The input array for the Softplus activation.

    Returns:
        np.ndarray: The input array after applying the Softplus activation.

    Formula: f(x) = ln(1 + e^x)

    Examples:
    >>> softplus(np.array([2.3, 0.6, -2, -3.8]))
    array([2.39554546, 1.03748795, 0.12692801, 0.02212422])

    >>> softplus(np.array([-9.2, -0.3, 0.45, -4.56]))
    array([1.01034298e-04, 5.54355244e-01, 9.43248946e-01, 1.04077103e-02])
    """
    return np.log(1 + np.exp(vector))


if __name__ == "__main__":
    import doctest

    doctest.testmod()
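
A note on numerics, not part of this commit: the expression np.log(1 + np.exp(vector)) overflows to inf once e^x exceeds the float64 range (around x ≈ 710). Where that matters, a minimal sketch of a numerically stable variant uses np.logaddexp, since ln(1 + e^x) = ln(e^0 + e^x); the name softplus_stable is illustrative, not from the commit.

import numpy as np


def softplus_stable(vector: np.ndarray) -> np.ndarray:
    # ln(1 + e^x) = logaddexp(0, x); np.logaddexp factors out the larger
    # exponent internally, so large positive inputs return x instead of inf.
    # softplus_stable is a hypothetical name, not part of this commit.
    return np.logaddexp(0, vector)


# For example, softplus_stable(np.array([1000.0])) returns array([1000.]),
# where the naive form returns array([inf]).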
