Add documentation for nn.Hardsigmoid and nn.functional.hardsigmoid. (pytorch#38120)

Summary: Pull Request resolved: pytorch#38120

Test Plan: build docs locally and attach a screenshot to this PR.

Differential Revision: D21477815

Pulled By: zou3519

fbshipit-source-id: 420bbcfcbd191d1a8e33cdf4a90c95bf00a5d226
zou3519 authored and facebook-github-bot committed May 8, 2020
1 parent 4157211 commit 172bcdb
Showing 4 changed files with 9 additions and 2 deletions.
6 changes: 6 additions & 0 deletions docs/source/nn.functional.rst
@@ -268,6 +268,12 @@ Non-linear activation functions

.. autofunction:: sigmoid

:hidden:`hardsigmoid`
~~~~~~~~~~~~~~~~~~~~~

.. autofunction:: hardsigmoid


Normalization functions
-----------------------

1 change: 1 addition & 0 deletions docs/source/nn.rst
@@ -111,6 +111,7 @@ Non-linear Activations (weighted sum, nonlinearity)

nn.ELU
nn.Hardshrink
nn.Hardsigmoid
nn.Hardtanh
nn.LeakyReLU
nn.LogSigmoid
2 changes: 1 addition & 1 deletion torch/nn/functional.py
@@ -1617,7 +1617,7 @@ def hardsigmoid(input, inplace=False):
\text{Hardsigmoid}(x) = \begin{cases}
0 & \text{if~} x \le -3, \\
1 & \text{if~} x \ge +3, \\
- x / 6 & \text{otherwise}
+ x / 6 + 1 / 2 & \text{otherwise}
\end{cases}
Args:
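
For reference, a minimal sketch (not part of the commit) that checks the corrected formula against torch.nn.functional.hardsigmoid; it assumes a PyTorch build where the operator is already available:

import torch
import torch.nn.functional as F

def hardsigmoid_reference(x):
    # Piecewise formula from the docstring: 0 for x <= -3, 1 for x >= +3,
    # and x / 6 + 1 / 2 in between.
    return torch.clamp(x / 6 + 0.5, min=0.0, max=1.0)

x = torch.linspace(-5, 5, steps=11)
print(F.hardsigmoid(x))          # built-in operator
print(hardsigmoid_reference(x))  # matches the documented formula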
2 changes: 1 addition & 1 deletion torch/nn/modules/activation.py
@@ -281,7 +281,7 @@ class Hardsigmoid(Module):
\text{Hardsigmoid}(x) = \begin{cases}
0 & \text{if~} x \le -3, \\
1 & \text{if~} x \ge +3, \\
- x / 6 & \text{otherwise}
+ x / 6 + 1 / 2 & \text{otherwise}
\end{cases}
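
A short usage sketch for the module form documented here, again assuming an environment where nn.Hardsigmoid already exists:

import torch
import torch.nn as nn

m = nn.Hardsigmoid()
x = torch.tensor([-4.0, -1.5, 0.0, 1.5, 4.0])
print(m(x))  # tensor([0.0000, 0.2500, 0.5000, 0.7500, 1.0000])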
