torch.nn.modules.activation
Members list
Type members
Classlikes
Applies the log(Softmax(x)) function to an n-dimensional input Tensor. The LogSoftmax formulation can be simplified as:
$$\text{LogSoftmax}(x_{i}) = \log\left(\frac{\exp(x_i)}{\sum_j \exp(x_j)}\right)$$
Example:
import torch.*
val m = nn.LogSoftmax(dim = 1) // apply along dimension 1 (rows)
val input = torch.randn(Seq(2, 3))
val output = m(input) // same shape as input; exp of each row sums to 1
Attributes
- Source: LogSoftmax.scala
Applies the rectified linear unit function element-wise:
$\text{ReLU}(x) = (x)^+ = \max(0, x)$
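Example (a minimal sketch; assuming nn.ReLU takes no constructor arguments and is called like the other activation modules on this page):
import torch.*
val m = nn.ReLU() // assumed no-arg constructor, by analogy with nn.Tanh()
val input = torch.randn(Seq(2))
val output = m(input) // negative entries are clamped to 0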
Attributes
- Source: ReLU.scala
Applies the Softmax function to an n-dimensional input Tensor, rescaling it so that the elements of the n-dimensional output Tensor lie in the range [0, 1] and sum to 1.
Softmax is defined as: $$\text{Softmax}(x_{i}) = \frac{\exp(x_i)}{\sum_j \exp(x_j)}$$
When the input Tensor is a sparse tensor, the unspecified values are treated as -inf.
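Example (a minimal sketch; assuming nn.Softmax takes a dim parameter like nn.LogSoftmax above):
import torch.*
val m = nn.Softmax(dim = 1) // assumed dim parameter, by analogy with nn.LogSoftmax
val input = torch.randn(Seq(2, 3))
val output = m(input) // each row of output sums to 1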
Attributes
- Source: Softmax.scala
Applies the Hyperbolic Tangent (Tanh) function element-wise. Tanh is defined as:
$$\text{Tanh}(x) = \tanh(x) = \frac{\exp(x) - \exp(-x)}{\exp(x) + \exp(-x)}$$
Example:
import torch.*
val m = nn.Tanh()
val input = torch.randn(Seq(2))
val output = m(input) // values squashed into the range (-1, 1)
Attributes
- Source: Tanh.scala