nn_selu {torch}    R Documentation
SELU module
Description
Applied element-wise, as described in the Details section below.
Usage
nn_selu(inplace = FALSE)
Arguments
inplace    (bool, optional): can optionally do the operation in-place. Default: FALSE
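A minimal sketch of what inplace = TRUE does, assuming torch is installed (the torch_equal() comparison is illustrative and not part of this page):

library(torch)

x <- torch_randn(3)
m <- nn_selu(inplace = TRUE)
y <- m(x)
# With inplace = TRUE the activation is expected to overwrite the values in x,
# so x and y should compare equal.
torch_equal(x, y)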
Details
\mbox{SELU}(x) = \mbox{scale} * (\max(0,x) + \min(0, \alpha * (\exp(x) - 1)))
with \alpha = 1.6732632423543772848170429916717 and \mbox{scale} = 1.0507009873554804934193349852946.
More details can be found in the paper Self-Normalizing Neural Networks (Klambauer et al., 2017).
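As a sanity check of the formula above, the following sketch (assuming torch is installed) recomputes SELU by hand with torch_clamp() and torch_exp() and compares the result against nn_selu():

library(torch)

alpha <- 1.6732632423543772848170429916717
scale <- 1.0507009873554804934193349852946

x <- torch_randn(5)

# scale * (max(0, x) + min(0, alpha * (exp(x) - 1)))
manual <- scale * (torch_clamp(x, min = 0) +
  torch_clamp(alpha * (torch_exp(x) - 1), max = 0))

torch_allclose(nn_selu()(x), manual)  # expected to return TRUE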
Shape
Input: (N, *) where * means any number of additional dimensions

Output: (N, *), same shape as the input
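A quick sketch, assuming torch is installed, showing that the output keeps the input's shape for an arbitrary number of extra dimensions:

library(torch)

m <- nn_selu()
x <- torch_randn(4, 3, 2)  # (N, *) with two additional dimensions
dim(m(x))                  # 4 3 2, same shape as the input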
Examples
if (torch_is_installed()) {
  m <- nn_selu()            # create the SELU module
  input <- torch_randn(2)   # random input tensor of length 2
  output <- m(input)        # apply SELU element-wise
}
[Package torch version 0.13.0]