nn_rrelu {torch}
RReLU module
Description
Applies the randomized leaky rectified linear unit function, element-wise, as described in the paper Empirical Evaluation of Rectified Activations in Convolutional Network.
Usage
nn_rrelu(lower = 1/8, upper = 1/3, inplace = FALSE)
Arguments
lower
lower bound of the uniform distribution. Default: 1/8
upper
upper bound of the uniform distribution. Default: 1/3
inplace
can optionally do the operation in-place. Default: FALSE
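As a minimal sketch of how these arguments are passed (assuming the torch package is attached; the variable names are illustrative, not part of the API):

library(torch)

# Default bounds: slopes drawn from U(1/8, 1/3)
act_default <- nn_rrelu()

# Custom bounds; inplace = TRUE modifies the input tensor directly
act_custom <- nn_rrelu(lower = 0.05, upper = 0.25, inplace = TRUE)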
Details
The function is defined as:
\mbox{RReLU}(x) = \left\{ \begin{array}{ll} x & \mbox{if } x \geq 0 \\ ax & \mbox{otherwise} \end{array} \right.

where a is randomly sampled from the uniform distribution \mathcal{U}(\mbox{lower}, \mbox{upper}).
See: https://arxiv.org/pdf/1505.00853.pdf
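For intuition, a plain-R sketch of the formula above (rrelu_ref and its inputs are illustrative and not part of the torch API; torch's own implementation operates on tensors):

rrelu_ref <- function(x, lower = 1/8, upper = 1/3) {
  # one slope a ~ U(lower, upper) per element; only applied where x < 0
  a <- runif(length(x), min = lower, max = upper)
  ifelse(x >= 0, x, a * x)
}

rrelu_ref(c(-2, -0.5, 0, 1.5))  # negative entries are scaled by random slopes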
Shape
Input: (N, *), where * means any number of additional dimensions.
Output: (N, *), same shape as the input.
Examples
if (torch_is_installed()) {
  m <- nn_rrelu(0.1, 0.3)  # slopes sampled from U(0.1, 0.3)
  input <- torch_randn(2)  # two standard-normal values
  m(input)                 # negative entries are scaled by a random slope
}
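As in the paper (and in PyTorch's RReLU), the random slope is typically drawn only in training mode; in evaluation mode a fixed slope of (lower + upper) / 2 is commonly used instead. A sketch under that assumption:

if (torch_is_installed()) {
  m <- nn_rrelu(0.1, 0.3)
  input <- torch_randn(2)
  m$eval()  # evaluation mode: slope assumed fixed at (0.1 + 0.3) / 2
  m(input)
}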