nn_rrelu {torch} | R Documentation |
RReLU module
Description
Applies the randomized leaky rectified linear unit function, element-wise, as described in the paper Empirical Evaluation of Rectified Activations in Convolutional Network.
Usage
nn_rrelu(lower = 1/8, upper = 1/3, inplace = FALSE)
Arguments
lower |
lower bound of the uniform distribution. Default: 1/8 |
upper |
upper bound of the uniform distribution. Default: 1/3 |
inplace |
can optionally do the operation in-place. Default: FALSE |
Details
The function is defined as:

RReLU(x) = x, if x >= 0; a * x, otherwise

where a is randomly sampled from the uniform distribution U(lower, upper).

See: Empirical Evaluation of Rectified Activations in Convolutional Network, https://arxiv.org/pdf/1505.00853.pdf
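The training-time behavior described above can be sketched in base R. This is an illustrative reference implementation, not the torch code: `rrelu_ref` is a hypothetical helper that draws an independent slope `a ~ U(lower, upper)` for each negative element and passes non-negative elements through unchanged.

```r
# Illustrative sketch of RReLU's training-time forward pass (not the
# torch implementation). Each negative element gets its own slope `a`
# drawn from U(lower, upper); non-negative elements are unchanged.
rrelu_ref <- function(x, lower = 1/8, upper = 1/3) {
  a <- runif(length(x), min = lower, max = upper)
  ifelse(x >= 0, x, a * x)
}

x <- c(-2, -0.5, 0, 1.5)
y <- rrelu_ref(x)
# y[x >= 0] equals x[x >= 0]; for x < 0, y/x lies in [lower, upper]
```

Note that this sketch only mirrors training behavior; in evaluation mode the module uses a fixed slope rather than resampling per call.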
Shape
Input: (*), where * means any number of additional dimensions
Output: (*), same shape as the input
Examples
if (torch_is_installed()) {
  # RReLU with negative slopes sampled from U(0.1, 0.3)
  m <- nn_rrelu(0.1, 0.3)
  input <- torch_randn(2)
  m(input)
}
[Package torch version 0.13.0 Index]