activation_rrelu {tfaddons} | R Documentation
Rrelu
Description
Randomized leaky rectified linear unit (rrelu) activation function.
Usage
activation_rrelu(
x,
lower = 0.125,
upper = 0.333333333333333,
training = NULL,
seed = NULL
)
Arguments
x: A 'Tensor'. Must be one of the following types: 'float16', 'float32', 'float64'.
lower: 'float', lower bound for random alpha.
upper: 'float', upper bound for random alpha.
training: 'bool', indicating whether the call is meant for training or inference.
seed: 'int', this sets the operation-level seed.
Details
Computes the rrelu function: 'x if x > 0 else random(lower, upper) * x' during training, where alpha is drawn uniformly from '[lower, upper]', or 'x if x > 0 else x * (lower + upper) / 2' during inference. See [Empirical Evaluation of Rectified Activations in Convolutional Network](https://arxiv.org/abs/1505.00853).
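As a plain-R illustration of the two branches (a sketch for exposition only; the name 'rrelu_ref' is hypothetical, and the real op runs on TensorFlow tensors):

rrelu_ref <- function(x, lower = 0.125, upper = 1/3, training = FALSE) {
  if (training) {
    # Training: one alpha per element, drawn uniformly from [lower, upper].
    alpha <- runif(length(x), lower, upper)
  } else {
    # Inference: the fixed mean slope (lower + upper) / 2.
    alpha <- (lower + upper) / 2
  }
  ifelse(x > 0, x, alpha * x)
}

rrelu_ref(c(-2, 1.5))  # inference: -2 * 0.2291667 = -0.4583333; positives pass through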
Value
A 'Tensor'. Has the same type as 'x'.
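Examples

A minimal usage sketch (assumption: the 'tensorflow' and 'tfaddons' packages are installed with a working TensorFlow backend; this example is not from the original page):

library(tensorflow)
library(tfaddons)

x <- tf$constant(c(-2, -0.5, 0, 1.5), dtype = tf$float32)

# Inference: negative inputs are scaled by (lower + upper) / 2.
activation_rrelu(x, training = FALSE)

# Training: each negative input is scaled by a random alpha in [lower, upper];
# 'seed' makes the draw reproducible.
activation_rrelu(x, training = TRUE, seed = 42L)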
[Package tfaddons version 0.10.0 Index]