layer_activation_relu {keras}    R Documentation
Rectified Linear Unit activation function
Description
Rectified Linear Unit activation function. With default values, it returns element-wise max(x, 0). Otherwise it follows: f(x) = max_value for x >= max_value, f(x) = x for threshold <= x < max_value, and f(x) = negative_slope * (x - threshold) otherwise.
Usage
layer_activation_relu(
object,
max_value = NULL,
negative_slope = 0,
threshold = 0,
input_shape = NULL,
batch_input_shape = NULL,
batch_size = NULL,
dtype = NULL,
name = NULL,
trainable = NULL,
weights = NULL
)
Arguments
object
What to compose the new Layer instance with. Typically a Sequential model or a Tensor (e.g., as returned by layer_input()). The return value depends on object: if missing or NULL, the Layer instance is returned; if a Sequential model, the model with the layer added is returned; if a Tensor, the output tensor from calling the layer on it is returned.

max_value
float, the maximum output value.

negative_slope
float >= 0. Negative slope coefficient.

threshold
float. Threshold value for thresholded activation.

input_shape
Input shape (list of integers, does not include the samples axis), required when using this layer as the first layer in a model.

batch_input_shape
Shapes, including the batch size. For instance, batch_input_shape = list(10, 32) indicates that the expected input will be batches of 10 32-dimensional vectors.

batch_size
Fixed batch size for the layer.

dtype
The data type expected by the input, as a string (e.g. "float32", "float64", "int32").

name
An optional name string for the layer. Should be unique in a model (do not reuse the same name twice). It will be autogenerated if not provided.

trainable
Whether the layer weights will be updated during training.

weights
Initial weights for the layer.
See Also
Other activation layers: layer_activation(), layer_activation_elu(), layer_activation_leaky_relu(), layer_activation_parametric_relu(), layer_activation_selu(), layer_activation_softmax(), layer_activation_thresholded_relu()
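Examples

A minimal sketch of composing the layer into a model, assuming the keras R package is installed with a working backend; the layer sizes and argument values here are illustrative, not prescriptive.

library(keras)

# Capped leaky ReLU: outputs are clipped at 6 and negative
# inputs are scaled by 0.1 instead of being zeroed.
model <- keras_model_sequential() %>%
  layer_dense(units = 16, input_shape = c(8)) %>%
  layer_activation_relu(max_value = 6, negative_slope = 0.1)

# Called without an object to compose with, the function
# returns the Layer instance itself (here, a "ReLU6").
relu6 <- layer_activation_relu(max_value = 6)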