activation_relu {keras}                                        R Documentation

Activation functions

Description

activation_relu(...): Applies the rectified linear unit (ReLU) activation function.

activation_elu(...): Exponential Linear Unit (ELU).

activation_selu(...): Scaled Exponential Linear Unit (SELU).

activation_hard_sigmoid(...): Hard sigmoid activation function.

activation_linear(...): Linear activation function (pass-through).

activation_sigmoid(...): Sigmoid activation function, sigmoid(x) = 1 / (1 + exp(-x)).

activation_softmax(...): Softmax converts a vector of values to a probability distribution.

activation_softplus(...): Softplus activation function, softplus(x) = log(exp(x) + 1).

activation_softsign(...): Softsign activation function, softsign(x) = x / (abs(x) + 1).

activation_tanh(...): Hyperbolic tangent activation function.

activation_exponential(...): Exponential activation function.

activation_gelu(...): Applies the Gaussian error linear unit (GELU) activation function.

activation_swish(...): Swish activation function, swish(x) = x * sigmoid(x).
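
A minimal sketch of calling a few of these activations directly on a tensor (assumes the keras package with a working TensorFlow backend; printed values are approximate):

library(keras)

x <- k_constant(c(-2, 0, 2))
activation_relu(x)      # 0, 0, 2
activation_sigmoid(x)   # ~0.12, 0.50, ~0.88
activation_swish(x)     # ~-0.24, 0, ~1.76   (x * sigmoid(x))

# softmax normalizes along the last axis by default, so each row sums to 1
activation_softmax(k_constant(matrix(c(1, 2, 3), nrow = 1)))  # ~0.09, 0.24, 0.67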

Usage

activation_relu(x, alpha = 0, max_value = NULL, threshold = 0)

activation_elu(x, alpha = 1)

activation_selu(x)

activation_hard_sigmoid(x)

activation_linear(x)

activation_sigmoid(x)

activation_softmax(x, axis = -1)

activation_softplus(x)

activation_softsign(x)

activation_tanh(x)

activation_exponential(x)

activation_gelu(x, approximate = FALSE)

activation_swish(x)

Arguments

x

Input tensor.

alpha

Numeric. For activation_relu(), the slope for values below the threshold; for activation_elu(), the scale for the negative section (see the example after this list).

max_value

Maximum output value (saturation threshold); NULL means no upper bound.

threshold

Threshold value for thresholded activation.

axis

Integer, the axis along which the softmax normalization is applied.

approximate

Logical; whether to use the approximate (tanh-based) form of GELU.
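
As a hypothetical illustration of how alpha, max_value, and threshold interact in activation_relu() (values chosen arbitrarily; assumes a working TensorFlow backend):

library(keras)

x <- k_constant(c(-10, -1, 0, 5, 10))
activation_relu(x)                   # 0, 0, 0, 5, 10      (standard ReLU)
activation_relu(x, alpha = 0.2)      # -2, -0.2, 0, 5, 10  ("leaky" slope below the threshold)
activation_relu(x, max_value = 6)    # 0, 0, 0, 5, 6       (outputs capped at max_value)
activation_relu(x, threshold = 1)    # 0, 0, 0, 5, 10      (values below 1 are zeroed)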

Details

Activation functions can either be used through layer_activation(), or through the activation argument supported by all forward layers.
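
A minimal sketch of both approaches (layer sizes and the input shape are arbitrary placeholders):

library(keras)

# 1. Pass the activation by name (or as a function) to a forward layer:
model <- keras_model_sequential() %>%
  layer_dense(units = 32, input_shape = c(784), activation = "relu") %>%
  layer_dense(units = 10, activation = "softmax")

# 2. Insert an explicit activation layer after a linear layer:
model_alt <- keras_model_sequential() %>%
  layer_dense(units = 32, input_shape = c(784)) %>%
  layer_activation("relu") %>%
  layer_dense(units = 10) %>%
  layer_activation("softmax")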

Value

Tensor with the same shape and dtype as x.

See Also

https://www.tensorflow.org/api_docs/python/tf/keras/activations


[Package keras version 2.15.0]