activation_hard_silu {keras3}    R Documentation

Hard SiLU activation function, also known as Hard Swish.

Description

It is defined as:

hard_silu(x) = 0 if x < -3
hard_silu(x) = x if x > 3
hard_silu(x) = x * (x + 3) / 6 if -3 <= x <= 3

It is a faster, piecewise linear approximation of the silu activation.
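As an illustration of the piecewise definition above (a plain NumPy sketch, not the keras3 implementation, which operates on backend tensors):

```python
import numpy as np

def hard_silu(x):
    """Piecewise linear approximation of SiLU:
    0 for x < -3, x for x > 3, and x * (x + 3) / 6 in between."""
    x = np.asarray(x, dtype=float)
    return np.where(x < -3, 0.0, np.where(x > 3, x, x * (x + 3) / 6))

hard_silu([-4.0, -1.0, 0.0, 1.0, 4.0])
# returns 0, -1/3, 0, 2/3, 4
```

Note that the function is continuous: at x = -3 the middle branch gives (-3)(0)/6 = 0, and at x = 3 it gives (3)(6)/6 = 3, matching the outer branches.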

Usage

activation_hard_silu(x)

activation_hard_swish(x)

Arguments

x

Input tensor.

Value

A tensor, the result from applying the activation to the input tensor x.

[Package keras3 version 0.2.0 Index]