act_method {LearnSL} R Documentation

Activation Function

Description

Given an input value, calculates the output of the selected activation function.

Usage

act_method(method, x)

Arguments

method

Activation function to be used. It must be one of "step", "sine", "tangent", "linear", "relu", "gelu" or "swish".

x

Input value to be used in the activation function.

Details

Formulae used:

step

f(x) = \begin{cases} 0 & \text{if } x < \text{threshold} \\ 1 & \text{if } x \geq \text{threshold} \end{cases}

sine

f(x) = \sinh(x)

tangent

f(x) = \tanh(x)

linear

x

relu

f(x) = \begin{cases} x & \text{if } x > 0 \\ 0 & \text{if } x \leq 0 \end{cases}

gelu

f(x) = \frac{1}{2} \cdot x \cdot \left(1 + \tanh\left(\sqrt{\frac{2}{\pi}} \cdot (x + 0.044715 \cdot x^3)\right)\right)

swish

f(x) = \frac{x}{1 + e^{-x}}
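The formulae above can be sketched in plain R. This is an illustrative re-implementation only, not LearnSL's internal code; the helper name `act` and the fixed `threshold = 0` for the step function are assumptions made here for the example.

```r
# Illustrative sketch of the activation functions documented above.
# `act` and `threshold = 0` are assumptions, not part of LearnSL's API.
act <- function(method, x, threshold = 0) {
  switch(method,
    step    = ifelse(x >= threshold, 1, 0),
    sine    = sinh(x),          # "sine" uses the hyperbolic sine, as documented
    tangent = tanh(x),
    linear  = x,
    relu    = pmax(x, 0),
    gelu    = 0.5 * x * (1 + tanh(sqrt(2 / pi) * (x + 0.044715 * x^3))),
    swish   = x / (1 + exp(-x)),
    stop("unknown method")
  )
}

act("step", 0.3)  # 1, since 0.3 >= threshold
act("relu", -2)   # 0
```

Because each branch uses vectorized primitives (`ifelse`, `pmax`, `tanh`, ...), the sketch also accepts a numeric vector for `x`.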

Value

Output of the selected activation function applied to the input x.

Author(s)

VĂ­ctor Amador Padilla, victor.amador@edu.uah.es

Examples

act_method("step", 0.3)
act_method("gelu", 0.7)


[Package LearnSL version 1.0.0 Index]