layer_activation_gelu {tfaddons} | R Documentation
Gaussian Error Linear Unit
Description
Gaussian Error Linear Unit
Usage
layer_activation_gelu(object, approximate = TRUE, ...)
Arguments
object
Model or layer object.
approximate
(bool) Whether to use the tanh approximation of GELU; defaults to TRUE.
...
Additional parameters to pass on to the layer.
Details
A smoother version of ReLU, generally used in BERT and other BERT-style architectures. Original paper: https://arxiv.org/abs/1606.08415
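The exact form is GELU(x) = x * Φ(x), where Φ is the cumulative distribution function of the standard normal distribution; with approximate = TRUE the layer instead uses the faster tanh approximation 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3))). A minimal base-R sketch of the two variants, for illustration only (the layer itself operates on tensors):

gelu_exact <- function(x) x * pnorm(x)   # pnorm() is the standard normal CDF

gelu_approx <- function(x) {             # form used when approximate = TRUE
  0.5 * x * (1 + tanh(sqrt(2 / pi) * (x + 0.044715 * x^3)))
}

x <- seq(-3, 3, by = 0.5)
max(abs(gelu_exact(x) - gelu_approx(x))) # the two stay close over this range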
Value
A tensor
Note
Input shape: Arbitrary. Use the keyword argument 'input_shape' (tuple of integers, does not include the samples axis) when using this layer as the first layer in a model.
Output shape: Same shape as the input.
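Examples
A minimal sketch, assuming the keras and tfaddons packages are installed with a working TensorFlow backend:

library(keras)
library(tfaddons)

# GELU as a standalone activation layer inside a small sequential model
model <- keras_model_sequential() %>%
  layer_dense(units = 32, input_shape = c(16)) %>%
  layer_activation_gelu() %>%   # approximate = TRUE by default
  layer_dense(units = 1)

Pass approximate = FALSE to use the exact GELU rather than the tanh approximation.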
[Package tfaddons version 0.10.0 Index]