layer_gru_cell {keras} | R Documentation
Cell class for the GRU layer
Description
Cell class for the GRU layer
Usage
layer_gru_cell(
  units,
  activation = "tanh",
  recurrent_activation = "sigmoid",
  use_bias = TRUE,
  kernel_initializer = "glorot_uniform",
  recurrent_initializer = "orthogonal",
  bias_initializer = "zeros",
  kernel_regularizer = NULL,
  recurrent_regularizer = NULL,
  bias_regularizer = NULL,
  kernel_constraint = NULL,
  recurrent_constraint = NULL,
  bias_constraint = NULL,
  dropout = 0,
  recurrent_dropout = 0,
  reset_after = TRUE,
  ...
)
Arguments
units: Positive integer, dimensionality of the output space.

activation: Activation function to use. Default: hyperbolic tangent (tanh). If you pass NULL, no activation is applied (i.e. "linear" activation: a(x) = x).

recurrent_activation: Activation function to use for the recurrent step. Default: sigmoid. If you pass NULL, no activation is applied.

use_bias: Boolean (default TRUE), whether the layer uses a bias vector.

kernel_initializer: Initializer for the kernel weights matrix, used for the linear transformation of the inputs. Default: "glorot_uniform".

recurrent_initializer: Initializer for the recurrent_kernel weights matrix, used for the linear transformation of the recurrent state. Default: "orthogonal".

bias_initializer: Initializer for the bias vector. Default: "zeros".

kernel_regularizer: Regularizer function applied to the kernel weights matrix. Default: NULL.

recurrent_regularizer: Regularizer function applied to the recurrent_kernel weights matrix. Default: NULL.

bias_regularizer: Regularizer function applied to the bias vector. Default: NULL.

kernel_constraint: Constraint function applied to the kernel weights matrix. Default: NULL.

recurrent_constraint: Constraint function applied to the recurrent_kernel weights matrix. Default: NULL.

bias_constraint: Constraint function applied to the bias vector. Default: NULL.

dropout: Float between 0 and 1. Fraction of the units to drop for the linear transformation of the inputs. Default: 0.

recurrent_dropout: Float between 0 and 1. Fraction of the units to drop for the linear transformation of the recurrent state. Default: 0.

reset_after: GRU convention (whether to apply the reset gate after or before the matrix multiplication). FALSE = "before", TRUE = "after" (default and CuDNN compatible).

...: standard layer arguments.
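To make these arguments concrete, here is a minimal sketch of constructing a cell with several non-default settings; the specific values (unit count, regularization strength, dropout rates) are illustrative assumptions, not recommendations:

library(keras)

# Illustrative GRU cell configuration; all values below are arbitrary
# examples chosen to exercise the arguments documented above.
cell <- layer_gru_cell(
  units = 64,                                 # output dimensionality
  activation = "tanh",                        # default
  recurrent_activation = "sigmoid",           # default
  kernel_regularizer = regularizer_l2(1e-4),  # L2 penalty on the input kernel
  dropout = 0.2,                              # dropout on the input transformation
  recurrent_dropout = 0.1,                    # dropout on the recurrent transformation
  reset_after = TRUE                          # CuDNN-compatible convention (default)
)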
Details
See the Keras RNN API guide for details about the usage of the RNN API.
This class processes one step within the whole time sequence input, whereas
layer_gru() (tf.keras.layers.GRU) processes the whole sequence.
For example:
library(keras)

inputs <- k_random_uniform(c(32, 10, 8))
output <- inputs %>% layer_rnn(layer_gru_cell(4))
output$shape  # TensorShape([32, 4])

rnn <- layer_rnn(
  cell = layer_gru_cell(4),
  return_sequences = TRUE,
  return_state = TRUE
)
c(whole_sequence_output, final_state) %<-% rnn(inputs)
whole_sequence_output$shape  # TensorShape([32, 10, 4])
final_state$shape            # TensorShape([32, 4])
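Because layer_gru_cell() only defines a single step, it is always used through a wrapper such as layer_rnn(), which can then appear in a model like any other layer. A minimal sketch, assuming the keras R package with a TensorFlow backend and that input_shape is passed through the standard layer arguments (the 10 timesteps x 8 features shape matches the example above):

library(keras)

# Sketch: a small sequence classifier built around a GRU cell wrapped
# in layer_rnn().
model <- keras_model_sequential() %>%
  layer_rnn(cell = layer_gru_cell(4), input_shape = c(10, 8)) %>%
  layer_dense(units = 1, activation = "sigmoid")

model %>% compile(optimizer = "adam", loss = "binary_crossentropy")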
See Also
Other RNN cell layers: layer_lstm_cell(), layer_simple_rnn_cell(), layer_stacked_rnn_cells()
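As a hedged sketch of how the cross-referenced layer_stacked_rnn_cells() combines with layer_gru_cell(), assuming it accepts a list of cell instances (layer sizes here are illustrative):

library(keras)

# Two GRU cells stacked into a single recurrent layer; the output
# dimensionality follows the last cell in the stack.
cells <- layer_stacked_rnn_cells(list(
  layer_gru_cell(32),
  layer_gru_cell(16)
))
outputs <- k_random_uniform(c(32, 10, 8)) %>% layer_rnn(cells)
outputs$shape  # TensorShape([32, 16])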