ts.lstm {TSLSTMplus}    R Documentation

Long Short Term Memory (LSTM) Model for Time Series Forecasting

Description

The LSTM (Long Short-Term Memory) model is a Recurrent Neural Network (RNN) based architecture that is widely used for time series forecasting. A min-max transformation can be applied for data preparation. Here, one LSTM layer is used as a simple LSTM model, and a Dense layer serves as the output layer. The model is then compiled with the chosen loss function, optimizer, and metrics. This package is based on the 'keras' and 'TensorFlow' modules.
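The architecture described above can be sketched directly with the 'keras' R interface. This is an illustrative sketch only, not the package's internal implementation; the unit count is an assumption, and a working TensorFlow backend is required.

```r
library(keras)

# Sketch of the architecture described above: one LSTM layer followed
# by a single-neuron Dense output layer. ts.lstm() builds a model of
# this form internally; LSTMUnits = 5 here is purely illustrative.
model <- keras_model_sequential() %>%
  layer_lstm(units = 5,
             activation = "tanh",
             recurrent_activation = "sigmoid") %>%
  layer_dense(units = 1)

# Compile with the same defaults as ts.lstm(): "mse" loss,
# "mae" metric, and the RMSprop optimizer.
model %>% compile(
  loss = "mse",
  optimizer = optimizer_rmsprop(),
  metrics = "mae"
)
```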

Usage

ts.lstm(
  ts,
  xreg = NULL,
  tsLag = NULL,
  xregLag = 0,
  LSTMUnits,
  DenseUnits = NULL,
  DropoutRate = 0,
  Epochs = 10,
  CompLoss = "mse",
  CompMetrics = "mae",
  Optimizer = optimizer_rmsprop,
  ScaleOutput = c(NULL, "scale", "minmax"),
  ScaleInput = c(NULL, "scale", "minmax"),
  BatchSize = 1,
  LSTMActivationFn = "tanh",
  LSTMRecurrentActivationFn = "sigmoid",
  DenseActivationFn = "relu",
  ValidationSplit = 0.1,
  verbose = 2,
  RandomState = NULL,
  EarlyStopping = callback_early_stopping(monitor = "val_loss", min_delta = 0, patience =
    3, verbose = 0, mode = "auto"),
  LagsAsSequences = TRUE,
  Stateful = FALSE,
  ...
)

Arguments

ts

Time series data

xreg

Exogenous variables

tsLag

Lag of time series data. If NULL, no lags of the output are used.

xregLag

Lag of exogenous variables

LSTMUnits

Number of units in LSTM layers

DenseUnits

Number of units in extra Dense layers. A Dense layer with a single neuron is always added at the end.

DropoutRate

Dropout rate

Epochs

Number of epochs

CompLoss

Loss function

CompMetrics

Metrics

Optimizer

'keras' optimizer

ScaleOutput

Flag to indicate whether ts shall be scaled before training (NULL, "scale", or "minmax")

ScaleInput

Flag to indicate whether xreg shall be scaled before training (NULL, "scale", or "minmax")

BatchSize

Batch size to use during training

LSTMActivationFn

Activation function for LSTM layers

LSTMRecurrentActivationFn

Recurrent activation function for LSTM layers

DenseActivationFn

Activation function for Extra Dense layers

ValidationSplit

Validation split ratio

verbose

Indicates how much information is printed during training. Accepted values: 0, 1, or 2.

RandomState

Seed for reproducibility

EarlyStopping

Early stopping callback, as defined by 'keras'

LagsAsSequences

If TRUE, lags are fed to the model as previous timesteps of the input sequence; otherwise they are treated as "extra" features.

Stateful

Flag to indicate whether LSTM layers shall retain their state between batches.

...

Extra arguments passed to keras::layer_lstm

Value

LSTMmodel object

References

Paul, R.K. and Garai, S. (2021). Performance comparison of wavelets-based machine learning technique for forecasting agricultural commodity prices, Soft Computing, 25(20), 12857-12873

Examples


  if (keras::is_keras_available()){
      y <- rnorm(100, mean = 100, sd = 50)
      x1 <- rnorm(100, mean = 50, sd = 50)
      x2 <- rnorm(100, mean = 50, sd = 25)
      x <- cbind(x1, x2)
      TSLSTM <- ts.lstm(ts = y,
                        xreg = x,
                        tsLag = 2,
                        xregLag = 0,
                        LSTMUnits = 5,
                        ScaleInput = 'scale',
                        ScaleOutput = 'scale',
                        Epochs = 2)
  }
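Forecasting with the fitted model might then look as follows. This sketch assumes the package provides a predict method for the returned LSTMmodel object and that it accepts new exogenous data via an xreg argument; consult the package index to confirm the exact interface.

```r
  if (keras::is_keras_available()){
      # Hypothetical continuation of the example above: new exogenous
      # observations with the same columns as the training xreg.
      newx <- cbind(x1 = rnorm(10, mean = 50, sd = 50),
                    x2 = rnorm(10, mean = 50, sd = 25))

      # Assumption: an S3 predict() method exists for LSTMmodel objects.
      preds <- predict(TSLSTM, xreg = newx)
  }
```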


[Package TSLSTMplus version 1.0.4 Index]