hyperTuning {dnn}		R Documentation

A function for tuning the hyper parameters

Description

hyperTuning is a tuning tool to find the optimal hyper parameters for the ANN model.

Usage

   hyperTuning(x, y, model, ER = c("cindex", "mse"), 
          method = c('BuckleyJames', 'ipcw', 'transform', 'deepSurv'), 
          lower = NULL, upper = NULL, node = FALSE,
          K = 5, R = 25)
### an additional function used in hyperTuning is the cross-validation prediction error
#
#  CVpredErr(x, y, model, control, method)
#

Arguments

x

Covariates for the deep neural network model

y

Surv object for the deep neural network model

model

A deep neural network model, created by function dNNmodel().

ER

Prediction error measurement to be used in the cross-validation; can be either the concordance index ('cindex') or the mean squared error ('mse'). Default is 'cindex'.

method

Method used to handle censored data in the deep AFT model fit: 'BuckleyJames' for the Buckley and James method, 'ipcw' for the inverse probability of censoring weights method, 'transform' for the transformation method based on the book by Fan and Gijbels (1996, page 168), and 'deepSurv' for the deepSurv model (Katzman, 2017).

node

Whether to tune the number of nodes in each hidden layer; default is FALSE.

K

Number of folds for the cross-validation; default is K = 5.

lower, upper

Bounds on the hyper parameters for the deep learning method. If NULL, the defaults are lower = dnnControl(alpha = 0.5, lambda = 1.0, lr_rate = 0.0001) and upper = dnnControl(alpha = 0.97, lambda = 10, lr_rate = 0.001).

R

Number of random samples drawn from the hyper parameter space; default is R = 25.
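
For example, the search can be restricted to a narrower region of the hyper parameter space by supplying custom bounds through dnnControl (a sketch; the bound values here are illustrative, and the argument names follow the defaults shown above):

    ### restrict the random search to a narrower region
    lo <- dnnControl(alpha = 0.6, lambda = 2.0, lr_rate = 0.0002)
    up <- dnnControl(alpha = 0.9, lambda = 8.0, lr_rate = 0.0008)
    ### then call, e.g., hyperTuning(x, y, model, lower = lo, upper = up)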

Details

A random search method is used to find the optimal hyper parameters (Bergstra and Bengio, 2012). The function CVpredErr is called to compute the cross-validation prediction error for the given x and y with the method specified in the input arguments.
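
The random-search idea can be sketched as follows: draw R points uniformly between the lower and upper bounds, evaluate the prediction error at each, and keep the best. The toy error function below stands in for the internal CVpredErr (a minimal illustration only, not the package's implementation):

    ### minimal sketch of random search over the hyper parameter space
    set.seed(101)
    lower <- c(alpha = 0.5,  lambda = 1.0, lr_rate = 0.0001)
    upper <- c(alpha = 0.97, lambda = 10,  lr_rate = 0.001)
    toyErr <- function(p) (p["alpha"] - 0.7)^2 + (p["lambda"] - 5)^2
    best <- NULL; bestErr <- Inf
    for (r in 1:25) {
      p <- lower + runif(length(lower)) * (upper - lower)   # one random draw
      err <- toyErr(p)                                      # CV error in real use
      if (err < bestErr) { bestErr <- err; best <- p }      # keep the best draw
    }
    best   # hyper parameters with the smallest (toy) prediction error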

Value

A list of "model" and "control" is returned. The list contains at least the following components:

model

The "model" contains the optimal number of nodes for each hidden layer in the model specified by dNNmodel

control

The "control" contains the optimal tuning parameters with list components the same as those created by dnnControl

Author(s)

Chen, B. E. (chenbe@queensu.ca)

References

Bergstra, J. and Bengio, Y. (2012). Random search for hyper-parameter optimization. The Journal of Machine Learning Research, 13, 281-305.

See Also

deepAFT, deepGLM, deepSurv, dnnFit

Examples

### Tuning the hyper parameters for a deepAFT model: 
### the cross-validation may take a long time to run.

  set.seed(101)
### define the model layers
  model = dNNmodel(units = c(4, 3, 1), activation = c("elu", "sigmoid", "sigmoid"), 
                   input_shape = 3)
### simulated data: 15 observations, 3 covariates
  x = matrix(runif(45), nrow = 15, ncol = 3)
  time = exp(x[, 1])
  status = rbinom(15, 1, 0.5)
  y = Surv(time, status)
### use a small number of epochs, folds (K) and draws (R) to keep the run short
  ctl = dnnControl(epochs = 30)
  hyperTuning(x, y, model, method = "BuckleyJames", K = 2, R = 2, lower = ctl)


[Package dnn version 0.0.6 Index]