train_iimi {iimi}    R Documentation

train_iimi()

Description

Trains an XGBoost (default), Random Forest, or Elastic Net model using user-provided data.

Usage

train_iimi(
  train_x,
  train_y,
  method = "xgb",
  nrounds = 100,
  max_depth = 10,
  gamma = 6,
  ntree = 100,
  k = 5,
  ...
)

Arguments

train_x

A data frame or a matrix of predictors.

train_y

A response vector of labels (needs to be a factor).

method

The machine learning method of choice: XGBoost, Random Forest, or Elastic Net. Default is the XGBoost model ("xgb").

nrounds

Maximum number of boosting iterations for the XGBoost model. Default is 100.

max_depth

Maximum depth of a tree in the XGBoost model. Default is 10.

gamma

Minimum loss reduction required to make a further split in the XGBoost model. Default is 6.

ntree

Number of trees in the Random Forest model. Default is 100.

k

Number of folds for cross-validation. Default is 5.

...

Other arguments that can be passed to randomForest, xgboost, or glmnet.
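
As an illustration only (not from the package documentation), and assuming that arguments supplied through ... are forwarded unchanged to the underlying learner, an XGBoost-specific parameter such as eta might be passed as in the sketch below (train_x and train_y as built in the Examples section):

## Hypothetical sketch: forward an extra XGBoost parameter (eta) through "..."
## Which extra arguments are accepted depends on the underlying learner.
model_xgb <- train_iimi(
  train_x = train_x,
  train_y = train_y,
  method = "xgb",
  nrounds = 200,
  eta = 0.3
)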

Value

A Random Forest, XGBoost, or Elastic Net model.

Examples

## Not run: 
## Convert the example run-length-encoded coverage data into a feature data frame
df <- convert_rle_to_df(example_cov)

## Use all columns except the first four (identifiers such as seg_id and
## sample_id) as predictors
train_x <- df[, -c(1:4)]

## Build the response vector by looking up each segment/sample pair
## in the example diagnostics matrix
train_y <- c()
for (ii in seq_len(nrow(df))) {
  seg_id <- df$seg_id[ii]
  sample_id <- df$sample_id[ii]
  train_y <- c(train_y, example_diag[seg_id, sample_id])
}

## Train the default XGBoost model
trained_model <- train_iimi(train_x = train_x, train_y = train_y)

## End(Not run)
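
The non-default model types can be trained from the same inputs. A minimal sketch follows; the method string "rf" for the Random Forest model is an assumption here (only "xgb" appears in Usage above), so check the accepted values before use.

## Hypothetical sketch: train a Random Forest model instead of the default
## XGBoost model; the method value "rf" is assumed, not confirmed above
rf_model <- train_iimi(
  train_x = train_x,
  train_y = train_y,
  method = "rf",
  ntree = 200
)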





[Package iimi version 1.0.2 Index]