eztune {EZtune}    R Documentation
Supervised Learning Function
Description
eztune is a function that automatically tunes adaboost, support
vector machines, gradient boosting machines, and elastic net models. An
optimization algorithm is used to find a good set of tuning parameters
for the selected model. The function optimizes on a validation dataset,
cross-validated loss, or resubstitution loss.
Usage
eztune(
  x,
  y,
  method = "svm",
  optimizer = "hjn",
  fast = TRUE,
  cross = NULL,
  loss = "default"
)
Arguments
x: Matrix or data frame containing the independent (predictor) variables.

y: Vector of responses. Can either be a factor or a numeric vector.

method: Model to be fit. Choices are "ada" for adaboost, "en" for elastic net, "gbm" for gradient boosting machines, and "svm" for support vector machines.

optimizer: Optimization method. Options are "hjn" to use a Hooke-Jeeves optimizer and "ga" to use a genetic algorithm.

fast: Indicates if the function should use a subset of the observations when optimizing to speed up calculation time. A value of TRUE uses the smaller of 50% of the data or 200 observations to fit the model; a number between 0 and 1 specifies the proportion of observations to use for fitting; and a positive integer specifies the number of observations to use for fitting. The model is assessed on the observations not used for fitting.

cross: If an integer k > 1 is specified, k-fold cross-validation is used to fit the model. This method is very slow for large datasets. This parameter is ignored unless fast = FALSE.

loss: The type of loss function used for optimization. Options for models with a binary response are "class" for classification accuracy and "auc" for area under the ROC curve. Options for models with a continuous response are "mse" for mean squared error and "mae" for mean absolute error. If "default" is selected, classification accuracy is used for a binary response and MSE for a continuous response.
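The fast, cross, and loss arguments interact: cross is honored only when fast = FALSE, and fast = FALSE with cross = NULL is taken here to give the resubstitution mode mentioned in the Description. A minimal sketch of the three optimization modes (an editorial illustration, not from the original page):

library(EZtune)
library(mlbench)
data(Sonar)
x <- Sonar[, 1:10]
y <- Sonar[, 61]
eztune(x, y, fast = TRUE)               # validation-set mode (default)
eztune(x, y, fast = FALSE, cross = 10)  # 10-fold cross-validation mode
eztune(x, y, fast = FALSE)              # resubstitution mode (assumed mapping)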
Value
Function returns an object of class "eztune" which contains
a summary of the tuning parameters for the best model, the best loss
measure achieved (classification accuracy, AUC, MSE, or MAE), and the best
model.
loss: Best loss measure obtained by the optimizer. This is the measure, specified by the user, that the optimizer uses to choose a "best" model (classification accuracy, AUC, MSE, or MAE). If the default option is used, it is the classification accuracy for a binary response and the MSE for a continuous response.

model: Best model found by the optimizer. The adaboost model comes from package ada, the elastic net model from package glmnet, the gbm model from package gbm, and the svm model from package e1071.

n: Number of observations used in model training when the fast option is used.

nfold: Number of folds used if cross-validation is used for optimization.

iter: Tuning parameter for adaboost.
nu: Tuning parameter for adaboost.
shrinkage: Tuning parameter for adaboost and gbm.
lambda: Tuning parameter for elastic net.
alpha: Tuning parameter for elastic net.
n.trees: Tuning parameter for gbm.
interaction.depth: Tuning parameter for gbm.
n.minobsinnode: Tuning parameter for gbm.
cost: Tuning parameter for svm.
gamma: Tuning parameter for svm.
epsilon: Tuning parameter for svm regression.
levels: If the model has a binary response, the levels of y are listed.
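These components are returned as named elements of the "eztune" object and can be read with the usual $ accessor. A short sketch (an editorial addition, assuming the stored svm model can be passed to e1071's predict method, as svm objects normally are):

library(EZtune)
library(mlbench)
data(Sonar)
fit <- eztune(Sonar[, 1:10], Sonar[, 61])
fit$loss    # best loss measure (classification accuracy here)
fit$cost    # tuned SVM cost
fit$gamma   # tuned SVM gamma
# fit$model is an e1071 svm object; its predict method returns fitted classes
table(predict(fit$model, Sonar[, 1:10]), Sonar[, 61])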
Examples
library(EZtune)
library(mlbench)
data(Sonar)
sonar <- Sonar[sample(1:nrow(Sonar), 100), ]
y <- sonar[, 61]
x <- sonar[, 1:10]
# Optimize an SVM using the default fast setting and Hooke-Jeeves
eztune(x, y)
# Optimize an SVM with 3-fold cross validation and Hooke-Jeeves
eztune(x, y, fast = FALSE, cross = 3)
# Optimize GBM using training set of 50 observations and Hooke-Jeeves
eztune(x, y, method = "gbm", fast = 50, loss = "auc")
# Optimize SVM with 25% of the observations as a training dataset
# using a genetic algorithm
eztune(x, y, method = "svm", optimizer = "ga", fast = 0.25)
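The continuous-response loss options are not exercised above. The following sketch (an editorial addition) reuses the sonar predictors and treats a numeric column as the response so that regression models are tuned:

# Optimize an elastic net on a continuous response, minimizing MSE
y2 <- sonar[, 11]
x2 <- sonar[, 1:10]
eztune(x2, y2, method = "en", loss = "mse")
# Optimize SVM regression, minimizing mean absolute error
eztune(x2, y2, loss = "mae")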