cv.rmbst {bst}    R Documentation

Cross-Validation for Nonconvex Multi-class Loss Boosting

Description

Cross-validated estimation of the empirical multi-class loss, which can be used for tuning parameter selection.

Usage

cv.rmbst(x, y, balance = FALSE, K = 10, cost = NULL, rfamily = c("thinge", "closs"),
learner = c("tree", "ls", "sm"), ctrl = bst_control(), type = c("loss", "error"),
plot.it = TRUE, main = NULL, se = TRUE, n.cores = 2, ...)

Arguments

x

a data frame containing the variables in the model.

y

vector of responses. y must take integer values from 1 to C for a C-class problem.

balance

logical value. If TRUE, the K folds are roughly balanced so that the classes are distributed proportionally among them.

K

number of folds for K-fold cross-validation.

cost

price to pay for a false positive, 0 < cost < 1; the price of a false negative is 1 - cost.

rfamily

rfamily = "thinge" for truncated multi-class hinge loss.

Implementing the negative gradient corresponding to the loss function to be minimized.

learner

a character specifying the component-wise base learner to be used: "ls" for linear models, "sm" for smoothing splines, "tree" for regression trees.

ctrl

an object of class bst_control.

type

criterion for cross-validation: "loss" for the empirical loss value or "error" for the misclassification error.

plot.it

a logical value; if TRUE, plot the cross-validated loss or error estimates.

main

title of the plot.

se

a logical value; if TRUE, plot the curve with standard error bars.

n.cores

The number of CPU cores to use. The cross-validation loop will attempt to send different CV folds off to different cores.

...

additional arguments.

Value

An object with the following components:

residmat

matrix of empirical risks from each cross-validation fold at the boosting iterations.

fraction

abscissa values at which the CV curve is computed.

cv

The CV curve at each value of fraction.

cv.error

The standard error of the CV curve.

...

Author(s)

Zhu Wang

See Also

rmbst
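
Examples

A minimal illustrative sketch, not taken from the package documentation: it simulates three-class data and cross-validates the truncated multi-class hinge loss with a component-wise linear base learner. The seed, class cutoffs, and mstop value are illustrative assumptions.

library(bst)
set.seed(123)  # illustrative seed
## simulate 100 observations on 5 predictors; the class depends on the first
x <- as.data.frame(matrix(rnorm(100 * 5), ncol = 5))
cutoffs <- quantile(x[, 1], probs = c(0.33, 0.67))
y <- rep(1L, 100)
y[x[, 1] > cutoffs[1]] <- 2L
y[x[, 1] > cutoffs[2]] <- 3L
## 10-fold CV of the truncated multi-class hinge loss ("thinge");
## mstop = 50 boosting iterations is an arbitrary choice
cv.res <- cv.rmbst(x, y, K = 10, rfamily = "thinge", learner = "ls",
                   ctrl = bst_control(mstop = 50), type = "loss",
                   plot.it = TRUE, se = TRUE, n.cores = 2)
## boosting iteration that minimizes the cross-validated loss
cv.res$fraction[which.min(cv.res$cv)]

The iteration selected this way can then be passed to rmbst via bst_control(mstop = ...) for the final fit.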


[Package bst version 0.3-24 Index]