cv.rmbst {bst} | R Documentation |
Cross-Validation for Nonconvex Multi-class Loss Boosting
Description
Cross-validated estimation of the empirical multi-class loss, which can be used for tuning parameter selection.
Usage
cv.rmbst(x, y, balance=FALSE, K = 10, cost = NULL, rfamily = c("thinge", "closs"),
learner = c("tree", "ls", "sm"), ctrl = bst_control(), type = c("loss","error"),
plot.it = TRUE, main = NULL, se = TRUE, n.cores=2, ...)
Arguments
x |
a data frame containing the variables in the model. |
y |
vector of responses. |
balance |
logical value. If TRUE, the K folds are roughly balanced, so that the classes are distributed proportionally among the folds. |
K |
number of folds for K-fold cross-validation. |
cost |
price to pay for a false positive, 0 < cost < 1; the price of a false negative is 1 - cost. |
rfamily |
nonconvex loss function to be minimized; the boosting algorithm implements the negative gradient corresponding to this loss. |
learner |
a character specifying the component-wise base learner to be used: "tree" (regression trees), "ls" (linear least squares), or "sm" (smoothing splines). |
ctrl |
an object of class bst_control. |
type |
cross-validation criterion: empirical loss value ("loss") or misclassification error ("error"). |
plot.it |
a logical value; if TRUE, plot the cross-validated loss or error estimate. |
main |
title of the plot. |
se |
a logical value; if TRUE, plot with standard errors. |
n.cores |
the number of CPU cores to use. The cross-validation loop attempts to dispatch different CV folds to different cores. |
... |
additional arguments. |
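A minimal usage sketch, assuming the bst package is available; the simulated three-class data and all variable names (x, y, cvres) are illustrative, not from the package documentation:

```r
## Hedged sketch: cross-validate nonconvex multi-class loss boosting
## on a small simulated 3-class problem.
library(bst)
set.seed(1)
x <- data.frame(matrix(rnorm(100 * 5), ncol = 5))  # 5 predictors
y <- sample(1:3, 100, replace = TRUE)              # 3-class response
## 5-fold CV of the "thinge" loss with a linear least squares learner
cvres <- cv.rmbst(x, y, K = 5, rfamily = "thinge", learner = "ls",
                  plot.it = FALSE, n.cores = 1)
```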
Value
An object with components:
residmat |
matrix of empirical risks for each cross-validation fold at the boosting iterations. |
fraction |
abscissa values at which CV curve should be computed. |
cv |
the CV curve at each value of fraction. |
cv.error |
the standard error of the CV curve. |
...
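The returned components can be used to select a tuning parameter, e.g. the boosting iteration minimizing the CV curve. A hedged sketch, assuming `cvres` is an object returned by cv.rmbst (the one-standard-error rule shown is a common convention, not something the package itself prescribes):

```r
## Hedged sketch: pick the abscissa value with the smallest CV loss.
## `cvres` is assumed to be the list returned by cv.rmbst().
best <- which.min(cvres$cv)
opt.mstop <- cvres$fraction[best]
## One-standard-error rule: smallest value whose CV loss is within
## one standard error of the minimum.
within1se <- cvres$cv <= cvres$cv[best] + cvres$cv.error[best]
opt.1se <- min(cvres$fraction[within1se])
```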
Author(s)
Zhu Wang