cv.mhingeova {bst}    R Documentation
Cross-Validation for One-vs-All HingeBoost in Multi-Class Problems
Description
Cross-validated estimation of the empirical misclassification error for boosting parameter selection.
Usage
cv.mhingeova(x, y, balance=FALSE, K=10, cost = NULL, nu=0.1,
learner=c("tree", "ls", "sm"), maxdepth=1, m1=200, twinboost = FALSE,
m2=200, trace=FALSE, plot.it = TRUE, se = TRUE, ...)
Arguments
x: a data frame containing the variables in the model.
y: vector of multi-class responses.
balance: logical. If TRUE, the K folds are roughly balanced, so that the classes are distributed proportionally among the K parts (see the sketch after this argument list).
K: number of folds for K-fold cross-validation.
cost: price to pay for a false positive, 0 < cost < 1.
nu: a small number (between 0 and 1) defining the step size or shrinkage parameter.
learner: a character specifying the component-wise base learner to be used: "tree" (regression trees), "ls" (linear models), or "sm" (smoothing splines).
maxdepth: tree depth used when learner = "tree".
m1: number of boosting iterations.
twinboost: logical; if TRUE, twin boosting is used.
m2: number of twin boosting iterations.
trace: if TRUE, iteration results are printed out.
plot.it: a logical value; if TRUE, the estimated risks are plotted.
se: a logical value; if TRUE, standard errors are added to the plot.
...: additional arguments.
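For intuition, here is a minimal sketch of how balanced (class-proportional) folds can be built; balanced_folds is a hypothetical helper written for illustration, not the implementation used by the package (see Note).

# Hypothetical helper: assign fold ids within each class so that every
# class is spread roughly evenly across the K folds.
balanced_folds <- function(y, K) {
  folds <- integer(length(y))
  for (cl in unique(y)) {
    idx <- which(y == cl)
    folds[idx] <- sample(rep(seq_len(K), length.out = length(idx)))
  }
  folds
}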
Value
An object with the following components:
residmat: empirical risks from each cross-validation fold at each boosting iteration.
fraction: abscissa values at which the CV curve is computed.
cv: the CV curve at each value of fraction.
cv.error: the standard error of the CV curve.
...
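A hedged sketch of using these components to locate the boosting iteration with the smallest cross-validated risk; it assumes the components are returned in a list, and the object name cvm is illustrative.

opt <- which.min(cvm$cv)                     # index of the smallest CV risk
cvm$fraction[opt]                            # abscissa value selected
cvm$cv[opt] + c(-1, 1) * cvm$cv.error[opt]   # one-standard-error band at the minimum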
Note
The functions for balanced cross-validation were taken from the R package pamr.
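Examples
A minimal usage sketch on synthetic data; the integer class coding 1:3, the small iteration count m1 = 50, and the data dimensions are illustrative assumptions rather than recommendations.

## Not run:
library(bst)
set.seed(123)
n <- 150; p <- 5
x <- data.frame(matrix(rnorm(n * p), n, p))  # synthetic predictors
y <- sample(1:3, n, replace = TRUE)          # three-class response
cvm <- cv.mhingeova(x, y, balance = TRUE, K = 5, nu = 0.1,
                    learner = "ls", m1 = 50, plot.it = TRUE, se = TRUE)
## End(Not run)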