predict.classbagg {ipred} | R Documentation
Predictions from Bagging Trees
Description
Predict the outcome of a new observation based on multiple trees.
Usage
## S3 method for class 'classbagg'
predict(object, newdata=NULL, type=c("class", "prob"),
aggregation=c("majority", "average", "weighted"), ...)
## S3 method for class 'regbagg'
predict(object, newdata=NULL, aggregation=c("average",
"weighted"), ...)
## S3 method for class 'survbagg'
predict(object, newdata=NULL,...)
Arguments
object
an object of class classbagg, regbagg, or survbagg.
newdata
a data frame of new observations.
type
character string denoting the type of predicted value returned for classification trees: either "class" or "prob".
aggregation
character string specifying how to aggregate the predictions of the single trees; see Details.
...
additional arguments, currently not passed to any function.
Details
There are (at least) three different ways to aggregate the predictions of
bagged classification trees. The best known is majority voting
(aggregation="majority"), where the most frequent class among the
nbagg trees is returned. The second is to choose the class with
maximal averaged class probability (aggregation="average").
The third method (aggregation="weighted") is based on the "aggregated
learning sample" introduced by Hothorn et al. (2004) for survival trees:
the prediction for a new observation is the majority class, mean, or
Kaplan-Meier curve of all learning-sample observations in the nbagg
leaves containing the new observation.
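The difference between majority voting and averaged class probabilities can be seen in a small base-R sketch. The per-tree probabilities below are made up for illustration and this is not ipred's internal code; it only mirrors the two aggregation rules described above.

```r
# Illustration only: "majority" vs. "average" aggregation on hypothetical
# class-probability predictions of 5 trees for a single observation.
classes <- c("bad", "good")

# rows = trees, columns = P(bad), P(good) -- invented values
probs <- rbind(c(0.6, 0.4),
               c(0.6, 0.4),
               c(0.6, 0.4),
               c(0.1, 0.9),
               c(0.1, 0.9))
colnames(probs) <- classes

# aggregation = "majority": each tree votes for its most probable class,
# the most frequent vote wins
votes <- classes[max.col(probs, ties.method = "first")]
majority <- names(which.max(table(votes)))

# aggregation = "average": average the class probabilities over trees first,
# then pick the class with maximal mean probability
avg <- colMeans(probs)
average <- names(which.max(avg))

majority  # "bad":  3 of 5 trees vote for "bad"
average   # "good": mean P(good) = 0.6 > mean P(bad) = 0.4
```

Note that the two rules can disagree, as here: a few trees that are very confident in one class can outweigh a slim majority of less confident trees when probabilities are averaged.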
For regression trees, only averaged or weighted predictions are possible.
By default, the out-of-bag estimate is computed if newdata is NOT
specified: each observation is predicted only by those trees for which it
was not part of the bootstrap sample. The predictions of predict(object)
are therefore "honest" in some way (this is not possible for models
combined via comb in bagging). If you want to compute predictions for
the learning sample itself, pass your data explicitly via newdata.
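Why out-of-bag predictions are "honest" follows from the bootstrap: each bootstrap sample leaves out roughly exp(-1), about 37%, of the observations, so every observation has a subset of trees that never saw it. The base-R sketch below demonstrates this leave-out fraction; it is not ipred's implementation, and the values of n and nbagg are arbitrary.

```r
# Sketch of the out-of-bag idea (not ipred's implementation):
# draw nbagg bootstrap samples of size n and measure, per sample,
# the fraction of observations that were NOT drawn ("out of bag").
set.seed(290875)
n <- 351      # learning-sample size, e.g. nrow(Ionosphere)
nbagg <- 25   # number of bootstrap samples / trees

inbag <- replicate(nbagg,
                   unique(sample(n, n, replace = TRUE)),
                   simplify = FALSE)

# average out-of-bag fraction over the nbagg samples
oob_frac <- mean(sapply(inbag, function(idx) 1 - length(idx) / n))
oob_frac  # close to exp(-1), i.e. about 0.37
```

For each observation, predict(object) aggregates only the trees whose bootstrap sample excluded it, so the resulting error estimate is not contaminated by resubstitution optimism.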
Value
The predicted class or estimated class probabilities are returned for classification trees. The predicted endpoint is returned in regression problems and the predicted Kaplan-Meier curve is returned for survival trees.
References
Leo Breiman (1996), Bagging Predictors. Machine Learning 24(2), 123–140.
Torsten Hothorn, Berthold Lausen, Axel Benner and Martin Radespiel-Troeger (2004), Bagging Survival Trees. Statistics in Medicine, 23(1), 77–91.
Examples
library("ipred")
data("Ionosphere", package = "mlbench")
Ionosphere$V2 <- NULL # constant within groups
# nbagg = 10 for performance reasons here
mod <- bagging(Class ~ ., data = Ionosphere, nbagg = 10)
# out-of-bag estimate
mean(predict(mod) != Ionosphere$Class)
# predictions for the first 10 observations
predict(mod, newdata=Ionosphere[1:10,])
predict(mod, newdata=Ionosphere[1:10,], type="prob")