singboost {gfboost}    R Documentation
SingBoost Boosting method
Description
SingBoost is a Boosting method that can deal with complicated loss functions that do not allow for a gradient. SingBoost is based on L2-Boosting in its current implementation.
Usage
singboost(
D,
M = 10,
m_iter = 100,
kap = 0.1,
singfamily = Gaussian(),
best = 1,
LS = FALSE
)
Arguments
D: Data matrix. Has to be a data frame in the format (X, Y), i.e., the predictor columns followed by the response column; see the construction of Diris in the Examples.

M: An integer between 2 and m_iter. Every M-th iteration is a singular iteration. Default is 10.

m_iter: Number of SingBoost iterations. Default is 100.

kap: Learning rate (step size). Must be a real number in (0, 1]. Default is 0.1.

singfamily: A Boosting family corresponding to the target loss function; see the Family objects in mboost. Default is Gaussian(), for which SingBoost coincides with standard L2-Boosting.

best: Needed in the case of localized ranking. The parameter best specifies the proportion of best instances to which the localized ranking loss is restricted. Default is 1, i.e., the standard hard ranking loss is used.

LS: If a Boosting family already provided by mboost is used as singfamily, setting LS=TRUE runs the corresponding Boosting algorithm in the singular iterations. Default is FALSE.
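As a quick orientation, a call that sets the main tuning parameters explicitly might look as follows; the chosen values are purely illustrative, Diris is the data frame constructed in the Examples section below, and QuantReg() is the quantile regression family from mboost.

singboost(Diris, M = 5, m_iter = 200, kap = 0.05,
          singfamily = QuantReg(tau = 0.75), LS = TRUE, best = 1)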
Details
Gradient Boosting algorithms require convexity and differentiability of the underlying loss function. SingBoost is a Boosting algorithm based on L2-Boosting that allows for complicated loss functions that do not need to satisfy these requirements. SingBoost alternates between standard L2-Boosting iterations and singular iterations in which, in the spirit of an empirical gradient step, the baselearner that performs best when evaluated in the complicated loss is selected. The implementation is based on glmboost from the package mboost, and using the L2-loss in the singular iterations returns exactly the same coefficients as L2-Boosting.
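To make the alternation concrete, the following is a minimal sketch of the scheme described above, not the gfboost implementation: componentwise least-squares baselearners are fitted to the current residuals, every M-th iteration selects the baselearner whose update is best in a user-supplied target loss, and all other iterations proceed as in L2-Boosting. The names singboost_sketch, loss and checkloss are illustrative only.

## Minimal sketch of the SingBoost alternation (illustrative, not the gfboost code).
## X: numeric predictor matrix (no intercept column), y: numeric response,
## loss: function(y, f) returning the target loss of the fit f.
singboost_sketch <- function(X, y, loss, m_iter = 100, M = 10, kap = 0.1) {
  n <- nrow(X); p <- ncol(X)
  f <- rep(mean(y), n)              # start from the offset
  beta <- numeric(p)
  for (m in seq_len(m_iter)) {
    r <- y - f                      # current L2 residuals
    ## componentwise least-squares fit of every baselearner to the residuals
    b <- vapply(seq_len(p), function(j) sum(X[, j] * r) / sum(X[, j]^2), numeric(1))
    if (m %% M == 0) {
      ## singular iteration: evaluate every candidate update in the target loss
      crit <- vapply(seq_len(p), function(j) loss(y, f + kap * b[j] * X[, j]), numeric(1))
    } else {
      ## L2 iteration: pick the baselearner with the smallest residual sum of squares
      crit <- vapply(seq_len(p), function(j) sum((r - b[j] * X[, j])^2), numeric(1))
    }
    jstar <- which.min(crit)
    beta[jstar] <- beta[jstar] + kap * b[jstar]
    f <- f + kap * b[jstar] * X[, jstar]
  }
  list(intercept = mean(y), coefficients = beta)
}

## Example target loss: 0.75-quantile (check) loss
checkloss <- function(y, f) { u <- y - f; sum(pmax(0.75 * u, -0.25 * u)) }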
Value
Selected variables: Names of the selected variables.

Coefficients: The full coefficient vector, including zero entries for variables that were never selected.

Freqs: Selection frequencies and a matrix for intercept and coefficient paths, respectively.

VarCoef: Vector of the non-zero coefficients.
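Assuming the components are returned as a named list matching the entries above (the exact element names can be checked with str()), they can be accessed as in the following sketch, which reuses the Diris data frame from the Examples.

res <- singboost(Diris)
str(res)                 # inspect the actual component names
res$Coefficients         # full coefficient vector (assumed element name)
res$VarCoef              # non-zero coefficients only (assumed element name)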
References
Werner, T., Gradient-Free Gradient Boosting, PhD Thesis, Carl von Ossietzky University Oldenburg, 2020
P. Bühlmann and B. Yu. Boosting with the L2 loss: Regression and Classification. Journal of the American Statistical Association, 98(462):324–339, 2003
T. Hothorn, P. Bühlmann, T. Kneib, M. Schmid, and B. Hofner. mboost: Model-Based Boosting, 2017
Examples
## The examples assume that gfboost is attached; glmboost(), varimp(),
## boost_control() and QuantReg() are from the mboost package.

## L2-Boosting on the iris data and the equivalent SingBoost calls
glmres <- glmboost(Sepal.Length ~ ., iris)
glmres
attributes(varimp(glmres))$self
attributes(varimp(glmres))$var
firis <- as.formula(Sepal.Length ~ .)
Xiris <- model.matrix(firis, iris)
Diris <- data.frame(Xiris[, -1], iris$Sepal.Length)
colnames(Diris)[6] <- "Y"
coef(glmboost(Xiris, iris$Sepal.Length))
singboost(Diris)
singboost(Diris, LS = TRUE)

## Model with an interaction term
glmres2 <- glmboost(Sepal.Length ~ Petal.Length + Sepal.Width:Species, iris)
finter <- as.formula(Sepal.Length ~ Petal.Length + Sepal.Width:Species - 1)
Xinter <- model.matrix(finter, iris)
Dinter <- data.frame(Xinter, iris$Sepal.Length)
singboost(Dinter)
coef(glmres2)

## More iterations and a smaller learning rate
glmres3 <- glmboost(Xiris, iris$Sepal.Length, control = boost_control(mstop = 250, nu = 0.05))
coef(glmres3)
attributes(varimp(glmres3))$self
singboost(Diris, m_iter = 250, kap = 0.05)
singboost(Diris, LS = TRUE, m_iter = 250, kap = 0.05)

## Quantile regression loss in the singular iterations
glmquant <- glmboost(Sepal.Length ~ ., iris, family = QuantReg(tau = 0.75))
coef(glmquant)
attributes(varimp(glmquant))$self
singboost(Diris, singfamily = QuantReg(tau = 0.75), LS = TRUE)
singboost(Diris, singfamily = QuantReg(tau = 0.75), LS = TRUE, M = 2)

## Hard ranking loss in the singular iterations
singboost(Diris, singfamily = Rank(), LS = TRUE)
singboost(Diris, singfamily = Rank(), LS = TRUE, M = 2)