cqr.lasso.mm {cqrReg}    R Documentation
Composite Quantile Regression (cqr) with Adaptive Lasso Penalty (lasso) Using the Majorize and Minimize (mm) Algorithm
Description
The adaptive lasso penalty weights are based on the coefficients estimated without a penalty function. Composite quantile regression finds the coefficient estimates that minimize the absolute error across several quantile levels. At each iteration the MM algorithm majorizes the objective function by a quadratic function and then minimizes that quadratic.
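As a rough sketch of the objective (using the standard composite quantile regression and adaptive lasso formulation from the references below; the exact weighting used in the package code may differ), the estimate solves

$$\min_{b_1,\ldots,b_K,\,\beta}\;\sum_{k=1}^{K}\sum_{i=1}^{n}\rho_{\tau_k}\!\bigl(y_i - b_k - x_i^{\top}\beta\bigr)\;+\;\lambda\sum_{j=1}^{p}\frac{|\beta_j|}{|\tilde{\beta}_j|},\qquad\rho_{\tau}(u) = u\bigl(\tau - \mathbf{1}\{u<0\}\bigr),$$

where tau_1, ..., tau_K are the quantile levels in tau, b_k is the intercept at level tau_k, and the unpenalized estimate beta-tilde supplies the adaptive weights. Each MM iteration replaces this nonsmooth objective with a quadratic majorizer and minimizes that quadratic, as in Hunter and Li (2005).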
Usage
cqr.lasso.mm(X,y,tau,lambda,beta,maxit,toler)
Arguments
X: the design matrix
y: the response variable
tau: vector of quantile levels
lambda: the constant coefficient of the penalty function (default lambda = 1)
beta: initial values of the estimated coefficients (default: a naive guess from least squares estimation)
maxit: maximum number of iterations (default 200)
toler: the tolerance criterion for stopping the algorithm (default 1e-3)
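A hedged sketch of a fully specified call, spelling out the defaults documented above (argument names are taken from the Usage line; x, y, and tau are as constructed in the Examples section below):

# Sketch only: the explicit values match the documented defaults.
fit <- cqr.lasso.mm(X = x, y = y, tau = tau,
                    lambda = 1,    # penalty coefficient (default)
                    maxit = 200,   # maximum number of iterations (default)
                    toler = 1e-3)  # stopping tolerance (default)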
Value
A list with components:
beta: the vector of estimated coefficients
b: the intercepts for the various quantile levels
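A minimal sketch of how the returned components might be used, assuming the result is a list with elements beta and b as described above and that b is indexed by the quantile levels in tau:

fit <- cqr.lasso.mm(x, y, tau)
fit$beta                              # estimated coefficient vector, shared across quantile levels
fit$b                                 # estimated intercepts, one per quantile level in tau
yhat1 <- x %*% fit$beta + fit$b[1]    # fitted values at the first quantile level (assumption)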
Note
cqr.lasso.mm(X, y, tau) works properly only if the least squares estimate used for initialization is reasonable.
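If the least squares starting value is unreliable (for example with heavy-tailed errors), one option is to pass a different initial value through the beta argument. A hedged sketch, in which the median regression initializer via quantreg::rq is only an illustration and not something prescribed by cqrReg:

library(quantreg)
beta0 <- coef(rq(y ~ x, tau = 0.5))[-1]       # median regression slopes, intercept dropped
fit <- cqr.lasso.mm(x, y, tau, beta = beta0)  # assumes beta accepts a numeric vector of length ncol(x)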
References
Hunter, D. R. and Li, R. (2005). Variable Selection Using MM Algorithms. The Annals of Statistics, 33(4), 1617–1642.
Zou, H. and Yuan, M. (2008). Composite Quantile Regression and the Oracle Model Selection Theory. The Annals of Statistics, 36(3), 1108–1126.
Examples
library(cqrReg)
set.seed(1)
n = 100
p = 2
# design matrix with n observations and 2*p predictors
a = 2*rnorm(n*2*p, mean = 1, sd = 1)
x = matrix(a, n, 2*p)
# true coefficients: p nonzero values followed by p zeros
beta = 2*rnorm(p, 1, 1)
beta = rbind(matrix(beta, p, 1), matrix(0, p, 1))
y = x %*% beta - matrix(rnorm(n, 0.1, 1), n, 1)
tau = 1:5/6
# x is a 100 x 4 matrix, y is a 100 x 1 vector, and beta is a 4 x 1 vector
# whose last two elements are zero.
cqr.lasso.mm(x, y, tau)