msgps {msgps} | R Documentation
msgps (Degrees of Freedom of Elastic Net, Adaptive Lasso and Generalized Elastic Net)
Description
This package computes the degrees of freedom of the lasso, elastic net, generalized elastic net and adaptive lasso based on the generalized path seeking algorithm. The optimal model can be selected by model selection criteria including Mallows' Cp, bias-corrected AIC (AICc), generalized cross validation (GCV) and BIC.
Usage
msgps(X,y,penalty="enet", alpha=0, gamma=1, lambda=0.001, tau2, STEP=20000,
STEP.max=200000, DFtype="MODIFIED", p.max=300, intercept=TRUE, stand.coef=FALSE)
Arguments
X: Predictor matrix.

y: Response vector.

penalty: The penalty term; one of "enet" (elastic net, the default), "genet" (generalized elastic net) or "alasso" (adaptive lasso).

alpha: The value of alpha used in the "enet" and "genet" penalties.

gamma: The value of gamma used in the "alasso" penalty.

lambda: The value of the regularization parameter lambda used with the "alasso" penalty.

tau2: Estimator of error variance for Mallows' Cp. The default is the unbiased estimator of the error variance of the most complex model. When that estimator is not available (e.g., when the number of variables exceeds the number of samples), tau2 must be specified by the user.

STEP: The approximate number of steps.

STEP.max: The number of steps in this algorithm can often exceed STEP; the algorithm stops when the number of steps exceeds STEP.max.

DFtype: The type of degrees of freedom. The default is "MODIFIED".

p.max: If the number of selected variables exceeds p.max, the algorithm stops.

intercept: When intercept is TRUE, an intercept term is included in the model. The default is TRUE.

stand.coef: When stand.coef is TRUE, standardized coefficients are returned. The default is FALSE.
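As a sketch of how the control arguments above might be combined (the argument values here are purely illustrative, not recommendations):

```r
library(msgps)

# Illustrative data
set.seed(1)
X <- matrix(rnorm(50*10), 50, 10)
y <- rnorm(50)

# tau2 supplies the error-variance estimate for Mallows' Cp when the default
# (the unbiased estimator from the most complex model) is unavailable;
# p.max and STEP bound the size of the path computation.
fit <- msgps(X, y, penalty="enet", alpha=0.5,
             tau2=1, STEP=5000, p.max=8,
             intercept=TRUE, stand.coef=TRUE)
summary(fit)
```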
Author(s)
Kei Hirose
mail@keihirose.com
References
Friedman, J. (2008). Fast sparse regression and classification. Technical report, Stanford University.
Hirose, K., Tateishi, S. and Konishi, S. (2011). Efficient algorithm to select tuning parameters in sparse regression modeling with regularization. arXiv:1109.2411.
See Also
coef.msgps, plot.msgps, predict.msgps and summary.msgps objects.
Examples
#data
set.seed(1) #for reproducible random data
X <- matrix(rnorm(100*8),100,8)
beta0 <- c(3,1.5,0,0,2,0,0,0)
epsilon <- rnorm(100,sd=3)
y <- X %*% beta0 + epsilon
y <- c(y)
#lasso
fit <- msgps(X,y)
summary(fit)
coef(fit) #extract coefficients at t selected by model selection criteria
coef(fit,c(0, 0.5, 2.5)) #extract coefficients at some values of t
predict(fit,X[1:10,]) #predict values at t selected by model selection criteria
predict(fit,X[1:10,],c(0, 0.5, 2.5)) #predict values at some values of t
plot(fit,criterion="cp") #plot the solution path with a model selected by Cp criterion
#elastic net
fit2 <- msgps(X,y,penalty="enet",alpha=0.5)
summary(fit2)
#generalized elastic net
fit3 <- msgps(X,y,penalty="genet",alpha=0.5)
summary(fit3)
#adaptive lasso
fit4 <- msgps(X,y,penalty="alasso",gamma=1,lambda=0)
summary(fit4)
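The lasso example above plots the solution path with the model selected by the Cp criterion. If the criterion argument of plot accepts the other criteria named in the Description under parallel names ("aicc", "gcv" and "bic" are an assumption here; check ?plot.msgps), the BIC-selected model could be inspected the same way:

```r
library(msgps)

#illustrative data, mirroring the lasso example above
set.seed(1)
X <- matrix(rnorm(100*8),100,8)
beta0 <- c(3,1.5,0,0,2,0,0,0)
y <- c(X %*% beta0 + rnorm(100,sd=3))

fit <- msgps(X,y)
#criterion="bic" is assumed to parallel criterion="cp" used earlier
plot(fit,criterion="bic") #solution path with the BIC-selected model
```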