gic.ncpen {ncpen}    R Documentation
gic.ncpen: compute the generalized information criterion (GIC) for the selection of lambda
Description
The function selects the regularization parameter lambda based on the generalized information criterion (GIC), which includes AIC and BIC as special cases.
Usage
gic.ncpen(fit, weight = NULL, verbose = TRUE, ...)
Arguments
fit: (ncpen object) fitted ncpen object.
weight: (numeric) the weight factor for various information criteria. Default is BIC if n > p and GIC if n <= p (see Details).
verbose: (logical) whether to plot the GIC curve.
...: other graphical parameters passed to plot.
Details
Users can supply various weight values (see References). For example, weight=2, weight=log(n), weight=log(log(p))log(n), and weight=log(log(n))log(p) correspond to AIC, BIC (fixed dimensional model), modified BIC (diverging dimensional model), and GIC (high dimensional model), respectively.
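A minimal sketch of supplying these weights explicitly, assuming simulated data generated as in the Examples below (n = 200 observations, p = 20 predictors):

library(ncpen)
sam = sam.gen.ncpen(n=200, p=20, q=5, cf.min=0.5, cf.max=1, corr=0.5)  # simulated data as in Examples
fit = ncpen(y.vec=sam$y.vec, x.mat=sam$x.mat)                          # fitted ncpen object
n = 200; p = 20
## BIC-type weight (fixed dimensional model)
gic.ncpen(fit, weight=log(n), verbose=FALSE)$opt.lambda
## GIC-type weight (high dimensional model)
gic.ncpen(fit, weight=log(log(n))*log(p), verbose=FALSE)$opt.lambda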
Value
A list containing the following components:
gic: the GIC values.
lambda: the sequence of lambda values used to calculate GIC.
opt.beta: the optimal coefficients selected by GIC.
opt.lambda: the optimal lambda value.
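A brief sketch of inspecting these components, again assuming a fitted ncpen object as in the Examples below:

library(ncpen)
sam = sam.gen.ncpen(n=200, p=20, q=5, cf.min=0.5, cf.max=1, corr=0.5)
fit = ncpen(y.vec=sam$y.vec, x.mat=sam$x.mat)
out = gic.ncpen(fit, verbose=FALSE)        # suppress the GIC plot
out$opt.lambda                             # lambda value minimizing the GIC
head(out$opt.beta)                         # coefficients at the optimal lambda
plot(out$lambda, out$gic, type="b")        # redraw the GIC curve manually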
Author(s)
Dongshin Kim, Sunghoon Kwon, Sangin Lee
References
Wang, H., Li, R. and Tsai, C. L. (2007). Tuning parameter selectors for the smoothly clipped absolute deviation method. Biometrika, 94(3), 553-568.
Wang, H., Li, B. and Leng, C. (2009). Shrinkage tuning parameter selection with a diverging number of parameters. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 71(3), 671-683.
Kim, Y., Kwon, S. and Choi, H. (2012). Consistent model selection criteria on high dimensions. Journal of Machine Learning Research, 13, 1037-1057.
Fan, Y. and Tang, C. Y. (2013). Tuning parameter selection in high dimensional penalized likelihood. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 75(3), 531-552.
Lee, S., Kwon, S. and Kim, Y. (2016). A modified local quadratic approximation algorithm for penalized optimization problems. Computational Statistics and Data Analysis, 94, 275-286.
See Also
Examples
### linear regression with scad penalty
sam = sam.gen.ncpen(n=200,p=20,q=5,cf.min=0.5,cf.max=1,corr=0.5)
x.mat = sam$x.mat; y.vec = sam$y.vec
fit = ncpen(y.vec=y.vec,x.mat=x.mat)
gic.ncpen(fit,pch="*",type="b")
### multinomial regression with classo penalty
sam = sam.gen.ncpen(n=200,p=20,q=5,k=3,cf.min=0.5,cf.max=1,corr=0.5,family="multinomial")
x.mat = sam$x.mat; y.vec = sam$y.vec
fit = ncpen(y.vec=y.vec,x.mat=x.mat,family="multinomial",penalty="classo")
gic.ncpen(fit,pch="*",type="b")