rstats2.lmridge {lmridge}    R Documentation
Ordinary Ridge Regression Statistics 2
Description
The rstats2 function computes ordinary ridge related statistics such as C_k, \sigma^2, ridge degrees of freedom, effective degrees of freedom (EDF), and the prediction residual error sum of squares (PRESS) statistic for a scalar or vector value of the biasing parameter K (see Allen, 1974 <doi:10.2307/1267500>; Lee, 1979; Hoerl and Kennard, 1970 <doi:10.2307/1267351>).
Usage
rstats2(x, ...)
## S3 method for class 'lmridge'
rstats2(x, ...)
## S3 method for class 'rstats2'
print(x, digits = max(5,getOption("digits") - 5), ...)
Arguments
x: For the rstats2 method, an object of class "lmridge"; for the print method, an object of class "rstats2".
digits: Minimum number of significant digits to be used.
...: Not presently used in this implementation.
Details
The rstats2 function computes different ridge regression related statistics which may help in selecting an optimal value of the biasing parameter K. If the value of K is zero, these statistics are equivalent to the corresponding OLS statistics.
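A minimal sketch of this equivalence, using the Hald data as in the Examples below: fit the model with K = 0 and compare the reported statistics against their OLS counterparts from lm(). This is illustrative only, not part of the documented interface.

library(lmridge)
data(Hald)

## Ridge fit with the biasing parameter fixed at zero
mod0 <- lmridge(y ~ ., data = as.data.frame(Hald), K = 0)
rstats2(mod0)

## OLS counterparts for comparison
ols <- lm(y ~ ., data = as.data.frame(Hald))
df.residual(ols)       # residual degrees of freedom
summary(ols)$sigma^2   # residual variance (sigma^2)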
Value
The following ridge related statistics are computed for each scalar or vector value of the biasing parameter K provided as argument to the lmridge or lmridgeEst function.
CK: C_k statistic, analogous to Mallows' C_p, for each given biasing parameter.
dfridge: Ridge degrees of freedom for each given biasing parameter.
EP: Effective number of parameters for each given biasing parameter.
redf: Residual effective degrees of freedom for each given biasing parameter.
EF: Effectiveness index for each given biasing parameter.
ISRM: Quantification of the concept of a stable region proposed by Vinod and Ullah (1981).
m: m-scale for each given value of the biasing parameter, proposed by Vinod (1976) as an alternative to plotting the ridge coefficients.
PRESS: PRESS statistic for ridge regression, introduced by Allen (1971, 1974).
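Assuming these components are returned under the names listed above, with one value per supplied biasing parameter, a common use is to pick the K with the smallest PRESS; a minimal sketch:

library(lmridge)
data(Hald)

K_vals <- seq(0, 0.2, 0.001)
mod <- lmridge(y ~ ., data = as.data.frame(Hald), K = K_vals)
rs2 <- rstats2(mod)

## PRESS values, one per biasing parameter (assumes a component named PRESS)
press <- rs2$PRESS
K_vals[which.min(press)]   # K with the smallest PRESS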
Author(s)
Muhammad Imdad Ullah, Muhammad Aslam
References
Allen, D. M. (1971). Mean Square Error of Prediction as a Criterion for Selecting Variables. Technometrics, 13, 469-475. doi:10.1080/00401706.1971.10488811.
Allen, D. M. (1974). The Relationship between Variable Selection and Data Augmentation and Method for Prediction. Technometrics, 16, 125-127. doi:10.1080/00401706.1974.10489157.
Cule, E. and De Iorio, M. (2012). A Semi-Automatic Method to Guide the Choice of Ridge Parameter in Ridge Regression. arXiv:1205.0686v1 [stat.AP].
Hastie, T. and Tibshirani, R. (1990). Generalized Additive Models. Chapman & Hall.
Hoerl, A. E., Kennard, R. W., and Baldwin, K. F. (1975). Ridge Regression: Some Simulations. Communications in Statistics, 4, 105-123. doi:10.1080/03610927508827232.
Hoerl, A. E. and Kennard, R. W. (1970). Ridge Regression: Biased Estimation for Nonorthogonal Problems. Technometrics, 12, 55-67. doi:10.1080/00401706.1970.10488634.
Imdad, M. U. (2017). Addressing Linear Regression Models with Correlated Regressors: Some Package Development in R. Doctoral Thesis, Department of Statistics, Bahauddin Zakariya University, Multan, Pakistan.
Kalivas, J. H., and Palmer, J. (2014). Characterizing Multivariate Calibration Tradeoffs (Bias, Variance, Selectivity, and Sensitivity) to Select Model Tuning Parameters. Journal of Chemometrics, 28(5), 347–357. doi:10.1002/cem.2555.
Lee, W. F. (1979). Model Estimation Using Ridge Regression with the Variance Normalization Criterion. Master's thesis, Department of Educational Foundation, Memorial University of Newfoundland.
See Also
Ridge related statistics rstats1, ridge model fitting lmridge.
Examples
data(Hald)
mod <- lmridge(y ~ ., data = as.data.frame(Hald), K = seq(0, 0.2, 0.001))
rstats2(mod)
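## The print method (see Usage) accepts a 'digits' argument; a smaller value
## gives a more compact table of the ridge statistics.
print(rstats2(mod), digits = 3)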