lasso.klcompreg {Compositional}    R Documentation

LASSO Kullback-Leibler divergence based regression


Description

LASSO Kullback-Leibler divergence based regression.


Usage

lasso.klcompreg(y, x, alpha = 1, lambda = NULL,
                nlambda = 100, type = "grouped", xnew = NULL)



Arguments

y

A numerical matrix with compositional data. Zero values are allowed.


x

A numerical matrix containing the predictor variables.


alpha

The elastic net mixing parameter, with 0 ≤ α ≤ 1. The penalty is defined as a weighted combination of the ridge and the LASSO penalties. When α = 1 the LASSO is applied, while α = 0 yields ridge regression.
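For reference, the elastic net penalty used by glmnet (as documented in that package) combines the ridge and LASSO terms as

```latex
\lambda \left[ \frac{(1-\alpha)}{2}\, \lVert \beta \rVert_2^2
             \;+\; \alpha\, \lVert \beta \rVert_1 \right]
```

so that α = 1 leaves only the ℓ1 (LASSO) term and α = 0 only the ℓ2 (ridge) term, in agreement with the description above.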


lambda

This information is copied from the package glmnet. A user-supplied lambda sequence. Typical usage is to have the program compute its own lambda sequence based on nlambda and lambda.min.ratio; supplying a value of lambda overrides this. WARNING: use with care. Avoid supplying a single value for lambda (for predictions after CV use predict() instead); supply instead a decreasing sequence of lambda values. glmnet relies on its warm starts for speed, and it is often faster to fit a whole path than to compute a single fit.


nlambda

This information is copied from the package glmnet. The number of lambda values; the default is 100.


type

This information is copied from the package glmnet. If "grouped", a grouped LASSO penalty is used on the multinomial coefficients of a variable. This ensures they are all in or out together. The default in our case is "grouped".


xnew

If you have new data for which fitted values are desired, supply it here; otherwise leave it NULL.


Details

The function uses the glmnet package to perform LASSO penalised regression. For more details see the function glmnet in that package.


Value

A list including:


We decided to keep the same list that is returned by glmnet, so see that function's documentation for more information.


If you supply a matrix in the "xnew" argument, an array of matrices with the fitted values is also returned, where each matrix corresponds to one value of λ.


Author(s)

Michail Tsagris and Abdulaziz Alenazi.

R implementation and documentation: Michail Tsagris and Abdulaziz Alenazi.


References

Aitchison J. (1986). The statistical analysis of compositional data. Chapman & Hall.

Friedman J., Hastie T. and Tibshirani R. (2010). Regularization paths for generalized linear models via coordinate descent. Journal of Statistical Software, 33(1): 1-22.

See Also

lassocoef.plot, cv.lasso.klcompreg, kl.compreg, lasso.compreg, ols.compreg, alfa.pcr, alfa.knn.reg


Examples

y <- as.matrix(iris[, 1:4])
y <- y / rowSums(y)
x <- matrix( rnorm(150 * 30), ncol = 30 )
a <- lasso.klcompreg(y, x)
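The example above can be extended to obtain fitted values for new data via the "xnew" argument; a minimal sketch, assuming the Compositional package is installed (the matrix xnew below is illustrative, not part of the original example):

```r
library(Compositional)

## compositional response and random predictors, as in the example above
y <- as.matrix(iris[, 1:4])
y <- y / rowSums(y)
x <- matrix( rnorm(150 * 30), ncol = 30 )

## hypothetical new predictor data with the same number of columns as x
xnew <- matrix( rnorm(10 * 30), ncol = 30 )

## the result also contains fitted values for xnew,
## one matrix per value of lambda along the path
b <- lasso.klcompreg(y, x, xnew = xnew)
```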

[Package Compositional version 5.2 Index]