PenalisedRegression {sharp}    R Documentation

Penalised regression

Description

Runs penalised regression using the implementation from glmnet. This function does not apply stability selection.

Usage

PenalisedRegression(
  xdata,
  ydata,
  Lambda = NULL,
  family,
  penalisation = c("classic", "randomised", "adaptive"),
  gamma = NULL,
  ...
)

Arguments

xdata

matrix of predictors with observations as rows and variables as columns.

ydata

optional vector or matrix of outcome(s). If family is set to "binomial" or "multinomial", ydata can be a vector with character/numeric values or a factor.

Lambda

matrix of parameters controlling the level of sparsity.

family

type of regression model. This argument is defined as in glmnet. Possible values include "gaussian" (linear regression), "binomial" (logistic regression), "multinomial" (multinomial regression), and "cox" (survival analysis).

penalisation

type of penalisation to use. If penalisation="classic" (the default), the same regularisation parameter is applied to all variables, or weighted by penalty.factor if specified. If penalisation="randomised", the regularisation for each variable is drawn uniformly between lambda and lambda/gamma. If penalisation="adaptive", the regularisation for each variable is weighted by 1/abs(beta)^gamma, where beta is the regression coefficient obtained from unpenalised regression.

gamma

parameter for randomised or adaptive regularisation. Default is gamma=0.5 for randomised regularisation and gamma=2 for adaptive regularisation. The parameter gamma should be between 0 and 1 for randomised regularisation.

...

additional parameters passed to glmnet.

Value

A list with:

selected

matrix of binary selection status. Rows correspond to different model parameters. Columns correspond to predictors.

beta_full

array of model coefficients. Rows correspond to different model parameters. Columns correspond to predictors. Indices along the third dimension correspond to outcome variable(s).
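To illustrate the shape of these outputs, a small sketch (assuming the simulated data used in the Examples section; exact dimensions depend on the data and the number of Lambda values supplied):

```r
# Sketch: inspecting the structure of the returned list
set.seed(1)
simul <- SimulateRegression(pk = 50)
fit <- PenalisedRegression(
  xdata = simul$xdata, ydata = simul$ydata,
  Lambda = c(0.1, 0.2), family = "gaussian"
)
dim(fit$selected)   # rows: Lambda values, columns: predictors
dim(fit$beta_full)  # third dimension: outcome variable(s)
```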

References

Zou H (2006). “The adaptive lasso and its oracle properties.” Journal of the American Statistical Association, 101(476), 1418–1429.

Tibshirani R (1996). “Regression Shrinkage and Selection via the Lasso.” Journal of the Royal Statistical Society. Series B (Methodological), 58(1), 267–288. ISSN 00359246, http://www.jstor.org/stable/2346178.

See Also

SelectionAlgo, VariableSelection

Other underlying algorithm functions: CART(), ClusteringAlgo(), PenalisedGraphical(), PenalisedOpenMx()

Examples

# Data simulation
set.seed(1)
simul <- SimulateRegression(pk = 50)

# Running the LASSO
mylasso <- PenalisedRegression(
  xdata = simul$xdata, ydata = simul$ydata,
  Lambda = c(0.1, 0.2), family = "gaussian"
)

# Using glmnet arguments
mylasso <- PenalisedRegression(
  xdata = simul$xdata, ydata = simul$ydata,
  Lambda = c(0.1), family = "gaussian",
  penalty.factor = c(rep(0, 10), rep(1, 40))
)
mylasso$beta_full
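
The randomised and adaptive modes described under the penalisation argument can be sketched as follows; this is an illustrative use of the documented arguments on the same simulated data, not output from a specific run:

```r
# Randomised penalisation: per-variable penalties drawn uniformly
# between lambda and lambda/gamma (gamma must be between 0 and 1)
myrandomised <- PenalisedRegression(
  xdata = simul$xdata, ydata = simul$ydata,
  Lambda = c(0.1), family = "gaussian",
  penalisation = "randomised", gamma = 0.5
)

# Adaptive penalisation: penalties weighted by 1/abs(beta)^gamma,
# where beta comes from an unpenalised fit (default gamma = 2)
myadaptive <- PenalisedRegression(
  xdata = simul$xdata, ydata = simul$ydata,
  Lambda = c(0.1), family = "gaussian",
  penalisation = "adaptive", gamma = 2
)
```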

[Package sharp version 1.4.6 Index]