lars {mlpack}                                                R Documentation

LARS

Description

An implementation of Least Angle Regression (Stagewise/laSso), also known as LARS. This can train a LARS/LASSO/Elastic Net model and use that model or a pre-trained model to output regression predictions for a test set.

Usage

lars(
  input = NA,
  input_model = NA,
  lambda1 = NA,
  lambda2 = NA,
  no_intercept = FALSE,
  no_normalize = FALSE,
  responses = NA,
  test = NA,
  use_cholesky = FALSE,
  verbose = getOption("mlpack.verbose", FALSE)
)

Arguments

input

Matrix of covariates (X) (numeric matrix).

input_model

Trained LARS model to use (LARS).

lambda1

Regularization parameter for l1-norm penalty. Default value "0" (numeric).

lambda2

Regularization parameter for l2-norm penalty. Default value "0" (numeric).

no_intercept

Do not fit an intercept in the model. Default value "FALSE" (logical).

no_normalize

Do not normalize data to unit variance before modeling. Default value "FALSE" (logical).

responses

Matrix of responses/observations (y) (numeric matrix).

test

Matrix containing points to regress on (test points) (numeric matrix).

use_cholesky

Use Cholesky decomposition during computation rather than explicitly computing the full Gram matrix. Default value "FALSE" (logical).

verbose

Display informational messages and the full list of parameters and timers at the end of execution. Default value "getOption("mlpack.verbose", FALSE)" (logical).

Details

An implementation of LARS: Least Angle Regression (Stagewise/laSso). This is a stage-wise homotopy-based algorithm for L1-regularized linear regression (LASSO) and L1+L2-regularized linear regression (Elastic Net).

This program is able to train a LARS/LASSO/Elastic Net model or load a model from file, output regression predictions for a test set, and save the trained model to a file. The LARS algorithm is described in more detail below:

Let X be a matrix where each row is a point and each column is a dimension, and let y be a vector of targets.

The Elastic Net problem is to solve

min_beta 0.5 || X * beta - y ||_2^2 + lambda_1 ||beta||_1 + 0.5 lambda_2 ||beta||_2^2
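
Written out directly in R, this objective could be computed as in the sketch below; X, y, beta, lambda1, and lambda2 are placeholder objects used only for illustration, not parameters of this function:

# Sketch: Elastic Net objective for a candidate coefficient vector beta,
# following the formula above (X is n x d, y has length n, beta has length d).
elastic_net_objective <- function(X, y, beta, lambda1, lambda2) {
  residual <- X %*% beta - y
  0.5 * sum(residual^2) + lambda1 * sum(abs(beta)) +
    0.5 * lambda2 * sum(beta^2)
}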

If lambda1 > 0 and lambda2 = 0, the problem is the LASSO. If lambda1 > 0 and lambda2 > 0, the problem is the Elastic Net. If lambda1 = 0 and lambda2 > 0, the problem is ridge regression. If lambda1 = 0 and lambda2 = 0, the problem is unregularized linear regression.
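
As a quick illustration, the calls below map these settings onto the binding; 'X' and 'y' stand for a covariate matrix and a response vector assumed to be already in memory:

lasso_fit <- lars(input=X, responses=y, lambda1=0.5, lambda2=0)    # LASSO
enet_fit  <- lars(input=X, responses=y, lambda1=0.5, lambda2=0.1)  # Elastic Net
ridge_fit <- lars(input=X, responses=y, lambda1=0,   lambda2=0.1)  # ridge (see the note below)
ols_fit   <- lars(input=X, responses=y, lambda1=0,   lambda2=0)    # unregularized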

For efficiency reasons, it is not recommended to use this algorithm with "lambda1" = 0. In that case, use the 'linear_regression' function, which implements both unregularized linear regression and ridge regression.
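
For example, a ridge fit could instead be done with that companion binding; the parameter names used below (training, training_responses, lambda) are assumed from the mlpack R package and should be checked against its own help page:

# Assumed ridge regression fit with mlpack's linear_regression binding;
# lambda here is the Tikhonov (L2) regularization parameter.
ridge <- linear_regression(training=X, training_responses=y, lambda=0.1)
ridge_model <- ridge$output_model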

To train a LARS/LASSO/Elastic Net model, the "input" and "responses" parameters must be given. The "lambda1", "lambda2", and "use_cholesky" parameters control the training options. A trained model can be saved with the "output_model" output parameter. If no training is desired at all, a model can be passed via the "input_model" parameter.
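
As a sketch of these training options, again assuming X and y are already loaded:

# Train an Elastic Net model with Cholesky-based updates and no intercept;
# the trained model is returned in the output_model component.
fit <- lars(input=X, responses=y, lambda1=0.3, lambda2=0.05,
            use_cholesky=TRUE, no_intercept=TRUE)
enet_model <- fit$output_model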

The program can also provide predictions for test data using either the trained model or the given input model. Test points can be specified with the "test" parameter. Predicted responses to the test points can be saved with the "output_predictions" output parameter.
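
Training and prediction can also be combined in a single call; X_test below is a placeholder matrix of test points:

# Train on (X, y) and immediately predict responses for X_test.
run   <- lars(input=X, responses=y, lambda1=0.4, test=X_test)
preds <- run$output_predictions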

Value

A list with several components:

output_model

Output LARS model (LARS).

output_predictions

If "test" is specified, the predicted responses for those test points are returned here (numeric matrix).

Author(s)

mlpack developers

Examples

# For example, the following command trains a model on the data "data" and
# responses "responses" with lambda1 set to 0.4 and lambda2 set to 0 (so,
# LASSO is being solved), and then the model is saved to "lasso_model":

## Not run: 
output <- lars(input=data, responses=responses, lambda1=0.4, lambda2=0)
lasso_model <- output$output_model

## End(Not run)

# The following command uses the "lasso_model" to provide predicted responses
# for the data "test" and save those responses to "test_predictions": 

## Not run: 
output <- lars(input_model=lasso_model, test=test)
test_predictions <- output$output_predictions

## End(Not run)

[Package mlpack version 4.4.0]