mus {hdme}    R Documentation

Matrix Uncertainty Selector

Description

Matrix Uncertainty Selector for linear regression.

Usage

mus(W, y, lambda = NULL, delta = NULL)

Arguments

W

Design matrix, measured with error. Must be a numeric matrix.

y

Vector of responses.

lambda

Regularization parameter.

delta

Additional regularization parameter, bounding the measurement error.
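
For orientation, the Matrix Uncertainty Selector is defined through a constrained l1-minimization problem. The sketch below follows the formulation in Rosenbaum and Tsybakov (2010); the placement of lambda and delta in the constraint is assumed from the argument descriptions above rather than taken from the package source:

$$
\hat{\beta} \in \arg\min_{\beta} \|\beta\|_1
\quad \text{subject to} \quad
\left\| \tfrac{1}{n} W^{\top} (y - W\beta) \right\|_{\infty} \le \lambda + \delta \|\beta\|_1 .
$$

Here delta bounds the contribution of the measurement error in W, while lambda acts as an ordinary regularization parameter for the regression noise.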

Details

This function is a convenience wrapper for gmus(W, y, lambda, delta, family = "gaussian").
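
Because mus() only forwards its arguments, calling it is equivalent to calling gmus() directly with family = "gaussian". A minimal, self-contained sketch; the simulated W and y and the choice delta = 0.1 are purely illustrative:

library(hdme)
set.seed(1)
n <- 100
p <- 20
# Noisy design matrix and response, for illustration only
W <- matrix(rnorm(n * p), nrow = n)
y <- rnorm(n)
# The two calls below fit the same model, since mus() forwards to gmus()
fit_wrapper <- mus(W, y, delta = 0.1)
fit_direct <- gmus(W, y, delta = 0.1, family = "gaussian")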

Value

An object of class "gmus".

References

Rosenbaum M, Tsybakov AB (2010). “Sparse recovery under matrix uncertainty.” Ann. Statist., 38(5), 2620–2651.

Sorensen O, Hellton KH, Frigessi A, Thoresen M (2018). “Covariate Selection in High-Dimensional Generalized Linear Models With Measurement Error.” Journal of Computational and Graphical Statistics, 27(4), 739–749. doi:10.1080/10618600.2018.1425626.

Examples

# Example with Gaussian response
set.seed(1)
# Number of samples
n <- 100
# Number of covariates
p <- 50
# True (latent) variables
X <- matrix(rnorm(n * p), nrow = n)
# Measurement matrix (this is the one we observe)
W <- X + matrix(rnorm(n * p, sd = 1), nrow = n, ncol = p)
# Coefficient vector
beta <- c(seq(from = 0.1, to = 1, length.out = 5), rep(0, p-5))
# Response
y <- X %*% beta + rnorm(n, sd = 1)
# Run the MU Selector
fit1 <- mus(W, y)
# Draw an elbow plot to select delta
plot(fit1)
coef(fit1)

# According to the "elbow rule", choose the final delta where the curve bends (the "elbow").
# In this case the elbow is at about delta = 0.08, so we use this value to compute the final estimate:
fit2 <- mus(W, y, delta = 0.08)
plot(fit2) # Plot the coefficients
coef(fit2)
coef(fit2, all = TRUE)

