PreEst.glasso {CovTools}    R Documentation

Precision Matrix Estimation via Graphical Lasso

Description

Given a sample covariance matrix S, the graphical lasso estimates a sparse precision matrix X, the inverse of the covariance matrix. It solves the following optimization problem,

\max_X \; \log\det X - \langle S,X \rangle - \lambda \|X\|_1 \quad \textrm{such that } X \succ 0

where \lambda is a regularization parameter, \langle S,X \rangle = \textrm{tr}(S^T X), \|X\|_1 = \sum_{i,j} |X_{ij}|, and X \succ 0 indicates positive definiteness. We provide three modes of computation, 'fixed', 'confidence', or 'BIC', with respect to \lambda. Please see the section below for more details.
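
For intuition, the objective above can be written out directly in R. The sketch below is purely illustrative (glasso.objective is not a function of CovTools); it evaluates the penalized log-likelihood for a candidate precision matrix X, a sample covariance S, and a regularization parameter lambda.

glasso.objective <- function(X, S, lambda) {
  ## log det X  -  <S,X>  -  lambda*||X||_1, with <S,X> = tr(S^T X) = sum(S*X)
  logdetX <- as.numeric(determinant(X, logarithm=TRUE)$modulus)
  logdetX - sum(S * X) - lambda * sum(abs(X))
}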

Usage

PreEst.glasso(X, method = list(type = "fixed", param = 1), parallel = FALSE)

Arguments

X

an (n\times p) data matrix where each row is an observation.

method

a list containing the following parameters:

type

one of 'fixed','confidence', or 'BIC'.

param

either a single numeric value or a vector of values.

parallel

a logical; TRUE to use half of the available cores, FALSE otherwise.

Value

a named list containing:

C

a (p\times p) estimated precision matrix.

BIC

a data frame containing the \lambda values and corresponding BIC scores; returned only when the method type is 'BIC'.

regularization parameters

We currently provide three options for solving the problem, 'fixed', 'confidence', or 'BIC', with respect to \lambda. When the method type is 'fixed', the parameter should be a single numeric value supplied as a user-defined \lambda. Likewise, the 'confidence' type requires a single numeric value in (0,1), and the corresponding \lambda is set heuristically according to

\rho = \frac{t_{n-2}(\gamma) \max S_{ii}S_{jj}}{\sqrt{n-2+ t_{n-2}^2(\gamma)}}

for a given confidence level \gamma \in (0,1), as proposed by Banerjee et al. (2006). Finally, the 'BIC' type requires a vector of \lambda values and selects the \lambda with the lowest BIC score, as proposed by Yuan and Lin (2007).
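
As a rough illustration (not the package's internal code), the heuristic above can be computed from a sample covariance S, sample size n, and confidence level \gamma, assuming t_{n-2}(\gamma) denotes the \gamma-quantile of a Student t distribution with n-2 degrees of freedom:

confidence.lambda <- function(S, n, gamma) {   # illustrative helper, not exported by CovTools
  tval  <- qt(gamma, df=n-2)                   # t_{n-2}(gamma)
  maxSS <- max(outer(diag(S), diag(S)))        # max over (i,j) of S_ii * S_jj
  tval * maxSS / sqrt(n - 2 + tval^2)
}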

References

Banerjee O, Ghaoui LE, d'Aspremont A, Natsoulis G (2006). “Convex optimization techniques for fitting sparse Gaussian graphical models.” In Proceedings of the 23rd international conference on Machine learning, 89–96. ISBN 978-1-59593-383-6.

Yuan M, Lin Y (2007). “Model Selection and Estimation in the Gaussian Graphical Model.” Biometrika, 94(1), 19–35. ISSN 00063444.

Friedman J, Hastie T, Tibshirani R (2008). “Sparse inverse covariance estimation with the graphical lasso.” Biostatistics, 9(3), 432–441. ISSN 1465-4644, 1468-4357.

Examples


## generate data from multivariate normal with Identity precision.
pdim = 10
data = matrix(rnorm(100*pdim), ncol=pdim)

## prepare input arguments for different scenarios
lbdvec <- c(0.01,0.1,1,10,100)              # a vector of regularization parameters
list1 <- list(type="fixed",param=1.0)       # single regularization parameter case
list2 <- list(type="confidence",param=0.95) # single confidence level case
list3 <- list(type="BIC",param=lbdvec)      # multiple regularizers with BIC selection

## compute with different scenarios
out1 <- PreEst.glasso(data, method=list1)
out2 <- PreEst.glasso(data, method=list2)
out3 <- PreEst.glasso(data, method=list3)

## visualize
opar <- par(no.readonly=TRUE)
par(mfrow=c(2,2), pty="s")
image(diag(pdim)[,pdim:1], main="Original Precision")
image(out1$C[,pdim:1],     main="glasso::lambda=1.0")
image(out2$C[,pdim:1],     main="glasso::Confidence=0.95")
image(out3$C[,pdim:1],     main="glasso::BIC selection")
par(opar)
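
## inspect the BIC path returned with type='BIC'; the column layout below
## (lambda values first, BIC scores second) is an assumption, so check
## names(out3$BIC) on your installation
print(out3$BIC)
best <- which.min(out3$BIC[,2])   # row with the smallest BIC score
out3$BIC[best,1]                  # corresponding lambda value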



[Package CovTools version 0.5.4 Index]