jeek-package {jeek}    R Documentation
A Fast and Scalable Joint Estimator for Integrating Additional Knowledge in Learning Multiple Related Sparse Gaussian Graphical Models
Description
This is an R implementation of a Fast and Scalable Joint Estimator for Integrating Additional Knowledge in Learning Multiple Related Sparse Gaussian Graphical Models (JEEK). The JEEK algorithm can be used to quickly estimate multiple related precision matrices at large scale. For instance, it can identify multiple gene networks from multi-context gene expression datasets. By performing data-driven network inference from high-dimensional and heterogeneous data sets, this tool helps users translate aggregated data into knowledge that takes the form of graphs among entities. Please run demo(jeek) to learn the basic functions provided by this package. For further details, please read the original paper: Beilun Wang, Arshdeep Sekhon, Yanjun Qi (2018).
Details
Package: jeek
Type: Package
Version: 1.1.0
Date: 2018-07-03
License: GPL (>= 2)
We consider the problem of including additional knowledge in estimating sparse Gaussian graphical models (sGGMs) from aggregated samples, which arises often in bioinformatics and neuroimaging applications. Previous joint sGGM estimators either fail to use existing knowledge or cannot scale up to many tasks (large K) in a high-dimensional (large p) setting. In this paper, we propose a novel Joint Elementary Estimator incorporating additional Knowledge (JEEK) to infer multiple related sparse Gaussian graphical models from large-scale heterogeneous data. Using domain knowledge as weights, we design a novel hybrid norm as the minimization objective to enforce the superposition of two weighted sparsity constraints, one on the shared interactions and the other on the task-specific structural patterns. This enables JEEK to elegantly incorporate various forms of existing knowledge based on the domain at hand and avoids the need to design knowledge-specific optimization. JEEK is solved through a fast and entry-wise parallelizable solution that reduces the computational cost from the state-of-the-art O(p^5K^4) to O(p^2K^4). We conduct a rigorous statistical analysis showing that JEEK achieves the same convergence rate O(\log(Kp)/n_{tot}) as state-of-the-art estimators that are much harder to compute. Empirically, on multiple synthetic datasets and one real-world dataset from neuroscience, JEEK significantly outperforms the state of the art in speed while achieving the same level of prediction accuracy.
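As a hedged illustration of the weighted-sparsity idea, the following sketch prepares simulated multi-task data and passes a list of penalty weight matrices to jeek(). The W argument and its assumed layout (one shared weight matrix followed by one matrix per task), as well as the simulated data, are assumptions made for illustration only; check ?jeek and demo(jeek) in your installed version before relying on them.

library(jeek)

## Hedged sketch (not run). Simulated stand-ins for K related datasets over
## the same p variables; real inputs would be, e.g., multi-context expression data.
set.seed(1)
p <- 10
K <- 2
X <- lapply(1:K, function(k) matrix(rnorm(50 * p), ncol = p))

## Encode domain knowledge as penalty weights: down-weight (i.e. encourage)
## edges touching a presumed hub variable in column 1. The list layout
## (shared weights first, then one matrix per task) is an assumption.
W_hub <- matrix(1, p, p)
W_hub[1, ] <- W_hub[, 1] <- 0.5
W <- c(list(W_hub), replicate(K, W_hub, simplify = FALSE))

fit <- jeek(X = X, 0.3, W = W, covType = "cov", parallel = FALSE)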
Author(s)
Beilun Wang, Zhaoyang Wang
Maintainer: Beilun Wang - bw4mw at virginia dot edu
References
Beilun Wang, Arshdeep Sekhon, Yanjun Qi. A Fast and Scalable Joint Estimator for Integrating Additional Knowledge in Learning Multiple Related Sparse Gaussian Graphical Models. <arXiv:1806.00548>
Examples
## Not run:
data(exampleData)   # example multi-context datasets shipped with the package
result = jeek(X = exampleData, 0.3, covType = "cov", parallel = TRUE)
plot.jeek(result)
## End(Not run)
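Beyond plotting, the estimated precision matrices can be inspected directly. The sketch below assumes the fitted object stores one graph per task in a list named Graphs; this field name is an assumption, so verify the actual layout with str(result) in your installed version.

## Not run:
data(exampleData)
result = jeek(X = exampleData, 0.3, covType = "cov", parallel = TRUE)
## Assumed layout: result$Graphs is a list of estimated sparse precision
## matrices, one per context; confirm with str(result).
theta1 = result$Graphs[[1]]
sum(theta1[upper.tri(theta1)] != 0)   # number of estimated edges in context 1
## End(Not run)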