RunjrSiCKLSNMF {jrSiCKLSNMF}    R Documentation
Run jrSiCKLSNMF on an object of class SickleJr
Description
Wrapper function to run jrSiCKLSNMF on an object of class SickleJr, performing the factorization on the normalized count matrices stored in that object.
Usage
RunjrSiCKLSNMF(
  SickleJr,
  rounds = 30000,
  differr = 1e-06,
  display_progress = TRUE,
  lossonsubset = FALSE,
  losssubsetsize = dim(SickleJr@H)[1],
  minibatch = FALSE,
  batchsize = 1000,
  random_W_updates = FALSE,
  seed = NULL,
  minrounds = 200,
  suppress_warnings = FALSE,
  subsample = 1:dim(SickleJr@normalized.count.matrices[[1]])[2]
)
Arguments
SickleJr
An object of class SickleJr

rounds
Maximum number of rounds: defaults to 30000

differr
Tolerance for the percentage change in loss between updates: defaults to 1e-6

display_progress
Boolean indicating whether to display the progress bar for jrSiCKLSNMF

lossonsubset
Boolean indicating whether to calculate the loss function on a subset of the data rather than on the whole dataset

losssubsetsize
Size of the subset of data on which to calculate the loss

minibatch
Boolean indicating whether to use mini-batch updates

batchsize
Size of the batches for mini-batch updates

random_W_updates
Boolean indicating whether to use random \mathbf{W}^v updates (i.e. update each \mathbf{W}^v only once per mini-batch epoch); appropriate only for the mini-batch algorithm

seed
Number specifying the desired random seed

minrounds
Minimum number of rounds: most helpful for the mini-batch algorithm

suppress_warnings
Boolean indicating whether to suppress warnings

subsample
A numeric vector of cell indices used primarily when finding an appropriate number of latent factors: defaults to all cells
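As a sketch of how the mini-batch arguments above combine (using the package's SimSickleJrSmall example object from the Examples section; the batch size, subset size, and round counts here are illustrative assumptions, not recommended defaults):

```r
# Mini-batch jrSiCKLSNMF with the loss evaluated on a random subset of cells.
# Assumes SimSickleJrSmall has already been normalized and initialized as
# required earlier in the package's workflow.
SimSickleJrSmall <- RunjrSiCKLSNMF(
  SimSickleJrSmall,
  rounds = 1000,        # maximum number of update rounds (illustrative)
  minibatch = TRUE,     # enable mini-batch updates
  batchsize = 100,      # illustrative batch size for this small object
  lossonsubset = TRUE,  # compute the loss on a subset rather than all cells
  losssubsetsize = 100, # illustrative loss-subset size
  seed = 10,            # fix the random seed for reproducibility
  minrounds = 200       # minimum rounds, as advised for mini-batch runs
)
```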
Value
An object of class SickleJr with the updated \mathbf{W}^v matrices, the updated \mathbf{H} matrix, and a vector of loss function values added to the Wlist, H, and loss slots, respectively
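The updated slots can then be inspected directly; a sketch (the slot names Wlist, H, and loss come from the description above, and unlist() is used defensively in case the loss slot is stored as a list rather than a plain vector):

```r
# Inspect the factorization stored in the SickleJr object after running
# RunjrSiCKLSNMF (assumes SimSickleJrSmall was factorized as in the Examples).
W.matrices  <- SimSickleJrSmall@Wlist         # list of W^v matrices, one per view
H.matrix    <- SimSickleJrSmall@H             # shared H matrix across views
loss.values <- unlist(SimSickleJrSmall@loss)  # loss at each recorded update

# Plot the loss trajectory to check convergence.
plot(loss.values, type = "l", xlab = "Update", ylab = "Loss")
```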
References
Cai D, He X, Wu X, Han J (2008). “Non-negative matrix factorization on manifold.” Proceedings - IEEE International Conference on Data Mining, ICDM, 63–72. ISSN 1550-4786, doi:10.1109/ICDM.2008.57.
Greene D, Cunningham P (2009). “A matrix factorization approach for integrating multiple data views.” Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 5781(Part 1), 423–438. ISSN 0302-9743, doi:10.1007/978-3-642-04180-8_45, https://link.springer.com/chapter/10.1007/978-3-642-04180-8_45.
Eddelbuettel D, François R (2011). “Rcpp: Seamless R and C++ Integration.” Journal of Statistical Software, 40(8), 1–18. doi:10.18637/jss.v040.i08.
Eddelbuettel D, Sanderson C (2014). “RcppArmadillo: Accelerating R with high-performance C++ linear algebra.” Computational Statistics and Data Analysis, 71, 1054–1063. http://dx.doi.org/10.1016/j.csda.2013.02.005.
Elyanow R, Dumitrascu B, Engelhardt BE, Raphael BJ (2020). “NetNMF-SC: Leveraging gene-gene interactions for imputation and dimensionality reduction in single-cell expression analysis.” Genome Research, 30(2), 195–204. ISSN 15495469, doi:10.1101/gr.251603.119, https://pubmed.ncbi.nlm.nih.gov/31992614/.
Le Roux J, Weninger F, Hershey JR (2015). “Sparse NMF: half-baked or well done?” Technical report, Mitsubishi Electric Research Laboratories (MERL), Cambridge.
Lee DD, Seung HS (2000). “Algorithms for Non-negative Matrix Factorization.” In Leen T, Dietterich T, Tresp V (eds.), Advances in Neural Information Processing Systems, volume 13. https://proceedings.neurips.cc/paper/2000/file/f9d1152547c0bde01830b7e8bd60024c-Paper.pdf.
Liu J, Wang C, Gao J, Han J (2013). “Multi-view clustering via joint nonnegative matrix factorization.” Proceedings of the 2013 SIAM International Conference on Data Mining, 252–260. doi:10.1137/1.9781611972832.28.
Examples
SimSickleJrSmall <- RunjrSiCKLSNMF(SimSickleJrSmall, rounds = 5)